The Network & the Network


In “The City & the City”, China Miéville imagines an unusual coexistence arrangement between two cities located in the same geographical area, one that provides a surprisingly apt metaphor for the concept of network slicing in 5G networks — from the city & the city to the network & the network.

The two cities, Besźel and Ul Qoma, occupy the same physical location, with buildings, squares, streets and parks either allocated completely to one city or “crosshatched”, that is, shared. The separation and isolation between the two cities is not ensured by physical borders, but is rather enforced by cultural customs and legal norms. The inhabitants of each city are taught from childhood to “unsee” anything that lies in the other city, consciously ignoring people, cars and buildings, even though they share the same sidewalks, roads and city blocks. Recognition of “alter” areas and citizens is made possible by the different architectures, languages and clothing styles adopted in the two cities. Breaching the logical divide between Besźel and Ul Qoma by entering areas or interacting with denizens of the other city is a serious crime dealt with by a special police force. (Prospective tourists in Besźel or Ul Qoma are required to attend a long preliminary course to learn how to “unsee”.)

And now for the two networks: Experts predict an upcoming upheaval in telecommunication networks to parallel the recent revolution in computing brought on by cloudification. Just as computing and storage have become readily available on demand to individuals, companies and governments on shared cloud platforms, network slicing technologies are expected to enable the on-demand instantiation of wireless services on a common network substrate. Networking and wireless access for, say, a start-up offering IoT or vehicular communication applications, could be quickly set up on the hardware and spectrum managed by an infrastructure provider. Each service would run its own network on the same physical infrastructure but on logically separated slices — the packets and signals of one slice “unseeing” those of the other. In keeping with the metaphor, ensuring the isolation and security of the coexisting slices is among the key challenges facing this potentially revolutionary technology.
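To make the analogy a little more tangible, here is a minimal Python sketch of logically isolated slices carved out of a shared substrate. The class, names and numbers are all made up for illustration; this is a metaphor in code, not any standard's API.

```python
class Infrastructure:
    """Shared physical substrate (capacity in abstract resource units)."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.slices = {}  # tenant name -> reserved share

    def instantiate_slice(self, tenant, share):
        """Reserve an isolated share of the substrate for one tenant."""
        if share > self.capacity - sum(self.slices.values()):
            raise ValueError("not enough free capacity for this slice")
        self.slices[tenant] = share

    def send(self, tenant, load):
        """A slice's traffic is limited only by its own reserved share,
        'unseeing' the load of every other slice."""
        return min(load, self.slices[tenant])


infra = Infrastructure(capacity=100)
infra.instantiate_slice("iot_startup", share=20)
infra.instantiate_slice("vehicular_service", share=50)
print(infra.send("iot_startup", load=35))  # capped at 20, whatever the other slice carries
```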


The Rebirth of Expertise?

These days, conversations on almost any topic — be it finance, health care, art, the economy, music, or even religion — do not seem complete without a lively, and more or less informed, exchange on AI and machine learning. The crux of the discussion typically rests on the role of humans in the increasingly large number of enterprises that depend on machines for decision making and manufacturing. In this context, a distinction that may prove useful in thinking about a future society of humans and “intelligent” machines is the one proposed back in the 1960s in the field of psychology between fluid and crystallized intelligence. As recently pointed out by Sarah Harper, taken to its logical end point, this idea may yield some possibly counter-intuitive conclusions regarding the parts to be played by AI and by different generations in the workplace.

Fluid intelligence relates to the ability to solve new problems by applying well-defined logical rules, such as by means of inductive or deductive reasoning. Fluid intelligence does not depend on any external prior knowledge about the world and the problem domain. In contrast, crystallized intelligence is the capacity to build on one’s experience and knowledge to acquire new skills and to solve problems.

In humans, fluid intelligence tends to decrease with age, while crystallized intelligence follows an inverse trend, peaking much later in life. Machines appear to have surpassed humans in terms of fluid intelligence, given their unprecedented capability to recognize patterns in large volumes of data and to optimize actions over long time horizons. But endowing a computer with general-purpose skills built on experience, that is, generating artificial crystallized intelligence, is broadly considered to be unattainable with current AI techniques (listen to Obama’s eloquent explanation of this point!). Current state-of-the-art machine learning methods cannot, in fact, even explain why they output given decisions.

So there you have it — in a system that can leverage the fluid intelligence of sophisticated AI tools, the crystallized intelligence born of the experience of older women and men may become more valuable than the speed and flexibility of fresh graduates. Considering the predictions of an increased lifespan, this sounds like good news — can it be that expertise is not dead after all?

Net Neutrality vs Net Vitality (and 5G)

A prime example of the complex relationship between digital technologies and the legal system is the fluidity and geographical variance of the laws that regulate broadband access. The discussion is typically framed — as far as I can tell from my outsider’s perspective — around two absolute principles, namely network neutrality and network vitality. The net neutrality and net vitality camps, at least in their purest expressions, often seem uninterested in hearing each other’s arguments. This tends to hide from public discussion the layered technological, economic, moral and legal aspects that underlie the delicate balance between access and economic incentives that is at the core of the issue. And things appear to be getting even more involved with the advent of 5G.

Net neutrality is — for purists — the principle that all bits are created equal. Accordingly, broadband access providers should not be allowed to “throttle” packets on the basis, for instance, of their application (e.g., BitTorrent) or their origin (as determined by the IP address). The network should be “dumb” and only convey bits between the two ends of a communication session. Regulation that upholds net neutrality rules is in place in many countries around the world, including in the EU and the US. Under the previous US administration, the FCC reclassified broadband Internet access as a “common carrier”, that is, as a public utility, in 2015, allowing the enforcement of net neutrality rules. Under the new administration, this decision now appears likely to be reversed.

The counterarguments to net neutrality typically center around some notion of net vitality, which refers broadly to the dynamism of the broadband Internet ecosystem, particularly as it pertains to investment and growth. The term was coined in a report by the Media Institute, where a quantitative index was proposed as a compound measure of the net vitality of a country in terms of applications and content (e.g., access, e-government, social network penetration, app development), devices (e.g., smart phone penetration and sales), networks (e.g., cybersecurity, investment, broadband prices), and macroeconomic factors (e.g., number and valuation of start-ups).

Net neutrality purists — not all advocates fall in this category — believe that allowing broadband access providers to discriminate on the basis of a packet’s identity would pose a threat to freedom of expression and competition. Without net neutrality rules, telecom operators could in fact block competitors’ services, and also favor deep-pocketed internet companies, such as the Frightful Five (Alphabet, Amazon, Apple, Facebook and Microsoft), that can outspend start-ups for faster access. A case in point is the ban of Google Wallet by Verizon Wireless, AT&T, and T-Mobile to promote their competing Isis (!) mobile payment system.

The net vitality camp, headed by broadband access providers and economists, deems net neutrality rules to be an impediment to investment and growth. As claimed in a 2016 manifesto by European telecom operators, only by charging more for better service can sufficient revenue be raised by broadband access providers to fund new infrastructure and services.

Digging a little deeper, one finds that the issue is more complex than implied by the arguments of the two camps. To start, some discrimination among the bits carried by the network may in fact serve a useful purpose. For instance, by letting some packets be transported for free, telecom operators can offer zero-cost Internet access to the poorest communities in the developing world, as in the Facebook Zero and Google Free Zone projects. And packet prioritization is in fact already implemented in LTE networks as a necessary means to ensure call quality for Voice over LTE (VoLTE is not considered to be a broadband Internet service and is hence not subject to net neutrality regulations).

That net neutrality is a more subtle requirement than the “every bit is created equal” mantra is in fact well recognized by many net neutrality advocates. When making the case for net neutrality rules, then-President Obama called for “no blocking, no throttling, no special treatment at interconnections, and no paid prioritization to speed content transmission”, hence stopping short of prescribing full bit equivalence. Tim Berners-Lee and Eric Schmidt have also voiced similar opinions.

The planned transition to 5G systems is bound to add a further layer of complexity to the relationship between net neutrality and net vitality. 5G networks are indeed expected not only to provide broadband access, but also to serve vertical industries through the deployment of ultra-reliable and low-latency communication services. In this context, it seems apparent that bits carrying information about, say, a remote surgery or the control of a vehicle, should not be treated in the same way as bits encoding an email.

As the example of VoLTE shows, a general solution may lie in isolating mobile broadband services, on which strong net neutrality guarantees can be enforced, from other types of traffic, such as ultra-reliable and machine-type communications, on which traffic differentiation may be allowed. The feasibility of this approach is reinforced by the fact that isolation is a central feature of network slicing, a technology that will allow 5G operators to create virtual networks that are fine-tuned for specific applications.
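To sketch what this separation could look like in practice, consider the following toy Python scheduler (hypothetical names and numbers): the broadband slice is served strictly first-come-first-served, with no inspection of a packet's application or origin, while a separate ultra-reliable low-latency slice is allowed to reorder its own packets by deadline.

```python
from collections import deque
import heapq

# Toy sketch: one FIFO queue for the neutral broadband slice, one
# deadline-ordered queue for a separate URLLC slice.
broadband_queue = deque()   # first-come-first-served: every bit treated equally
urllc_queue = []            # min-heap keyed by latency deadline (ms)

def enqueue(packet, slice_name, deadline_ms=None):
    if slice_name == "broadband":
        broadband_queue.append(packet)              # no inspection of app or origin
    else:
        heapq.heappush(urllc_queue, (deadline_ms, packet))

def next_packet():
    # Differentiation is allowed across slices (URLLC first), but the
    # broadband slice is drained strictly in arrival order.
    if urllc_queue:
        return heapq.heappop(urllc_queue)[1]
    return broadband_queue.popleft() if broadband_queue else None

enqueue("email", "broadband")
enqueue("video_stream", "broadband")
enqueue("remote_surgery_ctrl", "urllc", deadline_ms=1)
print(next_packet())   # remote_surgery_ctrl
print(next_packet())   # email (arrival order preserved within the broadband slice)
```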


In the Loop (On Technology and Politics)


The feedback loop between politics and technology appears to be one of the most potent motors of human history, its pace quickening with each innovation cycle.

In antiquity, the Greek city states and the Persian and Roman Empires created the necessary conditions and incentives for mathematicians and engineers to pursue specific technologies, such as aqueducts and catapults. Later, the feedback loop began closing, with technological advances directly informing the political system — in Marx’s words:

“The hand-mill gives you society with the feudal lord; the steam-mill, society with the industrial capitalist.”

Fast-forwarding to today, much has been written about the way in which new communication tools have enabled the resurgence of popular and populist movements in the Middle East, Europe and the USA. It is also often argued that it is politics that is bound to play catch-up in this process, with technological firms leading the way towards decentralization and acceleration.

But, in light of recent political developments around immigration, the feedback loop appears to be promptly working its way in the reverse direction. Politics, in fact, seems bound to dislodge the barycenter of technological innovation from its current established poles. Technology has thrived in environments characterized by diversity and mobility. Financial tech firms have found fertile ground in London and Internet companies have taken over Silicon Valley. But the threat, even if not yet directly enforced, of new walls and restrictions in the UK and USA is pushing gifted PhD students and postdocs to choose alternative destinations, such as Canada, New Zealand, Mexico and Singapore.

Looking further into the future, and updating Marx’s quote, 3D printing may give us a post-scarcity and post-work society. As imagined by Cory Doctorow in “Walkaway”, if tools such as 3D printers fulfill their promises, objects and foodstuffs will be just a few clicks away, produced at close to zero cost by repurposing unused materials and by transforming bits into things. In an economy of abundance, there would be no need to keep jobs, maintain a currency, protect private property or support the current education system. This would strip the nation state of its main functions and usher in new ways of organizing societies — and, inevitably, new technologies.

Demain


Having read yet another discouraging article on the state of our planet, a group of French filmmakers embarked on an optimistic globe-trotting quest for climate change solutions that took them to the UK, USA, Denmark, France, Switzerland, Finland and India. The result is a 2015 documentary entitled “Tomorrow” that is now playing in US theaters. The movie is meant as an antidote to the fatalism that can stem from familiarity with the scientific consensus on global warming.

Shot at a time in which prospects were not as dire as they appear today after the latest policy shifts in the United States, the documentary finds hope in innovative sustainable approaches to agriculture, energy production, finance, democracy and education. A common underlying element of all the surveyed solutions is their reliance on social, local and decentralized mechanisms. Including the inevitable interview with Vandana Shiva and an expected visit to a Finnish elementary school, the film uncovers a heartening set of initiatives, such as permaculture and the adoption of local currencies alongside conventional government-backed money.

Conspicuously missing from the solutions discussed in the movie are information and communication technologies (apart from the fleeting appearance of a smartphone used to pay in a local currency). This may not come as a surprise given the measurable decrease in “closeness, connection, and conversation quality” among people in local communities that has resulted from the widespread adoption of mobile phones. A city of the future ruled by smart devices seems indeed destined to be a lonely place, incompatible with the development of meaningful social programs. If we also factor in the energy footprint of producing digital devices and running telecom networks, the case for considering communication technology a contributor to climate change appears to be well motivated.

Nonetheless, communication technology has a potentially important role to play in combating global warming. Smart phones are already used for emergency preparedness and coordination to respond to the effects of a changing climate. And, as discussed in a recent report of the Brookings Institution, the upcoming fifth generation of wireless networks may prove to be a significant asset in key areas such as water management, air quality control, energy production, transportation, and building design.

Take water management. It has been reported that two thirds of the world’s population may face water shortages by 2025. Thanks to Internet-of-Things (IoT)-enabled sensor and actuator networks, the efficiency in the use of this scarce natural resource may be improved via monitoring (e.g., of the concentration of dangerous chemicals), leakage detection, measuring home usage, adaptive irrigation tailored to measured moisture levels, and smart chillers for industries.
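As a toy illustration of the adaptive-irrigation idea, here is a small threshold controller in Python; the moisture target, readings and watering formula are illustrative assumptions rather than figures from any actual deployment.

```python
# Hypothetical sketch: an IoT soil-moisture sensor feeding a simple controller
# that waters only in proportion to the measured deficit.

MOISTURE_TARGET = 0.35   # fraction of saturation considered "wet enough" (assumed value)

def irrigation_minutes(moisture_reading, max_minutes=30):
    """Return watering time proportional to the moisture deficit."""
    deficit = max(0.0, MOISTURE_TARGET - moisture_reading)
    return round(max_minutes * deficit / MOISTURE_TARGET, 1)

for reading in (0.10, 0.30, 0.40):        # dry, nearly there, already wet
    print(reading, "->", irrigation_minutes(reading), "min")
```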

Leveraging the benefits of connectedness for a more efficient use of natural resources while at the same time building more resilient and sustainable communities may prove a delicate balancing act, but one that could prove critical for the future of our planet.

Internet of Thoughts

Brain-computer interfaces have been a staple of cyberpunk plots for decades. They have also been the subject of serious scientific research since the 70s, leading to impressive recent prototypes that allow humans to remotely control artificial limbs through their thoughts.

In the last few days, two headline-grabbing announcements appear to presage an era in which brain-computer interfaces will be standard components in commercial communication devices. First, it was reported that Elon Musk launched a company that will invest in research to make tools “that may one day upload and download thoughts”. Other companies with similar goals are also in the works. And then Facebook revealed its plans to develop technology that would make it possible for users to compose outgoing messages via their thoughts and to “feel” incoming messages without reading them.

In this plausible scenario in which brain-computer interfaces are integrated into communication devices, humans will be able to literally communicate through their minds. We would therefore experience a transition from the upcoming Internet of Things to a next generation of communication networks supporting an Internet of Thoughts.

This idea was fictionalized in “Lock In”, a 2014 novel by John Scalzi. In Scalzi’s world, massive funds are allocated by the government to brain-computer interface research in the aftermath of an epidemic that left millions of Americans locked in, that is, unable to move or communicate. The plot revolves around the fact that networked elements are prone to hacking given their reliance on software (not unlike “Ghost in the Shell”). Hacking a brain, it turns out, may have quite unpleasant consequences — and not only for the hacked.

It has been reported that even the most experienced software programmers have an error rate of 0.05%, so that, on average, programs contain one error every 2,000 lines. This implies that there are thousands of bugs in a typical modern application: Android, for instance, has 12 million lines of code. The fact that a single error may be enough to compromise the security of a system highlights the significant challenges of securing software-based networks from hacking.
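For the record, the back-of-the-envelope arithmetic behind these figures runs as follows.

```python
error_rate = 0.0005         # 0.05% of lines contain an error
lines_per_error = 1 / error_rate
android_lines = 12_000_000  # rough figure quoted above

print(lines_per_error)              # 2000.0 -> one error every 2,000 lines
print(android_lines * error_rate)   # 6000.0 -> thousands of residual bugs
```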

The ongoing softwarization of everything from computing to telecommunication networks may well one day (perhaps not too soon) extend to our minds. One can only hope that breakthroughs in software security will outpace threats from hackers as this process unfolds.

Information Without Representation

Shannon’s information theory (IT) provides a rigorous framework to quantify the amount of information that a system Y has about a system X, irrespective of what the information is about.  IT metrics count the (logarithm of the) number of different states of system X that system Y can distinguish based on Y’s available information (or, conversely, the missing information on X’s state). This information can be translated into a binary file that represents the information that Y has on the state of X.
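As a toy illustration of this counting view (my own numbers, chosen for simplicity), suppose system X has eight equally likely states and that, after observing Y, only two of them remain indistinguishable:

```python
from math import log2

states_of_X = 8                  # equally likely states of system X
states_compatible_with_Y = 2     # states of X that Y still cannot tell apart

missing_without_Y = log2(states_of_X)               # 3 bits of missing information
missing_given_Y = log2(states_compatible_with_Y)    # 1 bit still missing
print(missing_without_Y - missing_given_Y)          # 2 bits: what Y knows about X
```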

Shannon’s information measures hence concern the representation of information, while purposefully disregarding its significance to the recipient. In IT, the information that system Y has regarding system X is the same whether or not Y can extract value or meaning from its ability to differentiate among some of the states of X. In cognitive sciences, instead, the focus is on semantic information, that is, on the meaning that the information carries for the receiver.

Robert Anton Wilson proposed to measure the amount of scientific information (a subset of semantic information) by adopting as a unit of measure the amount of scientific information known during the lifetime of Jesus.

To use D. M. MacKay’s words, semantic information is “difference that makes a difference” for the receiver. Accordingly, not all distinguishable states contribute to the amount of semantic information. Daniel Dennett, in his new book, gives a recursive definition of semantic information as “design worth getting”. Design refers to the use of “semantic information to improve the prospects of something by adjusting its parts in some appropriate way”. In other words, semantic information is defined by the fact that it can be leveraged to determine the form, or the design, of something for the benefit of the recipient. Semantic information hence depends on the receiver, and it need not be represented or saved to have an impact on the receiver’s design. Semantic information is valuable, and misinformation and disinformation are its perversions.

Information, Knowledge, Wisdom and 5G

One of the most compelling conceptual visions for 5G contrasts the user-driven, information-centric operation of previous generations with the industry-driven, knowledge-centric nature of the upcoming fifth generation. According to this vision, the evolution from 1G to 4G has been marked by the goal of enhancing the efficiency of human communication — with end results that we are still trying to understand and manage. In contrast, 5G will not be aimed at channeling tweets or instant messages for human-to-human communications, but at transferring actionable knowledge for vertical markets catering to the healthcare, transportation, agriculture, manufacturing, education, automation, service and entertainment industries. In other words, rather than carrying only information, future networks will carry knowledge and skills. Whose knowledge and whose skills will be amplified and shared by the 5G network infrastructure?

Two options are typically invoked: learning machines (AI) and human experts. AI is widely assumed to be able to produce actionable knowledge from large data sets only for tasks that require systematic, possibly real-time, pattern recognition and search operations. Typical examples pertain to the realm of the Internet of Things, with data acquired by sensors feeding control or diagnostic mechanisms. AI is, however, still very far from replicating the skills of human experts when it comes to “instinctive intelligence”, making multi-faceted judgements based on acquired “wisdom”, innovating, relating to other humans, providing advice, offering arguments, and, more generally, performing complex non-mechanical tasks. Therefore, human experts can complement the knowledge and skills offered by AI. A scenario that is consistently summoned is that of a surgeon operating on a patient remotely thanks to sensors, haptic devices and low-latency communication networks.

By sharing the knowledge and skills of AI and human experts, 5G networks are bound to increase the efficiency and productivity of learning machines and top professionals, revolutionizing, e.g., hospitals, transport networks and agriculture. But, as a result, 5G is also likely to become a contributor to the reduction of blue- and white-collar jobs and to the widening income gap between an educated elite and the rest. This effect may be somewhat mitigated if more optimistic visions of a post-capitalist economic system, based on sharing and collaborative commons, are at least partly realized thanks to the communication substrate brought about by 5G.

Tomorrow Never Knows


It is unquestionably a time of uncertainty, and visions of the not-too-far future come in all flavors. Here is a partial list:

  • Friedmans, or techno-optimists: New technologies for communication, computation, energy production and transportation will raise productivity and reduce costs, breaking the logic of scarcity and ushering in a new economic system based on sharing and cooperation;
  • Merkels, or business-as-usuals: Capitalism, liberalism and democracy will naturally prevail and progress will continue unimpeded;
  • Mandibles, or stone-agers: The over-reliance on networked devices will lead to a catastrophic breakdown of the communication and financial infrastructures as a result of cyber-wars, making advanced economies unable to provide for themselves;
  • Hararis, or techno-pessimists: Intelligent machines will take over jobs and functions of “regular” humans, and a new minority of super-human cyborgs will emerge beyond the point of technological singularity;
  • Realists, or climate catastrophists: There is really no future for humankind on Earth;
  • Cixins, or escapists: The future for humanity is in space — at least if you can pay the ticket.

It from Bit

In most classes on information theory (IT), the relationship between IT and physics is reduced to a remark on the origin of the term “entropy” in Boltzmann’s classical work on thermodynamics. This is possibly accompanied by the anecdote regarding von Neumann’s quip on the advantages of using this terminology. Even leaving aside recent, disputed, attempts, such as constructor theory (see here) and integrated information theory (see here), to use concepts from IT as foundations for new theories of the physical world, it seems useful to provide at least a glimpse of the role of IT in more mainstream discussions on the future of theoretical physics.

As I am admittedly not qualified to provide an original take on this topic, I will rely here on the poetic tour of modern physics by Carlo Rovelli, in which one of the last chapters is tellingly centered on the subject of “information”. Rovelli starts his discussion by describing information as a “specter” that is haunting theoretical physics, arousing at the same time enthusiasm and confusion. He goes on to say that many scientists suspect that the concept of information may be essential to make progress in theoretical physics, providing the correct language to describe reality.

At a fundamental level, information refers to a correlation between the states of two physical systems. A physical system, e.g., one’s brain, has information about another physical system, e.g., a tea cup, if the state of the tea cup is not independent of that of the neurons in the brain. This happens if a state of the tea cup, say that of being hot, is only compatible with a subset of states of the brain, namely those in which the brain has memorized the information that the tea cup is hot. Reality can be defined by the network of such correlations among physical systems. In fact, nature has evolved so as to manage these correlations in the most efficient way, e.g., through genes, nerves, languages.

The description of information in terms of correlation between the states of physical systems is valid in both classical and quantum physics. In thermodynamics, the missing information about the microstate of a system, e.g., about the arrangement of the atoms of a tea cup, given the observation of its macrostate, e.g., its temperature, plays a key role in predicting the future behavior of the system. This missing information is referred to as entropy. In more detail, the entropy is the logarithm of the number of microstates that are compatible with a given macrostate. The entropy tends to increase in an isolated system, as information cannot materialize out of thin air and the amount of missing information can only grow larger in the absence of external interventions.
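In symbols, with W denoting the number of microstates compatible with the observed macrostate, this is the standard counting formula (written here for completeness, not a result specific to Rovelli's book):

$$ S = k_B \ln W, \qquad \text{or, in bits of missing information,} \qquad H = \log_2 W. $$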

In quantum physics, as summarized by Wheeler’s “It from Bit” slogan, the entire theoretical framework can be largely built around two information-centric postulates: 1) In any system, the “relevant” information that can be extracted so as to make predictions about the future is finite; 2) Additional information can always be obtained from a system, possibly making irrelevant previously extracted information (to satisfy the first postulate).

The enthusiasm and confusion aroused by the concept of information among theoretical physicists pertain to many fundamental open questions, such as: What happens to the missing information trapped in a black hole when the latter evaporates? Can time be described, as suggested by Rovelli, as “information we don’t have”? Related questions abound in other scientific fields as well, such as biology and neuroscience: How is information encoded in genes? What is the neural code used by the brain to encode and process information?