Going from device-centered to human-centered technology
In 6G research, the year 2030 comes up a lot.
By that time, still futuristic but fast approaching, the world will be a very different place, with ubiquitous, reliable, real-time connectivity, and the planet will be the better for it. “Maybe,” says Dr. Ian Oppermann, New South Wales Government Chief Data Scientist and Industry Professor at the University of Technology Sydney.
Obviously, Oppermann is all for digitalization and technological advancement, and in fact thinks there is no alternative path for us if we are to survive as a species. However, there are important decisions to be made and serious conversations to be had about data, consent, privacy and trust.
“The big difference in the vision for 6G, compared with 4G or 5G, is the movement away from being device-centered to being human-centered,” Oppermann says. “People are put in the center, both as sources and as users of data. Right now we get to decide what kind of world we want for ourselves in the future, a world in which we will inevitably reveal more and more data about ourselves.”
Already, we are creating very rich digital footprints as we use our devices. We constantly use sophisticated technology, casually exchanging information about ourselves for personalized services, and we are only just working out the problems that come with that trade. It is very easy to imagine benign applications now and in the future, with, say, zero-energy devices woven into our clothes, but it is getting increasingly difficult to imagine every scenario where our personal information might be useful.
“It all comes down to the increasing number of ways we generate and use data. Think of a smart home, where the lights turn on and off as you move from room to room, where the heating is controlled intelligently by the number of people at home. That’s already divulging your exact location and that you are with someone else,” Oppermann says.
“Imagine you have a smart toilet that analyzes your urine chemistry and gives you recommendations for what to eat, based on your phosphate levels. Maybe that information gets shared with your fridge, and it suggests you should eat more bananas. There are many questions about this scenario, beginning with ‘Can I trust this service?’ and ‘Who is responsible if this is not good advice?’”
“Trust is a must in 6G”
Another convenient piece of technology might be a drone hovering above your home, providing you with an ad hoc mobile network (great), but in addition the drone can record your location (dubious, but OK) and perhaps measure your body temperature (definitely not OK). The obvious question is: do you consent to all of this? While it is easy to say that one would never consent to giving away personal vital signs, many of us already do, with devices intended to track our health. More than that, the little bits of data we give out might be connected down the road to serve new uses, ones we may or may not like.
“The more devices we connect with, in clothing, in shoes, in our cars and homes, the more difficult it becomes to give genuine and meaningful consent. End user agreements are already ineffective and confusing. We can’t keep up with what’s already in place, not to mention what’s coming, and so consent and privacy as we’ve come to understand them are increasingly outdated concepts. Society must engage with these issues, because individually it’s very hard to take them on,” Oppermann says.
It is also very easy to imagine malign applications and services intended to restrict and harm human beings: state surveillance to control citizens, terrorist attacks designed to degrade or outright destroy systems societies depend on, or cybercriminals threatening to release sensitive information about individuals unless they pay a ransom. As the COVID-19 pandemic swept the world and people, including politicians, military personnel and corporate staff, started working from home, remote presence technology suddenly created a huge number of new cyber attack surfaces. And as we put zero-energy devices into building materials or clothing and have everything talk to everything else, the vulnerabilities will only increase.
But even if we have little chance of imagining all the consequences of having data harvested and utilized at this exponential rate, we can still lay down rules, says Oppermann.
“Civil democratic society can say: look, here’s a set of principles that everyone must abide by, like you must never reveal the location of a child, or divulge anyone’s religious or sexual preference, and so on. However technology is employed, these are the principles that must be followed. But we must be forward-looking and anticipatory in our conversations about society.”
The challenges in creating a trustworthy 6G are multidisciplinary, spanning technology, regulation, techno-economics, politics and ethics. “Trust is a must in 6G,” agrees Associate Professor Mika Ylianttila, who leads a research group focusing on network security and softwarization at the Centre for Wireless Communications.
“There are many examples of popular applications that people download to their phones but cannot truly trust, with no idea where all their personal data can end up once they give their consent in order to use the application. Typically the user data is uploaded to a cloud service, stored, processed further and in many cases sold to third parties, of which the users may not have the slightest idea. Without more sophisticated ways of ensuring trust, security and privacy, there will be even more challenges in the future, as more services and technologies emerge where user data can be collected and utilized. Distributed ledger and blockchain technologies are one potential way to increase trust in the network, but other measures are needed as well,” Ylianttila explains.
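To make the tamper-evidence idea behind distributed ledgers a little more concrete, here is a minimal sketch in Python of a hash-chained log of data-sharing events. Everything in it, the class name, the fields, the example events, is hypothetical and for illustration only; it shows the core mechanism (each entry commits to the previous one, so quietly rewriting history becomes detectable), not any actual 6G design.

```python
import hashlib
import json
import time

def record_hash(record: dict) -> str:
    """Deterministic SHA-256 over a canonical JSON encoding of a record."""
    return hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()

class ConsentLedger:
    """Toy append-only log of data-sharing events, hash-chained like a blockchain.

    Each entry embeds the hash of the previous entry, so altering an earlier
    record invalidates the chain. A real distributed ledger would additionally
    replicate this log across parties so no single operator controls it.
    """

    def __init__(self):
        self.entries = []

    def append(self, user: str, data_type: str, recipient: str) -> dict:
        prev = self.entries[-1]["hash"] if self.entries else "genesis"
        entry = {
            "timestamp": time.time(),
            "user": user,
            "data_type": data_type,
            "recipient": recipient,
            "prev_hash": prev,
        }
        entry["hash"] = record_hash(entry)  # hash computed before "hash" is added
        self.entries.append(entry)
        return entry

    def verify(self) -> bool:
        """Recompute the chain; any tampering with an earlier entry shows up here."""
        prev = "genesis"
        for e in self.entries:
            body = {k: v for k, v in e.items() if k != "hash"}
            if e["prev_hash"] != prev or record_hash(body) != e["hash"]:
                return False
            prev = e["hash"]
        return True

ledger = ConsentLedger()
ledger.append("alice", "location", "smart-home-service")
ledger.append("alice", "urine-chemistry", "fridge-vendor")
assert ledger.verify()

# Tampering with a past record breaks verification:
ledger.entries[0]["recipient"] = "ad-broker"
assert not ledger.verify()
```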
A human in the loop
How to keep up with development, then? The answer, if we’re smart, is putting people in the center of technology, or ensuring there is always a human in the loop. Artificial intelligence is already doing the easy stuff for us automatically and moving on to more and more complicated tasks, but we need to decide what technology can and should do for us.
“For instance, deploying lethal autonomous weapons is an extreme example of automated technology. Another would be the automatic deployment of a COVID-19 vaccine that AI has deemed sufficiently effective. We obviously have to have standards to deal with issues like these,” Oppermann says.
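As a small illustration of what keeping a human in the loop can mean in practice, the hypothetical Python sketch below lets low-risk automated actions run straight through but requires a person to sign off on high-risk ones. The risk scores, threshold and approval step are all invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Action:
    description: str
    risk: float  # 0.0 (harmless) to 1.0 (irreversible harm); hypothetical score

RISK_THRESHOLD = 0.3  # above this, a human must approve (illustrative value)

def human_approves(action: Action) -> bool:
    """Stand-in for a real review step (dashboard, on-call operator, ...)."""
    answer = input(f"Approve '{action.description}' (risk {action.risk:.2f})? [y/N] ")
    return answer.strip().lower() == "y"

def execute(action: Action) -> None:
    print(f"Executing: {action.description}")

def run(action: Action) -> None:
    # Low-risk actions run automatically; high-risk ones need a human decision.
    if action.risk <= RISK_THRESHOLD or human_approves(action):
        execute(action)
    else:
        print(f"Blocked: {action.description}")

run(Action("turn on hallway lights", risk=0.01))
run(Action("share health data with a third party", risk=0.8))
```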
Oppermann stresses that the conversations around data, technology and security are urgent. We have a chance to describe the world we want to live in, but we can’t choose all the good stuff and ignore the bad. And 2030 isn’t the end point: we will have an extra billion people on the planet during that decade, and another billion on top of that by the 2050s. Meanwhile, we need to figure out how to feed everyone and how to, say, fight off infection as we run out of effective antibiotics. To think that we could revert to a life where people won’t survive minor cuts is horrifying, but it is something we absolutely need to think about, says Oppermann.
“We can model RNA interactions and gene folding, and we can do some things that are pretty damn amazing. The important thing to realize is that we don’t do them because they are cool. We do them because we have to. We must rely on digital services that are increasingly vulnerable, and that is why we need some pretty good standards,” Oppermann says.
Not because it’s interesting, not because it’s cool: because it’s survival.
Read more about the subject:
6G White Paper: Research Challenges for Trust, Security and Privacy
Text: Janne-Pekka Manninen