Speaking as patients, clinicians, digital transformers of social care, and health tech innovators — they helped us focus our thoughts on why we’re here: to make sure patients and staff have the tech they need.
We have lots left to do so that @NHSX can play the role we know it needs to play. But we took a big step today, launching the centre’s coordinated team to drive transformation. Looking forward to working with many of you in local health & care, to help maximise your impact too.
A big part of our job as the TAG is to help spec authors with their ideas for a new feature on the web. We help them think about things like how their proposal could work with other features, whether it might have unintended consequences, and how they can learn from someone else’s similar experience.
A lot of our advice in these design reviews comes from our personal experiences, but we also try to point to existing documents to help explain.
The more we’ve talked about the logic underlying our advice, the more we’ve realised how much these ethical principles are at the heart of how the web works — and they are therefore some of our most important advice to spec authors. Yet they hadn’t been written down, which is why we wrote this finding, W3C TAG Ethical Web Principles.
The principles themselves aren’t technical, but they have substantial technical implications. And we have tried to make those implications clear. For example:
Principle: The web is for all people

Explanation: We will build internationalization and localization capabilities into our specs and websites. We must make our websites accessible for people with disabilities. Accessibility involves a wide range of disabilities, including visual, auditory, physical, speech, cognitive, language, learning, and neurological disabilities. We must build for users on low bandwidth networks and low specification equipment.
This work was sparked by this blog post by our co-chair Dan Appelquist, reacting to what he called “[Our] anxiety about the current state of the world and the role the web has unwittingly played in making it that way. The misuse of social media to control public opinion through the spread of propaganda, bot-enabled harassment campaigns and over-reliance on biased and simplistic algorithms for content promotion are some of the unexpected consequences of a world wide ‘web of information nodes in which the user can browse at will.’”
In that light, we thought it was especially important to highlight our ethical responsibilities as web and platform developers. This finding brings together those traditions and philosophies that underpin how the web works.
The digital transformation of the NHS is the most important change we can make to the system. It will give us data to boost cancer survival rates; join up a patient’s journey across health and social care; give clinicians the tools they need so they can focus on the patient.
To do this we need a transformation in how tech is led in the NHS. Responsibility for tech, digital and data policy has been split between numerous organisations and teams, and we have all spent too much energy managing this complexity.
Voice recognition tools are fun… but they aren’t transformative on their own. They have little practical value in the NHS until clinicians can routinely use one to accurately transcribe consultations and streamline administrative processes. As a patient, accessing and contributing to your health records on your mobile phone is most useful when they are complete — when your prescriptions, GP and hospital test results are all in one place.
Research on clinical data is most effective when you broaden the sample — when you can train your models on the most representative group of subjects possible.
We are beginning with interoperability standards: the rules that we all agree so that a blood pressure reading in one system is recognisable to another system. We also need a network stack that allows the data from one hospital to reach another, or the patient, when it’s safe and appropriate to ask for it. We need strong encryption algorithms and security practices to keep it safe.
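To make the blood-pressure example concrete: one widely used interoperability standard in health is HL7 FHIR, which pairs readings with standard LOINC codes so any conforming system can recognise them. The sketch below is illustrative only — a minimal, hypothetical helper (the function name and structure are mine, not from any NHS system), using the real LOINC codes for a blood-pressure panel.

```python
def blood_pressure_observation(systolic: int, diastolic: int) -> dict:
    """Hypothetical sketch: a blood-pressure reading as an HL7 FHIR-style
    Observation. The LOINC codes (85354-9 panel, 8480-6 systolic, 8462-4
    diastolic) are real; the helper itself is illustrative."""
    def mmhg(value: int) -> dict:
        # UCUM-coded quantity, so units are machine-checkable too
        return {"value": value, "unit": "mmHg",
                "system": "http://unitsofmeasure.org", "code": "mm[Hg]"}
    return {
        "resourceType": "Observation",
        "status": "final",
        "code": {"coding": [{"system": "http://loinc.org", "code": "85354-9"}]},
        "component": [
            {"code": {"coding": [{"system": "http://loinc.org", "code": "8480-6"}]},
             "valueQuantity": mmhg(systolic)},   # systolic
            {"code": {"coding": [{"system": "http://loinc.org", "code": "8462-4"}]},
             "valueQuantity": mmhg(diastolic)},  # diastolic
        ],
    }
```

Because the meaning lives in the shared codes rather than in any one vendor’s field names, a second system can interpret the reading without knowing anything about the system that produced it.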
On this architecture we can build transformative digital services that work anywhere in the NHS. We can give our clinicians simple, intuitive ways of inputting and working with their patients’ records. We can supply our patients with better information with which to manage their own health, and we can empower our country’s best developers and entrepreneurs with the details they need to help us build the most advanced health and care system in the world.
This architecture will allow us to plug in the best technologies like those voice recognition tools, which is where life on the ground gets demonstrably better. This will put us in good shape to innovate, now and in the future.
The right use cases
We need your help to make sure we are choosing the best use cases to standardise. We want to create a set of rules that encourage innovation and creativity, rather than creating a new bottleneck. And we want to converge around the open standards that give us all the most freedom in how we meet our user needs.
All these structures, tools and ideas are interdependent, and overwhelming. So where to start? We think concrete use cases are important, but so are good foundations. We are doing all this for real. Your feedback on sequencing matters. We are keen to hear from you, if there are quick wins, or important foundations that we might miss, that we should attend to right now. So get in touch, and tell us what you like, and what you would like changed.
You know those AMP URLs you get from Google search results and which often pop up on Twitter?
Instead of https://www.rt.com/sport/… you’ll get https://www.google.co.uk/amp/s/www.rt.com/document/…
What you’re seeing is Google’s AMP project hosting content for Russia Today. This lets Google preload the page alongside the search results, so that when you click the link on the search page, the text appears immediately. (This solves a real problem, by the way: that shorter loading time can make the web a far more enjoyable experience.)
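The URL rewriting shown above follows a simple pattern: the original host and path are appended under the Google-hosted prefix. A minimal sketch (illustrating only the `/amp/s/` form seen in these URLs, not the full AMP cache specification):

```python
from urllib.parse import urlparse

def amp_cache_url(url: str,
                  cache_root: str = "https://www.google.co.uk/amp/s/") -> str:
    """Rewrite an https page URL into the Google-hosted AMP form shown above.
    Illustrative sketch only; the real AMP cache has more rules than this."""
    parts = urlparse(url)
    if parts.scheme != "https":
        raise ValueError("the /amp/s/ form expects an https URL")
    return cache_root + parts.netloc + parts.path

amp_cache_url("https://www.rt.com/sport/article")
# → "https://www.google.co.uk/amp/s/www.rt.com/sport/article"
```

Note what the rewrite does to the address bar: the publisher’s hostname survives only as a path segment, while the origin the browser sees is Google’s.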
Facebook’s Instant Articles and Apple News operate similarly but without the benefit of being on the web or using real URLs — a much worse starting point.
The web relies heavily on the same-origin policy, which amongst other things helps browsers manage permissions (e.g., access to your location, camera, microphone), attribute bad actions (e.g., phishing attacks), and assist you with things like passwords and filling out forms. This core aspect of web architecture ties permissions and security settings to a particular origin, like rt.com. Distributing or syndicating content removes that context by hosting one site’s content within a different site, which can confuse users and stop browsers from keeping the web safe.
In the W3C Technical Architecture Group we have been thinking about this issue. While we understand the value these approaches provide, they also pose serious issues. Fundamentally, we think that it’s crucial to the web ecosystem for you to understand where content comes from and for the browser to protect you from harm. We are seriously concerned about publication strategies that undermine these protections.
We have published this finding to explain our thoughts in more detail.
One reason we form governments is to protect our communities. At the same time, our economy and human rights depend on private and encrypted online services. How do we move forward when these two agendas clash?
What’s prompting this post
Following this week’s explosion at the Manchester Arena, we in the UK are struggling to come to terms with the loss of children, the unsettling reminders of our vulnerability, and, in stark contrast, our communities coming together in the aftermath.
We are having the to-be-expected conversations about why this happened, what we can learn, and how we protect ourselves. We are reexamining what we expect of our government. It’s part of how we heal as a country, how we pick ourselves back up.
Some of the discussion inevitably turns to encryption and how terror plots are organised — in the UK, abroad; face to face, over the internet. Quickly we run into the encryption question: end-to-end encrypted services can’t be decrypted in between the users’ devices, which makes it difficult for authorities to identify a conspiracy.
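Why can’t the service in the middle simply decrypt? In end-to-end designs, the keys are derived on the users’ devices and never cross the network. A deliberately toy sketch of the idea, using Diffie-Hellman key exchange with tiny numbers (real systems use far larger parameters and authenticated protocols — this is an illustration of the principle, nothing more):

```python
# Toy Diffie-Hellman key exchange (NOT secure: tiny numbers, no authentication).
# Only the public values cross the wire, yet both ends independently derive
# the same shared key — so an intermediary sees nothing it can decrypt with.
p, g = 23, 5                          # public parameters (real use: 2048-bit+ primes)

alice_private, bob_private = 6, 15    # generated on-device, never transmitted
alice_public = pow(g, alice_private, p)   # sent over the network
bob_public = pow(g, bob_private, p)       # sent over the network

alice_key = pow(bob_public, alice_private, p)   # computed on Alice's device
bob_key = pow(alice_public, bob_private, p)     # computed on Bob's device
assert alice_key == bob_key           # same key at both ends, no middleman copy
```

The intermediary (and anyone compelling it) holds only `alice_public` and `bob_public`; recovering the key from those is the hard problem the whole scheme rests on. That is why “decrypt it in the middle” is not a policy knob the service can simply turn.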
“It used to be that people would steam open envelopes, or just listen in on phones, when they wanted to find out what people were doing, legally, through warrantry — but in this situation we need to make sure that our intelligences services have the ability to get into situations like encrypted WhatsApp.”
It feels like a discussion at a stalemate; I’m seeing government asking for the problem to be solved, and technologists rolling their eyes at the implication that “government wants to outlaw maths.”
Having been on both sides of this discussion, I want to explain the miscommunications I see happening and outline the (few) options I think we have to proceed.
The source of the conflict
There are two conflicting pressures pushing us towards this impasse.
Problem 1: The democracy problem
In the UK, we ask (and pay our taxes for) our government to keep us safe. We expect it to be in every party manifesto on which we elect the next government. We authorise it through a large percentage of our government’s budget. We, often through our press, actively get upset when our government doesn’t keep us safe, and we launch inquiries and hold leaders accountable when they fail.
Our police and national security machinery are constantly trying to keep up with the changing ways criminals act. The rise in end-to-end encryption on messaging services has complicated their jobs — and when they hear us asking to be kept safe, they have pointed to this as an obstacle.
So they’re asking us as the tech industry to “fix it”. If we don’t, they can’t do their jobs properly — which is what we, as citizens, have asked them to do.
Problem 2: The technology problem
In a completely different vein, we — the tech community — are building an internet on which our society and economy can flourish. We are fighting a whole industry of criminals who are trying to undermine this — as we all know, we need to protect ourselves against phishing, malware, unauthorised intrusions, man-in-the-middle attacks… Our infrastructure is vulnerable in a lot of ways. As I’m fond of repeating, we originally designed the protocols in the internet and web stacks to optimise for sharing — we’re only now retrofitting security onto them.
You know those old browsers in TVs, exercise bikes, kiosks and the like that can’t browse the web anymore? Have you ever noticed how strange it is that they gather dust and become increasingly hard to use, while the browsers on your mobile phone or laptop carry on just fine?
It happens because no one keeps them up to date. As web technologies (and therefore, websites) evolve around them, they get further away from being able to handle what a site serves them. And as a result, they become increasingly less useful.
Constant evolution is fundamental to the web’s usefulness. Browsers that do not stay up-to-date place stress on the ecosystem. These products potentially fork the web, isolating their users and developers from the rest of the world.
Browsers are a part of the web and therefore they must be continually updated. Vendors that ship browsers hold the power to keep the web moving forwards as a platform, or to hold it back.
I gave the opening keynote at OvertheAir yesterday, covering President-elect Trump, Brexit and what it all means for us as developers. Topics like:
Data protection laws. Will your app from London be able to handle users in another country? Or will you need to do something special to be compliant with their laws? Will it matter where you host data about your users?
The importance of informed and empowered users. We need to build services that make clear what data is going where. And I think we REALLY need to standardise private browsing mode. Everyone should know what it does when they turn it on… but it varies widely from browser to browser!
Keeping transactions secure. If everything depends on economic growth, and economic growth depends on secure, reliable transactions… Security and strong encryption will be crucial to our future.
Fake news. We (the web community) — well, we didn’t invent fake news. But we did create ways for it to be distributed on a mass scale. Therefore, we have some responsibility here — we should work towards fixing it.
There is still a lot that isn’t settled, on the political/governmental fronts, but it’s useful to keep an eye on the facts we have and the questions we’ll need answers to as things unfold. Lots to do ahead — and lots to build.