Will concerns about data hinder progress with future ITS systems deployments?
What are the privacy and ethical implications that surround the use of data in the mobility sector?
The volume of information generated by Intelligent Transport Systems (ITS) and vehicles is likely to increase exponentially, especially following the introduction of Connected Autonomous Vehicles (CAVs), which are expected to exchange enormous amounts of data on a daily basis.
So, how will the digital footprint that follows users across transport modes be protected to respect their privacy and ensure that they are not disadvantaged due to prejudice or commercial interest?
Most of us are unaware of the information that already follows our every move, generated by our vehicles and the services we use. Our existing cars produce streams of data about our driving habits, and their various processor units can be forensically examined to reveal all sorts of parameters about vehicle settings and driving history. Our mobility apps also have an insight into our personal and business trips, with records of where and when we have used services such as Uber or Lyft. Although most users appreciate that our mobile (cell) phones provide a picture of our daily lives, there is little understanding of the extent of the information they can gather. With their inbuilt sensors, smartphones continually record a vast range of parameters, so that my phone probably now has a better insight into my daily routine and habits than I do, across a diverse range of activities: where I walk my dog, where I go shopping or socialising, when and where I work, when and where I travel, and so on. And if that isn’t enough, I upload pictures and posts of where I am and who I’m with on social media. So, should we be concerned about the way in which the mobility sector could encroach on our privacy as these technologies develop?
In short, yes, I think we should all be concerned. Even if we conduct ourselves properly, both morally and legally, there are questions about the abuse of data mining by both commercial and government actors. There is still time, and there are still opportunities, to deal with these concerns, although governments and other authorities do not seem to be addressing the issues in an encompassing manner. This is largely because the various government agencies responsible for vehicle or infrastructure governance, concentrating on remits given to them many years ago, are often focused on safety or compliance with physical standards. Many jurisdictions need to deal with these issues holistically, rather than relying on the vehicle manufacturer or the communications provider to offer an ‘opt-in’ clause about using or sharing data in their mobility agreement.
From a moral perspective, in addition to the issues surrounding the use of gathered data, another concern is the way in which systems could profile the poor or members of marginalised communities, resulting in them being disadvantaged. Penalising people purely on the testimony of their navigation system or web-browsing history (revealing that they frequent bars or fast food restaurants, or that they live in an area categorised as deprived) is obviously indefensible. With the advent of Artificial Intelligence (AI), the way in which vehicles are programmed to react to incidents, or how traffic systems treat users with particular socio-economic profiles, could lead to inappropriate decisions. For example, during an incident that might result in an injury, how should a system choose between a subscriber to its service and a bystander? Would it factor who pays a monthly fee into the decision-making process?
The good thing is that many governments do have legal requirements for companies to declare that they gather personal data, to require users to agree to it, to ensure that it is used only for the purpose for which it was gathered, and to securely destroy it when the need for its retention is exhausted. In the UK, the General Data Protection Regulation (GDPR) controls the processing of personal data and provides users with rights, including access to the full details of data held about them. Nevertheless, the practice of ‘offshoring’ these activities, particularly by the larger tech companies, can erode the level of protection given to personal data. Other systems, such as Bluetooth or Automatic Number Plate Recognition (ANPR) based average journey time systems, cannot by their nature explicitly gain user consent. For these types of implementation, the approach of not only encrypting the information at source, but also discarding part of the original information at that point (such as some of the IP address or registration plate number), can ensure that data within the host system cannot be reverse-engineered to reveal an individual’s identity. This could therefore act as an exemplar for data use in the broader context.
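The discard-then-protect approach described above can be sketched in a few lines of Python. This is a minimal illustration, not a description of any real ANPR deployment: the salt value, the truncation rule (dropping the first two characters) and the token length are all assumptions chosen for the example. The point is that two camera sites applying the same rule can still match repeat sightings for journey-time calculation, while the stored token cannot be reversed to recover the plate.

```python
import hashlib

# Hypothetical per-deployment secret; in practice it would be rotated
# regularly and never stored alongside the journey-time data.
SITE_SALT = b"per-deployment-secret"

def anonymise_plate(plate: str) -> str:
    """Discard part of the registration at source, then hash the remainder.

    Sightings of the same vehicle at two sites produce the same token
    (both apply the same rule and salt), but the token cannot be
    reverse-engineered to identify the vehicle or its keeper.
    """
    normalised = plate.replace(" ", "").upper()
    partial = normalised[2:]  # part of the plate is discarded at source
    digest = hashlib.sha256(SITE_SALT + partial.encode("ascii")).hexdigest()
    return digest[:16]  # a short opaque token is stored, never the plate

# The same vehicle seen twice yields a matching token...
assert anonymise_plate("AB12 CDE") == anonymise_plate("ab12cde")
# ...while different vehicles yield different tokens.
assert anonymise_plate("AB12 CDE") != anonymise_plate("XY99 ZZZ")
```

Because part of the plate is thrown away before hashing, even an attacker who obtains the salt cannot fully reconstruct the original registration from the stored token.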
There are privacy issues raised not only by autonomous vehicles, but by the way in which traffic systems in general use and store data. The difference with phones is that we can turn our mobiles off; when CAVs become a feature of everyday life, we will not be able to ‘silence’ them. With a surfeit of CCTV cameras on our streets, does this make any difference? If you have ever seen films like ‘Enemy of the State’, you would think that everyone’s every step is already logged and recorded. In the UK, a TV programme called ‘Crimewatch’ routinely showed such poor-quality images captured by CCTV cameras that you wouldn’t think you have a lot to worry about. As technologies improve, however, the ability to use AI for facial recognition to track individuals across a variety of platforms, or to use information purely for commercial benefit, should be of concern.
Although the much-lauded safety and convenience benefits of CAVs are being used to promote their adoption, these are still only aspirations. In the UK, the annual number of road traffic fatalities has stayed fairly constant at around 1,700 per year for some time, so it is thought that a major change, such as the introduction of CAVs, is needed to effect a significant further improvement. The arguments for their use therefore seem compelling, not just in human terms but also financially (each fatality costs on average nearly £2m). Pressure is also growing from technology and automotive companies to start earning money from their substantial investments. Nevertheless, the needs and rights of citizens should not be trampled on just to allow these companies to recoup their outlay. The benefits of CAVs should be huge for many reasons, but they must not come at the expense of our rights, and there should be ample opportunity to ensure this through the development and approval cycles for these new vehicles.
Although most of us do not have anything to hide, the issue is really about transparency regarding the reason for collecting data in the first place and what it is used for. Very few users have any comprehension of the amount of information held on them by the large tech companies, or of the way in which that information is later used. Those intent on criminal activities will continue to use ‘burner’ phones to hide their activities, but many people use other means to cloak their online activities for perfectly legitimate purposes, such as using the ‘incognito’ mode in the Chrome web browser, if for no other reason than to get the best prices when buying flights or holidays. It should therefore be possible to ensure that data used to operate vehicles on the highway network, such as Vehicle to Infrastructure (V2I) or Vehicle to Vehicle (V2V) communications, is anonymised, with add-on services then made available to users through a separate stream, such as their Google or Apple profiles. This separation of vehicle operational systems from add-on recreational or premium services could be used as a model to provide anonymity for users of CAV services while allowing them to dip in and out of other services.
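The two-stream separation suggested above can be sketched as follows. This is an illustrative simplification under stated assumptions: the five-minute pseudonym rotation interval, the class and field names, and the message shapes are all invented for the example (real V2X security frameworks such as IEEE 1609.2 use rotating pseudonym certificates for a similar purpose, but this is not that protocol). Safety-critical operational messages carry only a short-lived random pseudonym, while profiled services run in a separate, explicitly consented channel.

```python
import secrets
import time

class OperationalChannel:
    """V2V/V2I-style operational messages: rotating pseudonyms only.

    No account ID, plate or owner details ever enter the safety stream,
    and the pseudonym is replaced periodically to break long trails.
    """
    ROTATE_AFTER_S = 300  # hypothetical rotation interval (5 minutes)

    def __init__(self) -> None:
        self._pseudonym = secrets.token_hex(8)
        self._issued_at = time.monotonic()

    def broadcast(self, speed_kmh: float, heading_deg: float) -> dict:
        if time.monotonic() - self._issued_at > self.ROTATE_AFTER_S:
            self._pseudonym = secrets.token_hex(8)  # fresh identity
            self._issued_at = time.monotonic()
        return {"id": self._pseudonym, "speed": speed_kmh, "heading": heading_deg}

class ServiceChannel:
    """Add-on premium services: the user profile travels in a separate
    stream and only with explicit opt-in consent."""

    def __init__(self, account_id: str, consented: bool) -> None:
        self.account_id = account_id
        self.consented = consented

    def request(self, payload: dict) -> dict:
        if not self.consented:
            raise PermissionError("user has not opted in to profiled services")
        return {"account": self.account_id, **payload}
```

A vehicle would use the operational channel continuously for safety messaging, while the service channel is only ever exercised when the user has opted in, letting them dip in and out of profiled services without contaminating the anonymous operational stream.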
Collectively, the public need to make a number of informed decisions about data gathering in the broader context, not just in the mobility sector. At stake is the recording of our activities in a degree of intimate detail of which most people are blissfully unaware. This is particularly important in the context of the continuing globalisation of goods and services, which will often result in data transactions occurring in territories far removed from where we actually reside. The presence of the large tech companies in our sector opens up a raft of issues which did not previously exist; their attitude to amassing and exploiting personal data for their own commercial benefit should concern us, both personally and professionally. As an industry, we should ensure not only that we comply with the legal obligations on organisations to adhere to data privacy standards, but also that we work to best-practice standards that may exceed the legislative requirements of a particular country, so that data is protected and anonymised routinely.
From a moral stance, the decision process in complex situations should be transparent to users, to overcome bias in the way in which activities such as AI control of CAVs respond to an incident. This would ensure that the most appropriate outcome occurs and give the public confidence. Germany has recently introduced ethical guidelines for the use of autonomous vehicles, which prioritise the preservation of human life and the prevention of injury over all other concerns. The guidelines stipulate that CAVs should only be allowed to operate if a clear safety benefit over manually driven vehicles is achieved. Although this is the foremost guidance on the subject of ethics and CAVs, its treatment of data is restricted to ensuring that users retain the rights over the data they generate, stating: “Action should be taken at an early stage to counter a normative force of the factual, such as that prevailing in the case of data access by the operators of search engines or social networks”.
When specifying or procuring goods and services, we should always take account of the way in which they deal with data protection. By ensuring that our systems offer a robust level of protection against threats such as hacking, routinely anonymise data at source and destroy it when it is no longer required, we can provide the public with the assurance that we are working in line with best practice (which can exceed legislative requirements in many territories).
So, do ethics and privacy concerns hinder progress?
In the short term, ethics and privacy concerns probably do stifle the rate of progress, but the safeguards should be retained to ensure that we all keep our liberty. It would be very easy to sell our rights to the technology giants and vehicle manufacturers, but this could easily lead to a dystopian future where users’ privacy is eradicated, people are unfairly treated and ultimately our safety is compromised. Although some may have legitimate claims for access to data for other purposes, such as security, the authorities already have access to alternative sources of information, which negates the need for any further erosion of privacy. Therefore, users’ rights to privacy and the use of ethical working regimes should be enshrined within the mobility sector.