Apart from privacy, the Safe City Malta project raises a number of issues that must be addressed by the Maltese authorities: Who will be storing the data? What will be done with the data? Who will be monitoring the data?
Collage by Isles of the Left
In April 2016, Prime Minister of Malta Joseph Muscat announced the setup of a new public-private partnership called Safe City Malta. The project was proposed again last October during the annual budget reading, when the Prime Minister stated that this pilot project would be introduced in Paceville and Marsa, the two localities identified as ‘crime hotspots’.
This new form of surveillance will utilise Artificial Intelligence (AI) driven technology to monitor activity and to identify subjects through the use of algorithms that will, supposedly, match the faces of people against established databases. The project will be rolled out and developed by the Maltese State in cooperation with the Chinese tech giant Huawei.
The initiative has been applauded by some, like the leader of the Nationalist Party, Adrian Delia, as a means of controlling criminal activity in troubled areas; but it has equally caused an uproar among many others, among them Alternattiva Demokratika.
The Malta IT Law Association (MITLA) aptly stated that the introduction of such pervasive technologies must be overseen by strict data protection laws, especially with regard to the processing of sensitive and personal data. It also highlighted a further issue: facial recognition is a form of biometric technology that may only be utilised in states of emergency and under strict regulations and restrictions. This concern has also been voiced by one of the directors of Safe City Malta, Cuschieri, in an interview with MaltaToday.
However, before discussing whether facial recognition technology will be introduced or not, we should delve into other pivotal aspects of the issue. How could this development affect all of us in ways that are more permanent than we might assume at first?
Privacy vs Security
Primarily, most of the debate revolved around the issue of personal privacy and the right to one’s personal life remaining private.
Many have stated that they want to enjoy the freedom of going anywhere, especially entertainment areas, without being tracked or having it on record. In such cases, the issues are twofold: on the one hand, there is the lack of privacy through the monitoring of one’s movement and on the other hand, there is the risk of data being leaked or accessed by parties who should not be privy to such information. In a nutshell, the script goes like this: “I should have the right to go to Paceville and behave as I wish, as long as I stay within legal boundaries and without having third parties finding out about my leisure activities”.
Others, who favour heavier surveillance and security, countered by stating that no righteous and law-abiding individual should feel threatened if they respect the law. On the contrary, the argument continues, they should feel more secure and enjoy greater peace of mind knowing that they are being protected from lawless and violent persons. Needless to say, similar opinions are voiced in debates about online surveillance and email monitoring.
The concerns about breach of privacy are valid, but have lost their power of persuasion after the majority of individuals have relinquished their right to privacy either as social media and internet users, or as private citizens who have accepted surveillance legislation as a means of protection from terrorism or other crimes.
Very few may claim to have privacy when partying in Paceville. Most clubs have CCTV surveillance systems in place at their establishments. There are also 24-hour CCTV cameras surveilling public spaces like public gardens and roads, supposedly to prevent vandalism and crime. Add to this the cameras installed in strategic spots by LESA (Local Enforcement System Agency), which allow the identification of car registration plates; drivers are fined when parked unlawfully. Until now, hardly anyone has contested these technologies as a breach of privacy; the majority have accepted them as the norm.
Moreover, most individuals reading this article probably own a social media account. Facebook, one of the most popular platforms, boasts one of the best AI systems for face recognition, supposedly to help us tag our ‘friends’ when uploading photos to our page. Therefore, the chances are that either someone has posted photos of you during a night out or, more likely, you uploaded these pictures yourself. Indeed, many of us abandoned privacy a long time ago, making it pretty much a given that can no longer be contested.
Having said that, the debate does not stop here. Apart from privacy, the project still raises a number of issues that must be addressed by the Maltese authorities: Who will be storing the data? What will be done with the data? Who will be monitoring the data?
Facial Recognition as a Tool for Social Profiling
An important question to ask is whether the data will be used and processed to create an information repository about individuals or groups of people (such as ethnic or religious minorities, or LGBTIQ people).
At this stage, it is crucial to differentiate between ‘data’ and ‘information’ so as to further understand the broader consequences of this technology. Arguably, data may be defined as the raw material, whilst information is the processed product. Data on its own means little to nothing unless it is interpreted and analysed for specific purposes. Depending on the purpose of data collection, analytic tools are used to extrapolate patterns—what we may classify as information. And information is the basis of understanding subjects, objects and people.
Technically, unless processed using AI and specific algorithms, constant footage of an area will only produce lots of data. Thus, the questions we should be asking are: what criteria will these algorithms be based upon, and which data banks will be used to ‘train’ them?
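To make the data-versus-information distinction concrete, consider a minimal sketch of how face-matching systems typically work. Everything here is hypothetical and illustrative—the function names, the 128-dimensional embeddings, and the similarity threshold are assumptions, not details of the Safe City system: a camera frame is reduced to a numeric vector (raw data), which only becomes information once an algorithm compares it against a reference database of enrolled identities.

```python
# Illustrative sketch only: raw 'data' (a face embedding extracted from
# footage) becomes 'information' (an identity match) once an algorithm
# and a reference database are applied. All names, sizes, and the 0.8
# threshold are hypothetical.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two embedding vectors (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_face(embedding, database, threshold=0.8):
    """Return the best-matching identity, or None if nothing clears the threshold."""
    best_name, best_score = None, threshold
    for name, reference in database.items():
        score = cosine_similarity(embedding, reference)
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# A toy 'data bank' of enrolled embeddings -- the database the
# algorithm is matched against.
rng = np.random.default_rng(0)
database = {
    "person_a": rng.normal(size=128),
    "person_b": rng.normal(size=128),
}

# A camera frame yields an embedding close to person_a's enrolled vector:
observed = database["person_a"] + rng.normal(scale=0.05, size=128)
print(match_face(observed, database))
```

The point of the sketch is that the footage itself decides nothing: the outcome depends entirely on who is enrolled in the database and where the threshold is set—precisely the design choices the questions above are about.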
What will happen with this information and how will it be utilised by law enforcement? Will we be living in a law enforcement system similar to the one featured in Minority Report? Is there the possibility that facial recognition technology will be used to analyse and monitor behaviour among specific groups of people (and discriminate against them)?
It is plausible that facial recognition will facilitate ethnic, racial or other forms of profiling.
Article 9, paragraph 1 of the GDPR, adopted by the EU member states, specifies that such use is prohibited:
Processing of personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, and the processing of genetic data, biometric data for the purpose of uniquely identifying a natural person, data concerning health or data concerning a natural person’s sex life or sexual orientation shall be prohibited.
In terms of crime prevention or as an investigative tool, street surveillance is grossly inefficient. More importantly, surveillance for the purpose of profiling is illegal and inadmissible.
Profiling City Malta
The way it is proposed, the Safe City Malta project could be aiming for social profiling. The very fact that Paceville and Marsa were identified as the locations to be monitored raises a few red flags.
On what criteria were these two localities chosen? Is it purely based on statistics or is there more to it that is not being said openly?
The official crime statistics indicate that Marsa is not one of the most troubled places in Malta, and yet, it has been targeted for this pervasive form of surveillance. Could it be that the current administration finds it easier to target areas mostly frequented by foreign nationals as opposed to locals? Or does the government predicate their proposal on mere speculation without conducting a proper sociological assessment of the locality? Could this be about gaining votes, considering that Marsa is frequently stigmatised as a hub for criminality? What is the government trying to achieve and whom is it seeking to please?
We must also keep in mind that the Safe City project operates under a memorandum of understanding with the Chinese company Huawei, which has recently been accused of a number of breaches involving sensitive user information, among other offences. It is essential to remember that Huawei is closely affiliated with the Chinese government and thus influences that administration’s national and international technological vision and policies. Like other tech companies operating in China, it participates in implementing the digital totalitarian Social Credit System, which rewards or sanctions the behaviour of all Chinese citizens.
The current Maltese administration has unequivocally positioned Malta as a hub where new and innovative technologies, including blockchain and cryptocurrency, can thrive. Given the trend, is there a possibility that a Maltese version of the Social Credit System is also in the pipeline?
Considering that the major political parties on the island have, for years, sought personal information on their electorate (and that they maintain a strong grip on most media outlets), what would stop them from using the acquired data to further enhance their means of indoctrinating the electorate? One could dismiss this as pure conjecture verging on conspiracy theory, but these questions are more than legitimate and are based on factual considerations.
What’s undeniable is the element of normalisation involved in this project. Over the years, CCTV surveillance has been adopted and even promoted by various parts of Maltese society, from local councils to concerned citizens. CCTV footage is today viewed as a means of protecting the environment, safeguarding it from vandalism or illegal dumping and as an enforcement technology to sanction and curtail traffic and parking abuses. The introduction of more pervasive forms of surveillance might at first cause some uproar but would eventually become part of daily practice.
Concerned individuals should ultimately be inquiring into the future of law enforcement and the use of public funds.
The finances allocated to projects like Safe City Malta should instead be redirected towards other—more positive—efforts at crime prevention. Wouldn’t Malta benefit more from a police force that offers better salaries and conditions to its members? Wouldn’t it be more effective to invest in their training and better working conditions? Better still, we would all benefit if Safe City Malta’s budget were instead spent on assisting individuals who are falling through the safety net and being pushed towards possible criminal activity.