Surveillance Now: Is Technology Becoming More Powerful Than The Law?

By Dr Evronia Azer, Centre for Business in Society

A watchdog has recently warned[i] that rules are not keeping up with high-tech surveillance systems. The rapid advancement of surveillance and monitoring technology requires continuous development of the legal frameworks governing the testing and use of such technology worldwide. This advancement comes in many shapes and forms, as technology’s capabilities, tools and the ways in which they are used continue to change quickly, creating much controversy. In his annual report[ii], the Biometrics and Surveillance Camera Commissioner described how extraordinary the capabilities of artificial intelligence systems and facial recognition technology have become, and highlighted a clear need for legislation to guide the use of such systems and provide greater accountability.

Complex Surveillance Systems

One example of the advances in how surveillance is deployed is the use of automatic number-plate recognition (ANPR) cameras in the UK to “enforce low-emission zones and check cars have insurance”[iii], scanning thousands of lanes and storing up to 80 million readings daily. This is considered the largest non-military database in the UK, yet there is no proper legal framework governing the capture and use of this data.

Car number plates are not the only means by which artificial intelligence technologies are used for automatic identification. Highly accurate facial recognition technology is now used around airports and could enable people to travel without passports, simply walking through the gates and being identified by their biometrics. This is thought to allow travellers to dispense with paper passports in the future[iv].

Facial recognition technology is used not only in airports but also on streets in different parts of the world[v], in an attempt to identify individuals automatically and quickly. For example, video surveillance was used during the March 2019 climate protests in London[vi], which were attended by thousands of children. This raises questions about children’s privacy and their inability to use existing data protection regulation in their favour because of their age. Moreover, because the incident only came to light after a Freedom of Information (FOI) request, it raises questions about the extent to which surveillance is used, and what data is recorded and kept, without individuals’ knowledge. There are also questions about the accuracy of this facial recognition technology: black people, and black women in particular, face the highest error rates of recognition[vii], potentially making the algorithms racially discriminatory.

Are We Ever in Control?

The power of surveillance systems also goes beyond facial recognition. An extreme example from workplaces around the world is employers using devices that track employees’ brain waves, allegedly to monitor their “fatigue and offer brain-wave tracking as part of wellness programmes designed to decrease stress and anxiety”[viii]. This goes beyond observing what employees do: it tracks what they feel or think, as if nothing can be hidden any more, stripping individuals of control over their own bodies and thoughts. Such data, stored over many years, could also be analysed to draw conclusions about a person’s life and development.

To this end, a vast amount of data is being collected all the time through different means: drones, CCTV, dashcams, sensors, GPS tags and more. The power of technology itself is not the problem. We need these advances in science and technology for the common good, such as the use of assistive technology to aid people with visual impairments[ix]. However, we always need to ask whether the surveillance employed is justified: what are its extent, purpose and mechanisms?

Legislation Needs to Develop Faster

Before surveillance technologies are used, legislation at multiple levels, local and then international, should be suitable and fit for purpose, protecting the interests and rights of individuals. Advocacy groups need greater scope to campaign for corporate social responsibility and to call out violations of human rights through surveillance. The public, and especially the most vulnerable and targeted groups such as refugees[x], migrants and activists, should not become the easiest targets of surveillance technologies deployed without legal control. Sales of these technologies by the private sector around the world should be scrutinised, to ensure they are not misused to manipulate vulnerable groups. We need to make sure that organisations’ use of such advanced monitoring technology is always guided by appropriate and up-to-date legal frameworks and governance. These frameworks should always be human-centric, serving the interests and rights of individuals first, in an unbiased way.

As the Biometrics and Surveillance Camera Commissioner stresses[xi], we need to balance “the technological possibilities with proper legal accountability in a way that meets the legitimate expectations of the public”, which will continue to be a regulatory challenge. With enough political will and expert intervention, this challenge can be overcome. Without the necessary controls and norms in place, however, this area will continue to expand at an incredible rate. Regulation, transparency, governance and legal accountability cannot be allowed to fall behind if the individual and society are to be safeguarded.

Through understanding the impact of organisations’ activities, behaviours and policies, the Centre for Business in Society at Coventry University seeks to promote responsibility, to change behaviours, and to achieve better outcomes for economies, societies and the individual. 

References

Gayle, D., 2023. Met police illegally filmed children as young as 10 at climate protest. [Online]
Available at: https://www.theguardian.com/world/2022/dec/05/met-police-illegally-filmed-children-as-young-as-10-at-climate-protest
[Accessed 22 February 2023].

Marcus, A. D., 2023. When Your Boss Is Tracking Your Brain. [Online]
Available at: https://www.wsj.com/articles/brain-wave-tracking-privacy-b1bac329?mod=e2tw
[Accessed 22 February 2023].

Najibi, A., 2020. Racial Discrimination in Face Recognition Technology. [Online]
Available at: https://sitn.hms.harvard.edu/flash/2020/racial-discrimination-in-face-recognition-technology/
[Accessed 22 February 2023].

RTVE, 2023. In Las Rozas (Madrid) we see how one of the most advanced Artificial Intelligence video surveillance systems in Spain works. [Online]
Available at: https://twitter.com/rtve/status/1625610553317007368
[Accessed 22 February 2023].

Sampson, F., 2023. Biometrics and Surveillance Camera Commissioner: report 2021 to 2022, s.l.: Gov.uk.

Staniforth, S., 2023. Angela Rayner is asked whether she supports the policy of tagging asylum seekers. [Online]
Available at: https://twitter.com/SaulStaniforth/status/1625055576408326147
[Accessed 22 February 2023].

Topham, G., 2023. March of the robots: how biometric tech could kill off paper passports. [Online]
Available at: https://www.theguardian.com/politics/2023/feb/03/biometric-technology-paper-passports-redundant
[Accessed 22 February 2023].

Vallance, C., 2023. [Online]
Available at: https://www.bbc.co.uk/news/technology-64583997
[Accessed 22 February 2023].

Zhao, Y., Wu, S., Reynolds, L. & Azenkot, S., 2018. A Face Recognition Application for People with Visual Impairments: Understanding Use Beyond the Lab. CHI ’18: Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, Volume 215, pp. 1-14.


[i] (Vallance, 2023)

[ii] (Sampson, 2023)

[iii] (Vallance, 2023)

[iv] (Topham, 2023)

[v] (RTVE, 2023)

[vi] (Gayle, 2023)

[vii] (Najibi, 2020)

[viii] (Marcus, 2023)

[ix] (Zhao, et al., 2018)

[x] (Staniforth, 2023)

[xi] (Sampson, 2023)
