- The Alan Turing Institute has outlined recommendations to protect UK AI research
- Nation-state threat actors pose a serious risk to Britain’s AI development
- Universities are increasingly targeted and need to strengthen their protection
The Alan Turing Institute has issued a report warning that ‘urgent action’ is needed to protect Britain’s ‘world-leading AI research ecosystem’.
The report calls for an urgent, coordinated response from the UK government and higher education institutions to improve the protection of the research sector. Its recommendations include creating a classified mapping of the higher education AI research ecosystem and providing guidance to universities.
UK higher education institutions are increasingly targeted by threat actors, with almost half experiencing a cyberattack every week. The report finds that nation-state actors have been using “espionage, theft and duplicitous collaboration” to try to keep pace with Britain’s research and development.
Cultural change
The rapid pace of AI research makes it vulnerable to state-backed threat actors who want to steal intellectual property and use it for malicious purposes.
Concerns were raised over hostile states potentially gaining access to “dual-use” technology, meaning tools that can be repurposed, or designed from the outset, for malicious activity, such as defensive tools converted to help attackers.
The report outlines the need for a culture change focused on building risk awareness and a security mindset, and on encouraging “consistent compliance” with guidelines and best practice.
The report also calls for tackling Britain’s AI skills gap by ensuring domestic talent is retained and by providing research security training for staff and research students. Research-intensive universities are also advised to set up research security committees to support risk assessments for AI researchers.
“Promoting AI research is rightly a top priority for the UK, but the accompanying security risks cannot be ignored as the world around us becomes increasingly unstable,” said Megan Hughes, Research Associate at the Alan Turing Institute.
“Academia and government must commit to and support this long-overdue culture change to strike the right balance between academic freedom and protecting this vital asset.”