
Bedrock GenAI Infrastructure Subjected to LLM Hijacking


Amazon Bedrock environments supporting the large language models behind generative artificial intelligence tools have been increasingly accessed by threat actors using exposed AWS access keys, SC Media reports. One actor's LLM hijacking activity was aimed at enabling prompts for sexual roleplay.

"We remain committed to implementing strict policies and advanced technologies to protect users, as well as publishing our own research so that other AI developers can learn from it. We appreciate the research community's efforts in highlighting potential vulnerabilities," said an Anthropic spokesperson.

Amazon said, "AWS services are operating securely, as designed, and no customer action is needed. The researchers devised a testing scenario that deliberately disregarded security best practices to test what may happen in a very specific scenario. No customers were put at risk. To carry out this research, security researchers ignored fundamental security best practices and publicly shared an access key on the internet to observe what would happen. AWS, nonetheless, quickly and automatically identified the exposure and notified the researchers, who opted not to take action. We then identified suspected compromised activity and took additional action to further restrict the account, which stopped this abuse. We recommend customers follow security best practices, such as protecting their access keys and avoiding the use of long-term keys to the extent possible. We thank Permiso Security for engaging AWS Security."

Meanwhile, Permiso p0 Labs Senior Vice President Ian Ahl has urged AWS customers to monitor the use of long-term access keys with certain APIs to curb the risk of LLM hijacking.
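As a rough illustration of that recommendation, the sketch below queries CloudTrail for Bedrock model-invocation events and flags calls made with long-term IAM access keys (key IDs beginning with "AKIA") rather than temporary credentials. It assumes CloudTrail logging is enabled in the account; the watched event names and the "AKIA" heuristic are illustrative assumptions, not a detection rule endorsed by Permiso or AWS.

```python
# Hypothetical sketch: surface Bedrock invocation calls made with long-term
# IAM access keys. Assumes CloudTrail is enabled; event names and the
# "AKIA" prefix heuristic are illustrative assumptions.
import boto3
from datetime import datetime, timedelta, timezone

# Assumed Bedrock APIs of interest for LLM-hijacking monitoring.
WATCHED_EVENTS = ["InvokeModel", "InvokeModelWithResponseStream"]


def find_long_term_key_usage(hours: int = 24):
    """Return Bedrock invocation events made with long-term access keys."""
    client = boto3.client("cloudtrail")
    start = datetime.now(timezone.utc) - timedelta(hours=hours)
    findings = []

    for event_name in WATCHED_EVENTS:
        paginator = client.get_paginator("lookup_events")
        pages = paginator.paginate(
            LookupAttributes=[
                {"AttributeKey": "EventName", "AttributeValue": event_name}
            ],
            StartTime=start,
        )
        for page in pages:
            for event in page.get("Events", []):
                key_id = event.get("AccessKeyId", "")
                # Long-term IAM user keys start with "AKIA";
                # temporary STS credentials start with "ASIA".
                if key_id.startswith("AKIA"):
                    findings.append(
                        {
                            "event": event["EventName"],
                            "access_key": key_id,
                            "user": event.get("Username"),
                            "time": event["EventTime"].isoformat(),
                        }
                    )
    return findings


if __name__ == "__main__":
    for finding in find_long_term_key_usage():
        print(finding)
```

In practice, any hits from a check like this would be a starting point for rotating the exposed key and reviewing the associated identity's activity, in line with the best practices AWS cites above.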
