The AI Wild West: Unraveling the Security and Privacy Risks of GenAI Apps

Generative AI use within the workplace has exploded, with an estimated quarter of employees currently using or having tried AI at work. However, hard data on which specific apps employees are using, and what they are using them for, has been difficult to come by. That is why we conducted our own research into a random sample of 1,000 enterprise employees who had used at least one GenAI app in the previous three months.

GenAI users go ‘all in’

We found that, on average, once employees start using GenAI they tend to go ‘all in’, uploading data to 8.25 apps every month. Some 18.9% of these users can be called ‘power users’ since they use more than 12 apps, while at the other end of the scale 10% use only one.

Monthly trends may be indicative of broader shifts, too. The number of apps used in July was 11% down on the previous month, suggesting that employees might be refining their use cases and settling on the apps they deem most useful.

There are an incredible (and probably unsustainable) 5,020 GenAI or GenAI-enabled tools currently in use. A quarter (25%) of these claim to help with content, including editing, creation, summarization, and translation; 18% are business tools such as Slack, Notion, and Grammarly; and 13% provide customer service help, such as streamlining support.

Perhaps unsurprisingly, the most popular app by far is ChatGPT, with 84% of our sample having used it in July. That makes it six times more popular than Google Gemini, the next most popular app, used by 14%. Others featuring strongly include Microsoft Copilot, Perplexity, and Claude.

Content is king for use cases

We also wanted to assess how employees are using GenAI apps, and for what purpose, by analyzing user prompts. It quickly became clear that ‘content creation, summarizing, or editing’ is the overwhelming business use case, with some 47% of prompts asking apps for help in this area. The next most popular is software engineering, with 15% of the share. Others featuring strongly include data interpretation, processing, and analysis (12%); business and finance (7%); and problem solving and troubleshooting (6%).

Nearly a third of apps declare they train on customer data

With this GenAI use comes potential privacy and security risk. We looked at all 5,020 applications in use and found that 30.8% declare they train on customer data, meaning that any sensitive data uploaded can be used to train their models. Furthermore, fewer than 1% have a ‘Trust Center’ where crucial security and privacy settings can be seen at a glance.

Best Practices for Organizations

When it comes to data privacy and AI, simply being aware of the risks isn’t enough. It’s therefore important to take concrete steps to protect valuable data assets. Here are some key best practices to implement:

  • Regular Audits: Conduct regular audits of the apps used within your organization to understand their data practices (a minimal audit sketch follows this list)
  • Clear Policies: Develop and enforce clear data usage and AI policies
  • User Training: Educate employees about the risks and best practices for using AI tools safely
  • Opt-Out: Where possible, opt out of content training to avoid exposing intellectual property. Some providers make this easier than others, but a good number of the most popular apps allow it, including OpenAI, Meta, Grammarly, Gemini, Perplexity, and Figma
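
To make the audit step concrete, here is a minimal sketch in Python. It assumes a hypothetical inventory file, genai_apps.csv, with illustrative columns (name, trains_on_customer_data, has_trust_center) that are not drawn from our research dataset, and flags any app that declares it trains on customer data or offers no Trust Center:

```python
import csv

# Hypothetical inventory of GenAI apps discovered in the organization.
# Illustrative columns: name, trains_on_customer_data (yes/no), has_trust_center (yes/no)
INVENTORY = "genai_apps.csv"

def audit_genai_apps(path: str) -> list[dict]:
    """Return apps that warrant review because they declare training on
    customer data or provide no Trust Center to verify their settings."""
    flagged = []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            trains = (row.get("trains_on_customer_data") or "").strip().lower() == "yes"
            trust = (row.get("has_trust_center") or "").strip().lower() == "yes"
            if trains or not trust:
                flagged.append({
                    "name": row.get("name", "unknown"),
                    "trains_on_customer_data": trains,
                    "has_trust_center": trust,
                })
    return flagged

if __name__ == "__main__":
    for app in audit_genai_apps(INVENTORY):
        print(f"REVIEW: {app['name']} "
              f"(trains on customer data: {app['trains_on_customer_data']}, "
              f"trust center: {app['has_trust_center']})")
```

In practice, the inventory would be populated from SaaS discovery tooling or network logs, and the flags checked against each vendor’s published privacy policy before deciding whether to block the app or opt out of training.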

As AI continues to evolve, so do the challenges it brings. But by staying informed and adopting smart, data-centric security strategies, it is possible to navigate these challenges and harness the power of GenAI while safeguarding data.