
The Next Frontier of Military AI: Your Work Computer



You probably don’t imagine yourself as the target of espionage, but spying on employees is the next frontier of military AI. Surveillance techniques familiar from authoritarian dictatorships have now been repurposed to target American workers.

Over the past decade, a few dozen companies have sprung up to sell your employer subscriptions to services like “open source intelligence,” “reputation management,” and “insider threat assessment,” tools that were often originally developed by defense contractors for intelligence uses. As deep learning and new data sources have become available in recent years, these tools have grown dramatically more sophisticated. With them, your boss can use advanced data analytics to identify labor organizing, internal leaks, and critics of the company.

It’s no secret that large companies like Amazon already monitor unionization efforts. But the expansion and normalization of tools to track workers has drawn little comment, despite their ominous origins. If these tools are as powerful as they claim to be, or are even headed in that direction, we need a public conversation about the wisdom of transferring this informational ammunition into private hands. Military-grade AI was meant to target our national enemies, nominally under the control of democratically elected governments, with safeguards in place to prevent its use against citizens. We should all be troubled by the idea that the same systems can now be widely deployed by anyone able to pay.

FiveCast, for example, began as an anti-terrorism startup selling to the military, but it has turned its tools over to corporations and law enforcement, which can use them to collect and analyze all kinds of publicly available data, including your social media posts. Rather than simply counting keywords, FiveCast boasts that its “commercial security” and other offerings can identify networks of people, read text within images, and even detect objects, imagery, logos, emotions, and concepts inside media content. Its “supply chain risk management” tool aims to forecast future disruptions, such as strikes, for corporations.

Network analysis tools developed to identify terrorist cells can thus be used to identify key labor organizers so that employers can illegally fire them before a union is formed. The standard use of these tools during recruitment may also encourage employers to avoid hiring such organizers in the first place. And quantitative risk assessment strategies designed to warn the nation about impending attacks can now inform investment decisions, such as whether to divest from areas and vendors estimated to have a high capacity for labor organizing.

It is not clear that these tools can live up to the hype. For example, network analysis methods assign risk by association, which means that you could be flagged simply for following a particular page or account. These systems can also be fooled by fake content, which is easily produced at scale with new generative AI. And some companies offer sophisticated machine learning techniques, such as deep learning, to identify content that appears angry, which is supposed to flag complaints that could lead to unionization, even though emotion detection has been shown to be biased and built on faulty assumptions.

But the capabilities of these systems are growing rapidly. Companies are announcing that they will soon include next-generation AI technologies in their surveillance tools. The new features promise to make it easier to explore varied data sources through prompts, but the end goal appears to be a routinized, semi-automated, union-busting surveillance system.

What’s more, these subscription services work even when they don’t. It may not matter whether an employee labeled a troublemaker is genuinely disgruntled; executives and corporate security could still act on the accusation and unfairly retaliate. Vague aggregate judgments of a workforce’s “emotions” or a company’s public image are currently impossible to verify as accurate. And the mere presence of these systems likely has a chilling effect on legally protected behaviors, including labor organizing.

