
There’s no doubt that AI will play an essential role in our future, becoming ever more integrated into our lives. Of course, anyone who has seen The Matrix or Terminator movies will know that giving AI too much power results in the end of civilization as we know it. No one wants that to happen, so Microsoft has published new guidelines to avoid it.

Okay, that is a flippant introduction, but there are genuine concerns about AI. While the sci-fi scenarios may seem fanciful, in the wrong hands AI could be harmful. Furthermore, we still do not know the endgame of the technology or what happens as it gains greater autonomy.

Having clear standards for the development of AI solutions is one way to ensure the potential risks are minimized, and that is what Microsoft’s guidelines set out to achieve. Created in collaboration with Boston Consulting Group (BCG), the new document lays out development guidelines for AI technology.


Called the “Ten Guidelines for Product Leaders to Implement AI Responsibly,” the document aims to establish core standards that help us avoid a machine uprising. Or, more realistically, prevent AI from being used for nefarious activities such as cybercrime.

Guidelines

In a blog post, Alysa Taylor, Corporate Vice President of Industry, Apps, and Data Marketing at Microsoft, explains why Microsoft authored the document:

“As AI becomes more deeply embedded in our everyday lives, it is incumbent upon all of us to be thoughtful and responsible in how we apply it to benefit people and society. A principled approach to responsible AI will be essential for every organization as this technology matures. As technical and product leaders look to adopt responsible AI practices and tools, there are several challenges including identifying the approach that is best suited to their organizations, products and market.”

There are 10 guidelines in total, which Microsoft groups into the following three categories:

  1. Assess and prepare: Evaluate the product’s benefits, the technology, the potential risks, and the team.
  2. Design, build, and document: Review the impacts, unique considerations, and the documentation practice.
  3. Validate and support: Select the testing procedures and the support to ensure products work as intended.

You can read and download the document from Microsoft Azure here.

Tip of the day: Due to the various problems that arise with microphones, it can often be necessary to perform a mic test. Microsoft’s OS doesn’t make it especially intuitive to listen to microphone playback or play the microphone through speakers. In our tutorial we show you how to hear yourself on mic with just a few clicks.
