
Microsoft’s New AI Model Can Make You Talk From Photos With Real Facial Behavior: Should We Worry? – News18





Microsoft’s growth with its AI tools is reaching scary levels now

Microsoft is building new AI tools that bring their own concerns, but is the company confident it can regulate their misuse?

Microsoft is hell-bent on keeping its AI revolution moving at a fast pace, so much so that its new AI models have become a concern. The company has invested billions into OpenAI to advance in this field, and now a new AI framework is capable of making your photos talk and even display human-like behaviour.

The new AI model, called VASA-1, is able to master lip sync with these photos, and the motion is so convincing you would hardly believe they are not videos. If you don’t believe us, check out the demo of the Mona Lisa painting; the expressions you see have been generated using VASA-1. Some of the AI-generated clips show the scary potential of the technology and what it could do in the wrong hands.

And that is where we now sit with the evolution of AI: models trained on vast resources from the internet are advancing at an unimaginable pace. AI has also become a favourite tool for hackers, with AI deepfakes and voice cloning demonstrating both eerie accuracy and the dangers they pose.

But Microsoft seems to have taken all these issues to a new level with VASA-1, and regulating these tools is paramount, or else we are looking at a full-scale mutiny by AI tools that does not bode well for the future.

The company feels that the quality of these clips is still not authentic enough, noting that “videos generated by this method still contain identifiable artifacts, and the numerical analysis shows that there’s still a gap to achieve the authenticity of real videos.” Microsoft is equally aware of the potential for misuse but seems intent on pressing ahead, as it believes the positives far outweigh the risks.


