In January, Microsoft's generative AI image maker Designer was reportedly used to create explicit deepfake images of pop artist Taylor Swift that later went viral on X (formerly Twitter). While Microsoft stated it found no evidence that Designer was actually used to make those images, other media reports claimed that the company did make changes to Designer to prevent it from generating such images. On Thursday, the Microsoft security blog posted a new entry offering more details on how the…

Read the full article at Neowin