Amazon Cloud Drive gets unlimited music storage, iPad support

When Amazon launched its Cloud Drive service in March, the basic plan came with 5GB of free space for storing music, along with a music player for the Web and Android devices. This week, Amazon rolled out a promotion that gives Cloud Drive users unlimited music storage with the purchase of the 20GB plan, which costs $20 a year. The promotion covers MP3 and AAC music files. In addition, any Amazon MP3 purchases are stored on the Cloud Drive for free, so they don't count against the storage limit.

Amazon hasn't indicated when it will end the limited-time promotion or change the terms of the service. Any customers who previously upgraded to the 20GB plan will receive the unlimited music promotion as well. Amazon has also rolled out an iPad-optimized version of the Cloud Player that lets users stream music directly through the Safari browser. There's no app to download; the interface is entirely Web-based. The iPhone and iPod Touch are not currently supported.

The news comes on the heels of Spotify's recent announcement that its music streaming service is coming to the U.S. Amazon is also competing with Google Music Beta, a digital locker for music, and Apple's upcoming iCloud service. Google's Music Beta can store up to 20,000 songs, but music labels have been reluctant to work with the service over licensing negotiations. iCloud, an upgrade from the MobileMe service, launches with iOS 5 in the fall and lets users store music, photos, books, and email online. Apple will also offer iTunes Match, a $24.99-a-year service that matches up to 20,000 songs from a user's library and makes them available in the cloud.

Mike Flacy