
Shaky internet? Have someone else upload your files to Google’s Cloud Storage platform

Google is rolling out a solution that allows users to send their physical media to a third party, which then uploads the data to the cloud, as announced on its Cloud Storage platform page. Users have always been able to upload data to Google Cloud Storage on their own, but the third-party option lets people with unreliable, slow, or expensive internet connections make use of the cloud.
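For anyone whose connection can handle a do-it-yourself upload, the sketch below shows roughly what that looks like with the official google-cloud-storage Python client library; the bucket and file names are hypothetical placeholders, and it assumes the library is installed and credentials are already configured.

# Minimal sketch of a self-service upload to Google Cloud Storage.
# Assumes application default credentials are set up; bucket and file
# names are hypothetical placeholders, not anything Google prescribes.
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("my-backup-bucket")            # hypothetical bucket
blob = bucket.blob("backups/family-photos.tar.gz")    # destination object name
blob.upload_from_filename("family-photos.tar.gz")     # local file to upload

print(f"Uploaded to gs://{bucket.name}/{blob.name}")

The offline import option simply hands that step to a third party when moving the bits over the wire yourself isn't practical.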

Data stored on physical media like hard disk drives (HDDs), tapes, and USB flash drives can be uploaded by third-party companies. Presumably, other types of physical storage would also be supported, since there are surely people out there with piles of floppy disks or Zip disks, but Google doesn’t officially list those.

Users can choose any third party that is able to get the data off their physical media and upload it, though Google officially lists only one service on its page: a company called Iron Mountain, for those in the USA. For those outside the US, Google simply lists that companies are “Coming Soon,” which means users in those parts of the world will need to find their own service that can extract and upload the data.

While this feature is fully supported by Google, the company points out that arrangements need to be made directly between the user and the third party performing the upload. Google is not involved in the process beyond hosting the files, and it notes that it “does not provide, support or endorse Offline Media Import / Export services, and does not receive a fee or commission from Offline Media Import / Export services.”

Google charges separately for its Cloud Storage platform based on the amount of data stored and the bandwidth used. Prices vary based on where in the world the bandwidth originates and the type of storage users need.

Dave LeClair
Former Digital Trends Contributor