
TurningArt lets you decorate without committing

Regretting that giant Bob Marley mural you bought in college? Sick of staring at the same framed photos of trees you’ve had since you moved in? Desperately need to class up your place? TurningArt’s artwork exchange service can solve your problems. Like GameFly or Netflix, the site lets users indulge their changing tastes without going broke.

TurningArt ships you a framed work of art at regular intervals, depending on which subscription tier you select. At the top end, $19.99 a month gets you new art whenever you want; at the minimum of $9.99 a month, a new piece is shipped every three months. Those prices may sound steep, but your other options are shelling out far more to own prints or buying cheap, easily ruined posters. Your first shipment also includes a custom-made black or gold frame.

Like Netflix, you build a queue of quality prints you’d like to hang for a while. The selection leans contemporary, but more traditional prints are available too. Each piece measures 12 by 16 inches, so it fits the frame you receive, and you can choose from hundreds of works that range in style, medium, and price. Wondering why price matters if you’re only renting? TurningArt also gives users the option of purchasing the artwork for keeps, and its rewards program can earn you considerable discounts.

TurningArt’s browsing options are set up so you can get a good look at each piece while learning a little about the artist and genre. You can also see other work by the artist and share the link on Twitter and Facebook.

What’s arguably most appealing about TurningArt is its gifting feature. Buying art for someone is always a risk, and a subscription eliminates the forced “Oh, I love it…” the next time you give a print.
