Mashup Your RSS Feeds with Yahoo Pipes

RSS and XML feeds might be all the rage for keeping up with your friends’ self-absorbed blog entries or self-absorbed cat photos over at Flickr, and now Yahoo has rolled out a beta edition of a new hosted service called “Pipes.”

The basic idea behind Pipes is to offer users a visual interface for mixing, matching, and mashing up the various RSS and XML data sources available over the Internet to create new, highly personalized data feeds. Pipes can accept user input (like names, dates, numbers, and locations) and use it to filter information from a variety of sources, construct custom searches and queries, and integrate information from multiple sources into one concise RSS feed. Right now Pipes only outputs data in RSS format, but Yahoo hopes to expand output options to include badges, maps, and other forms of structured XML data; Yahoo also wants to add support for non-RSS input data, add more processing modules to transform and mutate fetched data, and offer deeper access to the Pipes engine.
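To make that concrete, here is a minimal sketch of the kind of work a Pipe performs: fetch a few RSS feeds, keep only the items matching a user-supplied keyword, and merge the survivors into a single feed. The feed URLs are placeholders, the code uses only Python’s standard library, and it stands in for Yahoo’s hosted, visually configured engine rather than reproducing it.

```python
# Hypothetical stand-in for a Pipe: fetch feeds, filter by keyword, merge.
# The URLs below are placeholders; substitute any real RSS 2.0 feeds.
import urllib.request
import xml.etree.ElementTree as ET

FEED_URLS = [
    "https://example.com/feed-a.rss",
    "https://example.com/feed-b.rss",
]

def fetch_items(url):
    """Download one RSS feed and yield its <item> elements."""
    with urllib.request.urlopen(url) as resp:
        root = ET.parse(resp).getroot()
    yield from root.iter("item")

def matches(item, keyword):
    """Filter step: keep items whose title mentions the keyword."""
    title = item.findtext("title") or ""
    return keyword.lower() in title.lower()

def mashup(urls, keyword):
    """Merge every matching item from every source into one RSS document."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = f"Mashup: {keyword}"
    for url in urls:
        for item in fetch_items(url):
            if matches(item, keyword):
                channel.append(item)
    return ET.tostring(rss, encoding="unicode")

if __name__ == "__main__":
    print(mashup(FEED_URLS, "apartment"))
```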

Despite the visual configuration environment, Pipes in its current form is distinctly a tool for power users and programmers. (The name derives from a Unix concept that lets users chain the output of one command-line program into the input of the next, performing complex tasks in a single step; the sketch below shows the idea.) But as the number of pre-built custom Pipes grows, many of them should be useful to (and, hopefully, tweakable by) the sort of folks who are comfortable with, say, making email filters or constructing search queries.
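For readers who haven’t met Unix pipes, here is a hypothetical sketch of the chaining idea in Python: each stage consumes the previous stage’s output, so small single-purpose steps compose into one task, which is the same shape a Pipe gives to feed processing.

```python
# Three tiny stages, chained the way `grep a | sort | uniq` is in a shell.
lines = ["beta", "alpha", "beta", "gamma"]

def grep(items, pattern):
    """Pass through only the items containing the pattern."""
    return (x for x in items if pattern in x)

def sort(items):
    """Consume everything upstream and emit it in order."""
    return iter(sorted(items))

def uniq(items):
    """Drop items already seen."""
    seen = set()
    for x in items:
        if x not in seen:
            seen.add(x)
            yield x

# Each call feeds the next, like `|` on a command line.
print(list(uniq(sort(grep(lines, "a")))))  # ['alpha', 'beta', 'gamma']
```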

Current Pipes examples pull together all of Yahoo’s official blogs, offer an apartment search capability, find online pictures taken near a particular place, and aggregate news alerts. If you’re an RSS fiend, Pipes may be worth checking out; if you’re an XML maven, Pipes might be an interesting forerunner of “Web 2.5.”

Geoff Duncan