XML News from Thursday, January 7, 2010

This morning, a simple practice exercise to get my toes wet. First, load my tweets from their Atom feed into eXist:

xquery version "1.0";
declare namespace xmldb="http://exist-db.org/xquery/xmldb";
let $collection := xmldb:create-collection("/db", "twitter")
let $filename := "elharo"
let $URI := xs:anyURI("file:///Users/elharo/backups/elharo_statuses.xml")
let $retcode := xmldb:store($collection, $filename, $URI)
return $retcode

Then generate HTML of each tweet:

xquery version "1.0";
declare namespace atom="http://www.w3.org/2005/Atom";
for $entry in document("/db/twitter/elharo")/atom:feed/atom:entry
return
  <p>{$entry/atom:updated/text()} {substring-after($entry/atom:title/text(), "elharo:")}</p>

Can I reverse them so they go forward in time? Yes, easily:

for $entry in reverse(document("/db/twitter/elharo")/atom:feed/atom:entry)

Now how do I dump that to a file? Maybe something like this?

xquery version "1.0";
declare namespace atom="http://www.w3.org/2005/Atom";
declare namespace xmldb="http://exist-db.org/xquery/xmldb";
let $tweets :=
  <html> {
    for $entry in document("/db/twitter/elharo")/atom:feed/atom:entry
    return
      <p>{$entry/atom:updated/text()} {substring-after($entry/atom:title/text(), "elharo:")}</p>
  } </html>
return xmldb:store("/db/twitter", "/Users/elharo/tmp/tweets.html", $tweets)

Oh damn. Almost, but that puts it back into the database instead of onto the filesystem. Still, I can now run a query that grabs just that one document and copy and paste the result, since there's only one. The first query gave almost 1000 results, and the query sandbox only shows one at a time.
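A reader might point out that eXist ships an optional file extension module that can write straight to the filesystem. I haven't tried it, and it has to be enabled in conf.xml; the namespace URI and the file:serialize signature below are taken from the eXist docs as best I can tell, so treat this as an untested sketch rather than a working query:

xquery version "1.0";
declare namespace atom="http://www.w3.org/2005/Atom";
(: assumption: the file module is enabled in conf.xml under this namespace :)
declare namespace file="http://exist-db.org/xquery/file";
let $tweets :=
  <html> {
    for $entry in document("/db/twitter/elharo")/atom:feed/atom:entry
    return
      <p>{$entry/atom:updated/text()} {substring-after($entry/atom:title/text(), "elharo:")}</p>
  } </html>
(: file:serialize should write the node to the given path and return a boolean :)
return file:serialize($tweets, "/Users/elharo/tmp/tweets.html", ())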

Tomorrow: how do I serve that query as a web page?