Creating An RSS Feed From A Google Spreadsheet

December 9, 2012 at 20:11 (Tech)

It is possible to use the Publish To Web option to create an RSS feed from a Google Spreadsheet. There are a few issues to note:

  • As with other Google products, your Google ID (email address) may be made publicly available in the published results.
  • Because the entire feed is generated whenever a change is made to the source document, all items will always have the same published date.

There is very little documentation explaining how any of the RSS options operate or what structure the source document needs to take.

I’ve deduced the following:

  • Channel.Title is taken from the Sheet Name
  • Channel.Link is a link to the default (HTML) published view of the source document
  • Channel.Description is not set
  • Channel.LastBuildDate is the last saved timestamp of the source document
  • Channel.managingEditor is the Google username

The items are created differently depending on whether you select Cells or List.

Cells traverses the sheet from left to right, top to bottom, and creates an item for each cell:

  • Item.Title is the cell reference, e.g. A1 or B2
  • Item.Description contains the cell contents

The List option allows more data to be added to the feed. Apart from Row 1, each row will generate a new item:

  • Column A is used for each Item.Title
  • Item.Description contains a string made up from the contents of the rest of the row (as sketched below)
    • The Row 1 value is used as the label; if it is blank an internal cell reference (e.g. _cokwr) will be used instead, so it is best to provide something
    • The label is followed by a colon and a space and then the contents of the next column, i.e. “Label: Value”
    • The remaining columns are added to the string in the same format, separated by a comma, e.g. “Label1: Value1, Label2: Value2”
  • The guid for each item is a link to the XML for the single cell
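To make that concrete, here is a rough PHP sketch of how the List option appears to build each item's description. This is my own reconstruction of the behaviour, not anything documented, and the function name and sample values are purely illustrative:

function buildDescription(array $headers, array $row) {
    $parts = array();
    // Column A becomes Item.Title, so the description starts from the second column
    for ($i = 1; $i < count($row); $i++) {
        // a blank Row 1 value falls back to an internal reference such as _cokwr
        $label = ($headers[$i] !== '') ? $headers[$i] : '_cokwr';
        $parts[] = $label . ': ' . $row[$i];
    }
    // label/value pairs are separated by a comma
    return implode(', ', $parts);
}

echo buildDescription(array('Film', 'Date', 'Venue'),
                      array('My Film', '9/12/2012', 'Keswick'));
// prints: Date: 9/12/2012, Venue: Keswick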

URL Format

The URL for the feed is made up from the document's base published URL, followed by one of the following options for feeds:

  • /basic/ – provides a basic Atom feed
  • /basic?alt=rss – RSS version

Or other output options, e.g.:

  • &output=pdf – PDF version of the spreadsheet
  • &output=text – text version
  • &output=html – HTML version

It is also possible to only display a range by adding the following parameter:

  • e.g. &range=A1%3AG5 for A1 to G5 (%3A is the URL encoding for a colon)
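As a quick sanity check, a few lines of PHP will pull the feed in and list the items. The URL below is just a placeholder; substitute your own spreadsheet's published feed URL with the /basic?alt=rss option appended:

// Fetch the published feed (RSS flavour) and print each item.
// The URL is a placeholder - use your own spreadsheet's published
// feed URL with /basic?alt=rss appended.
$feedUrl = 'https://spreadsheets.google.com/feeds/.../public/basic?alt=rss';
$feed = simplexml_load_file($feedUrl);
foreach ($feed->channel->item as $item) {
    echo $item->title . ' - ' . $item->description . "\n";
}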


Local Development For Heroku Facebook Apps

May 12, 2012 at 12:09 (Development, Facebook, Tech)

Disclaimer: At the moment this is mostly a brain dump, because I don't want to forget everything I seem to be having to do to get this set up.

When Heroku and Facebook joined together it seemed like a great way for me to get back into development without having to worry about sorting out some new hosting and other things that I didn’t have time or money to invest in just to play around some more.

Heroku have a good getting started tutorial and sample app:  https://devcenter.heroku.com/articles/facebook

which was all going fine until “For development, you’ll need to register another app with Facebook. Unlike the first app, this one will not run on Heroku, but instead will run on your local workstation”. This did make me wonder if both Heroku and Facebook are going about things the wrong way. Shouldn't the first step be teaching people how to develop locally and then deploy live? I'd have hoped that Facebook would do more to support development versions of apps, instead of making you duplicate everything manually. Interestingly, I just read this blog about the lack of fundamentals in web programming education, which I completely agree with.

The first problem I had was setting the Facebook app environment variables. This meant setting up a virtual host, which was back to all the things I didn't know about and wanted to avoid. I followed the instructions but it didn't seem to work.

I hacked my way around this to get things working by adding the following code to the top of the AppInfo.php file:

// Quick local hack: fake the Heroku config vars when running on localhost
if ($_SERVER['HTTP_HOST'] == 'localhost') {
    putenv("FACEBOOK_APP_ID=xxxxxxxxxxxxxxx");
    putenv("FACEBOOK_SECRET=xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx");
}

Not nice or recommended, but it allowed me to make some progress.
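If I were doing it again, a slightly cleaner hack (still just a sketch, and not anything the tutorial recommends; the local_env.php name is my own invention) might be to keep the values in a local file that never gets committed:

// local_env.php (kept out of git via .gitignore) would contain the
// putenv() calls above with the real values; only load it if it exists
$localEnv = dirname(__FILE__) . '/local_env.php';
if (file_exists($localEnv)) {
    require $localEnv;
}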

Although not much progress: I was now getting through to Facebook but not getting much further, failing to authorise with no clues as to why. I can't remember the exact details now, but I somehow debugged the code in FBUtils/login.php to find out why nothing was returning from the cURL request (something else I knew nothing about). It turned out the problem was that the app was not being accessed via https, something all Facebook apps now need to do.
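For what it's worth, the thing that would have saved me hours is checking curl_error() when a request comes back empty. A generic sketch (not the actual FBUtils code) looks something like this:

// Generic sketch: make cURL failures visible instead of silently
// getting nothing back from the request
$ch = curl_init('https://graph.facebook.com/oauth/access_token');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$result = curl_exec($ch);
if ($result === false) {
    // e.g. SSL certificate problems, or the app not being served over https
    echo 'cURL error: ' . curl_error($ch);
}
curl_close($ch);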

The next step was therefore to set up SSL on my local WAMP setup. I found several different articles on how to do this and tried a few, all without any success. Finally I found this http://www.phpjoel.com/2011/04/07/installing-ssl-using-openssl-on-a-wamp-localhost/ which was not only the simplest example, it also seemed to work.

I had to debug one final problem by running httpd from the command prompt, which told me there was a syntax error on the CustomLog line in httpd-ssl.conf; I added in the missing quote and everything now seemed to be working. When I say working, I mean working a little more: I could now access my localhost via https and the Facebook app was attempting to verify. Unfortunately I was now getting errors back from Facebook: “API Error Description: The specified URL is not owned by the application”. From the URL it looked like this was still trying to access http rather than https, and I just needed to update the configuration settings of the app. I've tried changing the URL but it's still not working. I'm hoping this is just a matter of waiting several minutes for the change to propagate to all servers, but it's been a while now and it's still not working. Considering I first started with Heroku in February, I'm pleased to finally have it working this far and will have to come back to work out what's wrong with the configuration some other time.


Nearly There

March 25, 2012 at 19:31 (Misc, Tech)

It can be difficult starting things but it's also quite hard to completely finish something. I've noticed this myself quite a few times recently in a variety of different ways. The bulk of the work gets done but there are always one or two little things that seem really difficult to tie down and complete. Other things are deemed not that important and get pushed back and often forgotten about. The strange thing is that it's often these extras that provide the final polish and make the difference between good and great.

It is really great to see BBC iPlayer finally arrive on the Xbox. I've not used it much but it seems like one of the best implementations of the service. There are of course small problems which I can't help but notice, and I wonder if these fall into that category of not really required but would make such a big difference. Firstly, a problem across all of the Metro UI is the lack of full support for the Xbox controller. It annoys me greatly having to move up and down between different carousels when every user has access to a controller with two analogue sticks and shoulder buttons; the navigation should be so much smoother. The other “lack of polish” is not having a unified account for iPlayer across all devices, and the lack of playlists of any kind. Wherever you come across a programme you should be able to add it to a playlist so you can watch it on any device later. I've lost count of the number of times I've said “oh I must catch up with that on iPlayer later” and then never have.

Draw Something is another fantastic piece of software, although the most surprising thing about it is that it hasn't been done earlier. The game is (probably purposely) simplistic, incredibly successful and enjoyable to play, but it would be so much better with some form of chat or commenting after each round. No doubt this will come in future versions, possibly a premium version, and perhaps that is the best way of developing software these days: get the basic mechanics working, released and popular, and then add the final polish later.


Syncplicity +1

September 17, 2011 at 17:49 (Tech)

I mainly use Syncplicity as a backup tool, which isn't really its main raison d'être, but it works really well. All the folders I select are uploaded to the cloud whenever I change them. This means I can also download these files from any connected device. The default option of fully synchronised folders (as opposed to back up only) would also download any uploaded files back to your original machine. Whilst I can understand how useful this may be, I really didn't like the idea of my cloud files automatically replacing local files; what if my account got hacked? There are backups and revisions so it probably wouldn't be the end of the world, but it didn't sit easy with me.

I use the “fully synchronised” option on one folder on my computer and this acts as a (sorry, I can't think of another word for this) dropbox, allowing me an easy way to send files back to my machine over the web interface.

I wasn't quite sure what would happen when I added another computer into the mix. I wanted the same set up, to be able to back my files up to the cloud. After installation and logging into your account, you are presented with a list of folders Syncplicity already knows about. You can add these folders to the new computer, and when you do you are prompted for a location to store them. Unfortunately this defaults to “My Documents” (I hate things that use My Documents; it's supposed to be mine and you're trying to store things that may not be documents) but this can be changed. For “Fully Synchronised” folders it does what you expect: any file added to the folder on any of the machines, or via the web interface, will appear on all of the others. I added my ‘dropbox’ folder and, sure enough, all my files were downloaded to the new computer.

I'm not sure what I was expecting, but I was surprised when I added the “Back Up Only” folder from computer A to computer B and it started copying files across. This makes sense: I had told computer B that I wanted this backed-up folder, and so it adds the backup from the cloud. It wasn't quite clear what would happen next, and unfortunately Syncplicity's UI doesn't handle this setup very well. The “Back Up Only” folder from computer A is listed as “Fully Synchronised” on computer B, which means that if you make changes on computer B they are synchronised back to the cloud and vice-versa, but NOT back to computer A. This is probably the correct behaviour but it confused me for a while and perhaps needs to be handled by Syncplicity a bit better. The UI could be improved to deal with multiple machines, perhaps by listing backup and synchronised folders differently. It also gets potentially confusing if you do have a “Documents” folder on each machine and only want to back them up: all folders appear in the same list in Syncplicity, which means you have to go back into the client software to add a label. It would make more sense if non-synchronised folders were listed by their original source.

After a bit of experimenting I decided to have a “Syncplicity Backup” folder on each machine, each containing a backup of selected directories from the other computer. This means that I have a copy of those files on both machines, but if I do make changes to the backup I have to manually copy them back to the original source. This feels safer than automatically synchronising, plus it still gives me the benefit of not having to copy things to a USB drive or download them from the cloud.

I'd be more tempted to fully synchronise more folders if I wasn't so paranoid about accidentally or maliciously losing (or adding, for that matter) files in the web interface and having them appear on my machines. If only there was an option to not allow synchronisation of things changed via the web interface. Even a prompt or preview of incoming changes would be a huge improvement.

I'm still really impressed with Syncplicity, even if I'm not using it as it was intended. I think I just need to work out which folders I may need where.


Backing Up

September 4, 2011 at 14:23 (Tech)

One thing I was pleased about during my recent laptop problems was that I felt fairly confident that, in the worst-case scenario, I would be able to reformat and retrieve all my data from various backups. I've always known I should back up, but until relatively recently this was a somewhat sporadic and very manual process. Hopefully I now have a much better process in place.

Firstly, I have a free account with Mozy which allows me to back up up to 2GB of data online. The software runs every night, checks the file locations I've selected and saves any changes since the last backup.

Earlier this year I started using Syncplicity, initially as a replacement for Dropbox (which I really ended up disliking a lot), but I now find it acting as a reassuring safety net. Like Mozy, you select which folders and files you want to sync, and Syncplicity runs in the background and updates any files as you change them. This means you also have version control on all your files, the option to view them online on any computer, and also to sync them to other computers if you want. You can also edit the files online and have them sync back to your own machine (although I don't use this option). Another great feature of Syncplicity is that you can enable it for Google Docs to create local backups of all your Google Documents.

I'm not entirely sure if I really need to be using both Mozy and Syncplicity for the same files, but I am. I like the syncing of Syncplicity and the fact that I can download the full backup set from Mozy.

Because I'm only using the free accounts with 2GB of space, I don't use these services to back up photos. I upload most of my photos to private albums on a Pro Flickr account. I'm also one of the few people who make use of Windows Briefcases to create a synced copy of my photo directories on an external hard drive, which I try to update weekly. Finally, every few months I burn all new photos to discs.

It seems to work for me. I was slightly concerned about everything being in the cloud but I think I'm getting over that now. I could do with a way of backing up Gmail, and now I need to work out how to introduce another computer into the process; hopefully Syncplicity will take care of that.



The Curious Case Of The Disappearing Disk Space

August 25, 2011 at 09:27 (Tech)

My laptop is a few years old now, so I don't expect it to run smoothly, but it seemed to be getting worse. It finally clicked that part of the problem might be a nearly full hard drive; I've been moving around lots of video and it's rare for my hard drive not to be practically full. So I started to tidy things up and clear some space: I removed games I was never going to play, uninstalled programs I didn't use, and cleared out quite a bit of space. The next day I didn't have as much free space as I thought I'd cleared, but I couldn't be sure. I kept an eye on it and sure enough I was losing 5-10GB of disk space on a regular basis. I suspected some kind of virus, but scans found nothing. Could it be a problem with the disk? Too many fragmented files from moving huge videos around a lot? Disk cleaning tools cleared out space, but not GBs of space.

I found out that in Vista 15% of the disk is allocated to shadow copies and restore points, and there isn't an easy way to change this. This had to be what was taking up the space; 15% would be 43GB, which is quite a lot. So I restored to an earlier point, before I noticed the problem, and then cleared the subsequent restore points. I started making backups and removed the large video files, and finally I had 90GB free, which I was happy with even if the shadow copy took another 40GB of it. I figured now would be a good time to run the defragmenter on the disk, so I left it running overnight (and then some).

When the defragmentation was complete I had 5GB free. Something was still wrong. Time for more research. This forum post confirmed that it probably was a Volume Shadow Copy problem. Reading this article at ZDNet made me aware but cautious of the vssadmin command, and I found more details on vssadmin here.

Sure enough, running vssadmin list shadowstorage showed the following:

Used Shadow Copy Storage space: 198.061 GB
Allocated Shadow Copy Storage space: 200.924 GB
Maximum Shadow Copy Storage space: UNBOUNDED

Allocated storage space 200GB! Maximum space unbounded! It turns out that I wasn't alone in having this problem. I can't begin to imagine why anyone would want no limit to be set; surely that just means all of your disk space will eventually be used. Turning System Restore off and on again didn't seem to recalculate the maximum space limit, but it at least cleared the other restore points. Funnily enough, I don't think I've ever used System Restore apart from this week while I've been having this problem.

So I had to set it manually:

vssadmin resize shadowstorage /for=c: /on=c: /maxsize=20GB

20GB is 7% but still seems generous. After a reboot I unsurprisingly had an extra 200GB free, hallelujah!

It seems to be okay now, I hope it stays that way, although I don’t quite know what to do with all that space now.



A Simple Framework

April 16, 2011 at 16:32 (Development, PHP)

For many reasons I've been wanting to develop something new, and from scratch, for quite a while, mainly because most of the coding I do seems to be tweaking existing code or copying and pasting existing structures. After reading a few different tutorials on building an MVC framework, I thought I could join every developer and their dog and have a go at building a framework of my own. It was intended as an educational exercise for myself, so I've kept things as simple as possible, especially for the first version. It's not an MVC framework as I haven't yet included any support for models; at this stage it's Controller-Action-View, but it is more or less working now. I still need to finish it off and then decide how practical it is to actually build something with. Who knows, I may then even release it for scrutiny from others.

In true He-Man style, here are a few things I've learnt (or rediscovered) in the process:

  • mod_rewrite rules may well be working but just redirecting to the wrong place and producing page not found errors
  • setting <base href> won't work in IE if you have invalid HTML before it (during debugging)
  • There's no need to use the DIRECTORY_SEPARATOR constant (I didn't even know it existed) when building paths, as a forward slash will work on Windows
  • You can use reflection to check the number of required arguments when dynamically calling methods

e.g.

// Inspect the action method on the controller to see how many
// arguments it requires before dispatching to it
$method = new ReflectionMethod($controller, $action);
$requiredArgs = $method->getNumberOfRequiredParameters();
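The check can then be used before dispatching, something like this ($params here stands for the arguments parsed from the URL; the names are just what my code happens to use):

// Refuse to dispatch if the URL didn't supply enough arguments
// for the action method
if (count($params) < $requiredArgs) {
    die('Not enough parameters for action ' . $action);
}
call_user_func_array(array($controller, $action), $params);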


Creating A Big Buzz On A Small Budget

March 27, 2010 at 10:32 (Keswick Film, Tech)

Yesterday I wore my Keswick Film Club hat and attended a workshop to find out ways of using social media tools and new technology to promote events and build communities. I wasn't familiar with Christian Payne (aka Documentally), who was running the workshop, but it turned out I did know his work. He was involved with the campaign last year to restore Bletchley Park, which hit the headlines when Stephen Fry got involved and (accidentally) tweeted the birth of his child. I wasn't quite sure what to expect; I thought I knew a bit about social media already and the day might just be a basic introduction to Twitter, but it turned out to be much more than that.

It all took place in the Cornerhouse in Manchester, which seemed like a great place, and our lunch was fantastic; I wish I'd had time to look around a bit more but I hope to go back someday. As expected, most of the day focussed on Twitter, but as Christian/Documentally pointed out, the skills you need to make effective use of Twitter will transfer to any social media, including whatever may eventually replace Twitter, and more than likely your networks will migrate as well. One of the most interesting things was watching people who “didn't get Twitter” at the start of the day slowly come to understand what a wonderful and powerful thing (I wanted to say tool, but it's so much more than a tool) it is and embrace it. I really hope these people stick with it. I consider myself a geek and it took me a while to fully understand (I'm still not sure if I really do), and it took a car accident for Documentally to get it.

Here are some of the useful things I took away with me to make the most of Twitter:

  • Once again it’s all about who you know, build a network and interact with your network.
  • Make sure you have a URL and bio filled in so that when people stumble across you on Twitter (and that's perhaps the point of Twitter) they can find out who you are and what you do.
  • If you have an account for your organisation, use your real name on the profile.  It’s much easier to build relations with a person than a faceless organisation and people are less likely to be negative to a real person.
  • If you are an organisation and many people use the account put their details in the Bio and end each tweet with your initials (some apps will do this for you automatically).
  • Clicking your avatar takes you to the full-size picture, so use a good-sized image; then when people want a picture of you it's already there.
  • Use the same avatar everywhere, it’s still all about branding and you want people to recognise you wherever you are.
  • Similarly, don't change your avatar on a regular basis; a company doesn't change its logo on a weekly basis and it's no different here.
  • The mentions/replies page is the most important part of Twitter, you need to know what people are saying about you and don’t want to appear to ignore people.
  • Lists, it’s all about lists now:
    • Follow more people and use lists to filter them (you don’t even have to follow them to have them on a list).
    • Look what lists people are adding you to and see who else is on that list, somebody thinks you have something in common so maybe they should be in your network.
    • Follow other people’s lists that interest you.

Even for me, already knowing quite a bit, it was a lot to fit into one day, but that's not necessarily a bad thing. Quite a bit of time was also spent on audio and video blogging, which I didn't think I'd be that interested in, but I hadn't realised how easy it is to do and maybe it's something we should be considering at the Film Festival.

It was a really informative and enjoyable day and I think everyone who attended went away with lots to think about.


Oh, and just so I know where to find them, here's a list of some sites/services/apps that were mentioned and may be useful:

  • Xmarks
  • DABR
  • TweetReach
  • Posterous
  • Backblaze
  • qik
  • tubemogul
  • 12seconds
  • Audioboo
  • Scoopler


No No November

October 30, 2009 at 22:53 (Film, Games, Tech, TV)

It's nearly November and I was in danger of letting another month pass by without writing anything. This is therefore a rather pathetic attempt, so at least I can say I did write something, and who knows, perhaps if I just reduce the gap between each posting this will become a more regular occurrence.

So, last time I mentioned a few things I was going to write about. Firstly FlashForward: at the time I was just going to say how I thought the problem would be that, knowing how things were going to end, it would be difficult to expand beyond one season. It turns out that it's a bigger problem, because knowing the future eliminates any risk and therefore any investment in any of the characters, although considering you probably can't name any of the characters there are perhaps even bigger problems. Five episodes in, I'm still curious exactly how they are going to get to the FlashForward point, but I've also realised that the whole show is a bit rubbish. Enjoyable rubbish, though.

On the gaming front I'd been playing Crackdown and Elite Beat Agents. The latter is just weird, almost to the point of being just wrong, but it's an enjoyable rhythm game for the DS. My enjoyment of Crackdown went up and down quite a bit: in the early hours, before powering up any abilities, I really couldn't see what any of the fuss was about, but then as I gained more skills and started leaping around from building to building I began to enjoy it a lot more. I think a sign of a good game is when you start thinking about it in the real world, and after playing Crackdown for a few hours any cityscape starts to look like a playground. My main problem with the game was the dull and repetitive missions, which just got in the way of exploring and having fun.

Finally, (500) Days Of Summer: in a year of disappointing films (for me, anyway) I'm fairly sure that this is going to be my favourite film of 2009. I just liked everything about it, from first hearing the trailer, to the first words on screen (which annoyingly I can't remember now), to the soundtrack, and nothing, not even the ‘annoying’ sister, the kookiness or the ending, bothered me (but I did see the final ‘joke’ coming). Put simply, it's the kind of film I wish I'd made and therefore obviously wonderful.

I was also going to write about the PHPNW09 conference I attended earlier in the month, but I had little to add to what has already been said in the write-ups by loonytoons, lornajane and mheap, and the other comments on the PHPNW09 Joind.in page. It was a really great event, the talks were entertaining and informative, and if you weren't there you really should take a look at the talk slides and videos (when they go online). It also got me in the mood for more conferences and festivals. Tomorrow I'm off to hear Masaya Matsuura and Jonathan Smith speak at the GameCity event in Nottingham, and in November I'll probably be attending the Leeds Film Festival and Keswick Film Club's Ken Russell Weekend (so long as I'm brave enough to go out in public with my ‘Mo‘).
