So, I’ve been quite busy recently.

I’ve made some decisions in my personal life that have resulted in a bit of a change of direction and focus for me. 

It’s been exciting, and has necessarily meant some changes. This has given me the opportunity to “sharpen my tools”, and so I’ve been getting around to playing with a bunch of technologies that have always been on my “weekend project” list but never made it to the top of the priority pile.

This isn’t directly related to the title of this article, but it provides some context: as part of one of these projects, I needed to populate a database with the contents of a bunch of rather unwieldy CSV files (inside a Docker container, so that their contents could be exposed by a Node.js/Express REST API, but I digress).

It was while reading the various man pages, before launching into writing a script, that I found this little gem. It meant I could do the import and provision my Docker instance straight from the /docker-entrypoint-initdb.d SQL scripts.

First, create your tables in the normal way, defining the data types that are represented in your CSV file. For example:

CREATE TABLE locations (
   location  VARCHAR(128),
   latitude  DECIMAL(11,8),
   longitude DECIMAL(11,8)
);

Into this table, as you might expect, you’d want to import a long list of location coordinates from a CSV file structured as follows:

location,latitude,longitude
"Oxford",51.7520,-1.2577
"Edinburgh",55.9533,-3.1883

Now, in your SQL script, execute the following statement:

LOAD DATA LOCAL INFILE 'locations.csv'
INTO TABLE locations
FIELDS
  TERMINATED BY ','
  ENCLOSED BY '"'
LINES
  TERMINATED BY '\n'
IGNORE 1 ROWS (location, latitude, longitude);

This tells MariaDB to read each line of locations.csv into the locations table, skipping the first line (which contains the header).
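Putting it all together, the whole provisioning step can live in a single init script. Here’s a minimal sketch of what that might look like; the script name, database name and CSV path are assumptions, and depending on your configuration you may also need local_infile enabled on the server. The official MariaDB image runs any .sql files it finds in /docker-entrypoint-initdb.d, in alphabetical order, the first time the container initialises its data directory:

-- /docker-entrypoint-initdb.d/01-locations.sql (hypothetical name)
CREATE DATABASE IF NOT EXISTS api;
USE api;

CREATE TABLE locations (
   location  VARCHAR(128),
   latitude  DECIMAL(11,8),
   longitude DECIMAL(11,8)
);

-- The CSV must be readable from inside the container too,
-- e.g. mounted or copied in alongside the init scripts.
LOAD DATA LOCAL INFILE '/docker-entrypoint-initdb.d/locations.csv'
INTO TABLE locations
FIELDS
  TERMINATED BY ','
  ENCLOSED BY '"'
LINES
  TERMINATED BY '\n'
IGNORE 1 ROWS (location, latitude, longitude);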

This little trick meant I was able to provision my API’s backend quickly and easily, without having to hack together some arduous import script.

Hope you find this useful too!

There have been a lot of changes recently with Flickr, and from February, free users with over 1000 photos are going to start seeing their old photos being deleted. Premium membership has also seen a sharp increase in price.

So, this seems like an opportune moment to move my photos off the platform – I’ve got something approaching 3,000 photos on there, and while I still have the originals, I’ve carefully sorted them into albums, so it would be a shame to lose that organisation.

A previous attempt at an import tool connected over the API, but it broke some time ago when Flickr changed their authentication mechanism, and honestly I’ve not had the time to fix it.

Thankfully, Flickr now offers a full data export via your account page. This export consists of a bunch of zip files containing all of your media, as well as handy .json dumps of all the image metadata. Using this seemed like a much better approach than fighting with Flickr’s API again.

Usage

The new tool is a Known console plugin, so unlike the previous tool, you’ll need to install it into your ConsolePlugins directory.

Next, you need to request and download all of your data from Flickr; do this via your account page.

Once you have your .zip files, place them in a directory that your Known install can access.
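For example (the paths here are only assumptions, and the zip file names will vary from export to export):

mkdir -p /var/www/known/flickr-export
mv ~/Downloads/data-download-*.zip /var/www/known/flickr-export/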

Next, run the import from the console:

./known import-flickr username-to-import-to /path/to/flickr/export

Where username-to-import-to is the user whose stream these photos and videos will appear under, and /path/to/flickr/export is the directory into which you’ve put your .zip files.

There is no need to unzip these files ahead of time; the import tool will do that for you.

After you’ve run the import, assuming that there have been no errors, you should see all your photos and videos appearing on your stream!

» Visit the project on GitHub...

Ok, so earlier I talked a bit about a Flickr photo import script for Known.

Since self-dogfooding is important, I thought I’d point you guys to my new photos site, which is built from the script’s output.

My script imports all photos, videos and tags, and also stores collections and sets.

Known currently doesn’t have a concept of grouping things together (although this might be on the roadmap for the future), so for the time being I’ve stored them as GenericDataItem objects. The site’s custom theme uses these to render sets and collections in a sensible way.

Still tweaking, but I’m pretty pleased with the results so far!