Rev Dan Catt
All twelve Long Good Reads are available in the Newspaper Club Newsagent, either to buy or as a cheaper (i.e. FREE) download.

The Long Good Read #012

Final cover design for The Long Good Read, kinda fun this one, will explain it at some point.

In Chicago, surveillance video shows a thief pointing a small box-like device at a car door. Within seconds, the car unlocks, the alarm is disabled and the thief simply opens the passenger door and easily takes the valuables.

nevver:

Power eats the soul
"When the car was unveiled before the eyes of the public at the Tokyo Motor Show 1999, this car was universally scorned!

The audience found the design to be very eccentric and odd that it got to a point that people just didn’t understand it. It was perceived design wise as toy-like, boxy, and almost anti-designed. Okay and we do agree on boxy and toy-like, so what’s wrong with that? Wrongly to my belief, the 021c was quickly hidden away and not shown again."

(via Migurski /via Ten years later // FORD 021C by Marc Newson | Yatzer)

Making a network map of Guardian Tags

Using the JavaScript D3 library to plot the tags used on a week’s worth of Guardian news stories. You kind of have to connect all the nodes, throw them into a box and let the dynamics take care of the rest until it’s all settled.

This animation has been sped up; the whole process from start to settled took just over five minutes.
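
For the curious, the rough shape of the code looks something like this, using D3 v3’s force layout. This is a minimal sketch, not the actual code I ran: the tags.json file (an object of nodes and links) and all the force parameters are made up for illustration.

    // Sketch: each node is a tag, each link joins two tags that
    // appeared on the same story. tags.json is a stand-in filename.
    var width = 960, height = 600;

    var svg = d3.select('body').append('svg')
        .attr('width', width)
        .attr('height', height);

    d3.json('tags.json', function (error, graph) {
      if (error) throw error;

      // Throw everything into the box and let the physics settle.
      var force = d3.layout.force()
          .nodes(graph.nodes)
          .links(graph.links)
          .size([width, height])
          .linkDistance(40)
          .charge(-120)
          .start();

      var link = svg.selectAll('.link')
          .data(graph.links)
        .enter().append('line')
          .style('stroke', '#ccc');

      var node = svg.selectAll('.node')
          .data(graph.nodes)
        .enter().append('circle')
          .attr('r', 4);

      // Each tick nudges the layout towards equilibrium; we just copy
      // the computed positions onto the SVG elements.
      force.on('tick', function () {
        link.attr('x1', function (d) { return d.source.x; })
            .attr('y1', function (d) { return d.source.y; })
            .attr('x2', function (d) { return d.target.x; })
            .attr('y2', function (d) { return d.target.y; });
        node.attr('cx', function (d) { return d.x; })
            .attr('cy', function (d) { return d.y; });
      });
    });

Tweaking charge and linkDistance is most of the “art”; the tick handler just keeps the picture in sync with the simulation until it stops moving.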

Experimental Pre-“Podcast” Episode minus #2

An experimental pre-“podcast” while I figure out how to use all the tools. I wanted to aim for something like an audio scrapbook: part diary, part field recordings, and partly a way to capture the children playing.

I’m starting at episode -2 on the assumption that I’ll have got a bit better with the audio levels by Episode 1; at the moment some are too low, others are too high.

I’m aiming for one test “podcast” every two weeks until I hit Episode 1, and then weekly after that. I’m hoping that by doing it regularly I’ll eventually learn how to use the various audio tools I have kicking around.

Featuring…

  • Newspaper Club’s Things Our Friends Have Faved On The Internet.
  • Modesty & me playing a game of Magic: TCG.
  • Me heading off into town because the Internet is broken.
  • And a bedtime story with Isobel.
The Long Good Read #011

"Joy Divisualization"

The word polygon only reminded Alice of the night, the face reflected in the man’s arms and dreamed.

I’ve written before about Markov Chains, cut-ups and automated text generation, and about getting algorithms to do work for me in the name of art.

In both cases it’s an attempt at getting unblocked, at seeing things differently. Burroughs used cut-ups and fold-ins as a way of uncovering hidden meanings, taking sideways steps, routing around our meat-brain way of thinking; of course, in his case drugs also helped.

I ended my Markov article with “A Markov chain is a blunt tool, but an interesting starting point”, and Tom said something similar in his recent Sims “Infinifriends” post.

"Markov Chains, as Leonard has frequently pointed out, are not always the best way of generating text alone, especially when the corpus you’re working from isn’t particularly consistent. He is, of course, right. Still, I enjoy the mental leap readers make in order to make generative prose actually make sense, and for this project, I mainly wanted to get to scripts as fast as I could."

It’s that “mental leap” I find fascinating: how we manage to see patterns in things, or “pareidolia”, defined on Wikipedia as…

"[the] psychological phenomenon involving a vague and random stimulus (often an image or sound) being perceived as significant, a form of apophenia. Common examples include seeing images of animals or faces in clouds"

But we’re not the only things spotting patterns in noise; we’re now training computers to do the same thing. Facial recognition has come on in leaps and bounds. Or rather, we’ve simplified the complexities enough that it can now be done in the browser and in various realtime graphics libraries.

The general assumption for face-detecting software is that it’s going to be given an image with a face in it, and its job is then to detect that face. But what if you start giving it images that are obviously faceless? Like the Onformative GoogleFaces project, which presents a series of satellite images to the computer, which then spots faces in them.
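
Mechanically it only takes a few lines to point a detector at an arbitrary image these days. Here’s a rough sketch using tracking.js, one of the in-browser detection libraries, and not necessarily what Onformative used; the element id and the idea of feeding it a map tile are made up for illustration.

    // Assumes the page has already loaded tracking-min.js plus its
    // pre-trained face classifier data, and that an
    // <img id="satellite-tile"> element exists (both made up here).
    var tracker = new tracking.ObjectTracker('face');

    // Ask the classifier to scan the image, even though we know full
    // well there are no actual faces in it.
    tracking.track('#satellite-tile', tracker);

    tracker.on('track', function (event) {
      // Each rect is somewhere the classifier thinks it "saw" a face.
      event.data.forEach(function (rect) {
        console.log('face-ish thing at', rect.x, rect.y,
                    'size', rect.width, 'x', rect.height);
      });
    });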

GoogleFaces

"An independent searching agent hovering over the world to spot all the faces that are hidden on earth."

Now what if we could do the same with words?

The Markov chains I have spitting out mixed-up text from Jeff Noon’s book Channel Sk1n (and Warren Ellis’ Gun Machine) have been running for some time now. Of course they spit out gibberish, but at least it’s gibberish based somewhat on the style and words selected by the original author. Many times I’ve hit the “Remix” button and found little nuggets of gold in there.
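
For anyone who hasn’t met a Markov chain before, the whole trick fits in a few lines. This is a toy sketch of the technique rather than my actual remixer: record which word follows each run of words in the book, then wander through those recorded choices at random.

    // Build a table mapping each run of `order` words to the words
    // that follow it somewhere in the corpus.
    function buildChain(corpus, order) {
      var words = corpus.split(/\s+/);
      var chain = {};
      for (var i = 0; i < words.length - order; i++) {
        var key = words.slice(i, i + order).join(' ');
        (chain[key] = chain[key] || []).push(words[i + order]);
      }
      return chain;
    }

    // Start somewhere random and keep picking a random recorded
    // follower for the current run of words.
    function remix(chain, order, length) {
      var keys = Object.keys(chain);
      var key = keys[Math.floor(Math.random() * keys.length)];
      var output = key.split(' ');
      for (var i = 0; i < length; i++) {
        var followers = chain[key];
        if (!followers) break; // dead end at the very end of the text
        output.push(followers[Math.floor(Math.random() * followers.length)]);
        key = output.slice(-order).join(' ');
      }
      return output.join(' ');
    }

    // e.g. remix(buildChain(bookText, 2), 2, 60)
    // (bookText being the whole novel as one string)

An order of 2 (pairs of words) tends to be the sweet spot; order 1 gives you word salad, while higher orders start quoting the book back at you verbatim.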

And then I spotted TextTeaser, a “special algorithm formulated through research to summarize articles”, or in other words a detector of, hopefully, the most important, interesting and salient points in a block of text.

I don’t know how the algorithm works, but we can assume it parses the text several times looking at repeated phrases, weighting certain words more than others, picking out quotes from people, locations and so on. Once again the general assumption is that you’re going to be putting a coherent chunk of text in front of it in the first place.
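
If I had to guess at a first approximation in code, it might look something like the crude frequency-based scorer below. To be clear, this is my guesswork and almost certainly far simpler than whatever TextTeaser actually does; it just shows the general “weight words, rank sentences” idea.

    // Pure guesswork at the approach: weight each word by how often it
    // appears, score each sentence by the average weight of its words,
    // keep the top few.
    function summarise(text, topN) {
      var sentences = text.match(/[^.!?]+[.!?]+/g) || [text];
      var freq = {};
      text.toLowerCase().split(/\W+/).forEach(function (word) {
        if (word.length > 3) freq[word] = (freq[word] || 0) + 1; // crude stopword dodge
      });
      var scored = sentences.map(function (sentence) {
        var words = sentence.toLowerCase().split(/\W+/);
        var total = words.reduce(function (sum, word) {
          return sum + (freq[word] || 0);
        }, 0);
        return { sentence: sentence.trim(), score: total / words.length };
      });
      scored.sort(function (a, b) { return b.score - a.score; });
      return scored.slice(0, topN).map(function (s) { return s.sentence; });
    }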

But that’s not what we’re going to do. Oh, and TextTeaser has an API too.

So on the one side we have code that generates semi-random text, while on the other we have code that’s trying to pick out the most important sentences. You can see where this is going.
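
Glue the two sketches above together (the real version goes through the TextTeaser API rather than my toy scorer, but the loop is the same) and you get something like this:

    // Remix the corpus a couple of hundred times, summarise each remix,
    // and keep whatever floats to the top. channelSk1nText stands in
    // for the book's text, loaded elsewhere; buildChain, remix and
    // summarise are the sketches above.
    var chain = buildChain(channelSk1nText, 2);
    var nuggets = [];
    for (var run = 0; run < 200; run++) {
      var gibberish = remix(chain, 2, 200);
      nuggets = nuggets.concat(summarise(gibberish, 1));
    }
    nuggets.forEach(function (nugget) { console.log(nugget); });

A few of the nuggets it surfaced…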

  • All Nola could do now was carry on, to stay in pursuit, no matter how she loved their heads and to break you down with dance.

  • His mouth stuck in the street, demanding that people do the same imagery, the eye with sunpowder transfiguration.

  • 'And how often do these managers know, are they contagious?' Nola thanked him and moved deeper into the human spectrum.

  • Her own throat closing as she plucked and sampled from all that she had, that straight-line grimace.

  • 'It's 1998, of course.' 'But why are you saying that the answer?' 'Poor Alice! Wrong Alice!' squawked Whippoorwill.

  • 'Try making your mind up time!' Alice opened this cupboard; a flintlock pistol was lodged within Celia's inner workings.

  • Channel Skin…Channel Skin… Channel Skin… ~~~ Onwards.

  • The word polygon only reminded Alice of the night, the face reflected in the man’s arms and dreamed.

  • She tried to strangle a boiled ham sandwich (with not a bit of fire crackling, electricity burning through her wires).

  • A news channel psychologist thought it very unusual that a third boa snake crept into Noah’s cargo? ‘That extra Sherpent eshcaped from the inside?’ ‘There is indeed…’ responded Ramshackle, reaching upwards towards her death.