Dec 4, 2012

My atoms love me

Do you understand German? Be happy if not. If you do, you might be tempted to read this interview with the philosopher Patrick Spät. This young chap bashes 'physicalism' so overconfidently that he doesn't stop himself from saying things like "already atoms have fundamental mental properties" (ah, I see, it is in the word 'fundamental').
I once met an esoterically enlightened person who made quite a shipload of money claiming that photons obviously possess free will, as they can willingly decide whether to be a particle or a wave.
Spät might like that idea. Remember, his atoms have mental properties. Mentality, he says, is a natural property just like charge, mass or spin. Sure. Go measure it. Those properties then add up, and the more atoms you have, the more mental you get. Great! What a huge mind my favorite skyscraper represents! And I always knew that my short cousin couldn't be as smart a smart-S as I am - simply because she lacks the number of mentality-building atoms.
Spät then laments that physicists see only mathematically describable cold matter without being able to say what matter really *is*. He has no problem whatsoever with the fact that he himself has no clue what he means by 'mentality' - he just excuses himself: 'we can't find out because, unfortunately, we can't talk to atoms'. Yep.
It is nice bar chatter. But nobody can force me to get drunk every time before I have to read stuff like that!

Nov 30, 2012

Science-communication - the role of the native speaker

You rarely find high-tech research institutes in the cultural center of a city. You should rather look for them somewhere near a freeway to the airport. The area is then labelled 'tech-campus', 'innovation-park' or the like to help ease the despair of those working in the wastelands. Language studies, the arts and history, on the other hand, will be expected to reside in those awe-inspiring old buildings in the touristy areas of town. Some, like the German author Dietrich Schwanitz, are quite clear about the reason: the sciences, he writes, are no good for party conversation, so they might be useful but they certainly don't belong to the common learning - and, by implication, do not belong to our culture.
Well, Dietrich, no.
Popular access to a field and its use for party chatter can't be a measure of cultural value. If we take, for example, the wide spectrum of music - from the most emotionally accessible chirp to the intellectually laden and rather closed theoretical musical constructions - would you want to kick out John Cage's 'Bird cage' for 12 tapes? Simply speaking, the range from kitsch to 'serious art' can be classified by the relation between rational access and emotional access. The more rational, intellectually demanding access a work allows, the more 'serious' the opus; the more emotional and the less rational, the more 'kitschy'. Complete kitsch would essentially have only the emotional, vegetative component. A work of classical music would be emotionally accessible but with solid rational (theoretical) components as an add-on, which is probably appreciated only by a few but certainly adds to the pleasure.
Now physics.
Emotional access appears almost impossible, as the language (math) is not widely understood. Physicists do, however, experience considerable emotional highs and lows in their work - they have both access routes (vegetative and rational).
One aspect of science communication is trying to convey the emotionality of a complex topic. When done wrong, science journalism substitutes some compensatory hype for the real thing. A complex story on experimental quantum physics almost invariably ends with a line on 'future computers', time travel or the like, under the pretext of a public demanding to know the 'what for' - however far-fetched it might be.
This is wrong.
Communicating science is a translation process. And just as in the translation of literature, the translator has to be a native speaker of the target language. Translating Borges into German has to be done by a native German speaker. Translating physics has to be done by a native-speaking non-scientist.
Get me right: the translator of course has to be well versed in the culture and language of the original work - but she has to know her mother tongue very well.
While some brazenly demand that 'every scientist should be blogging', the real, sensible translations can only be done by the well-educated native 'non-science' speaker.
This translator will be able to grill the scientist on all technical details and figure out the rational but, most importantly, also the emotional access to the story - and then transfer that to the public. With losses, of course, but without the substitute shebang - truer to the real thing.

Nov 10, 2012

Now that we're famous...

It was to be expected. Last month the click rate on smarts increased dramatically, almost exponentially. Were we to extrapolate, every person in the world able to use her mouse-finger (formerly called index finger) would be clicking our site by around March 2013. We are famous. And fame obviously spells influence in the digital world (naturally: clicks-fame-influence).
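The extrapolation is easy to mock up - with invented figures, of course, purely for illustration:

```python
import math

# Invented figures, purely for illustration: say the click rate jumped
# from 1,000 in September to 30,000 in October - a 30x monthly growth.
clicks_sep, clicks_oct = 1_000, 30_000
growth = clicks_oct / clicks_sep               # monthly growth factor
world_population = 7_000_000_000               # mouse-finger owners, roughly

# Months after October until every person on earth is clicking:
months = math.ceil(math.log(world_population / clicks_oct) / math.log(growth))
print(months)  # -> 4, i.e. around February/March 2013
```

Which is, of course, exactly how seriously such extrapolations should be taken.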
I am excited to see that 90% of our visitors last week came from Ukraine - we should definitely think about adapting to this demographic development by changing to Russian: nastrovje (oh, how predictable. Yes, sorry).
Let me give you a glimpse of our main visitors: drocherof, bibikablog, fermersovet, haliava, infoscript, kinorubej, kinorubrika, lovejewel... I would never have guessed that those were interested in cross-cultural debate! Especially since they sign up as being robots.
Well, the world is changing.
We shall auto-generate our posts. Then this could be a feedback loop of writers and readers, the language could change to Assembler code, and nobody would really have to bother.
We are working on it.
But first I have to find a way to kick those robots' butts! §$%&!

Sep 21, 2012


I am not the person who gets to go places. Usually I am sitting in a damp office somewhere one or two floors below the basement of an unbelievably ugly office building. So I don't have to think about whom I could ask to water my plants. Which is good, because I have neither plants to water nor friends to ask.
But you might.
And you certainly solved that problem. But you know what? Your plant also needs light - sunlight if possible. Will you ask your neighbours to move the hibiscus around your apartment while the sunlight whooshes through? No, you say, watering will have to do.
But there are people thinking seriously about that problem - and by thinking hard they solved it.
I bumped into those guys when I rediscovered the treehuggers.
I had almost forgotten them. Treehuggers, you say? I know. Me too. BUT. There is this one website that I once ran into when I read about the carnivorous robots that get their energy from digesting anything from fruit flies to your favourite puppy. They were covering that back then - who else did?! And today I went back to their site and, bingo!, they are describing a Robotic Plant Drone that moves your houseplants to the sunny spots for you.
Exactly the kind of technology that can be described as disruptive rather than iterative. The type of daring science we need to push the frontiers of knowledge.
I just wanted to let you know.
In case you have plants, and vacation, and - no friends.

Sep 20, 2012

One world is enough!

A friend of mine dated a girl who was an identical twin. She and her sister suffered from multiple personality disorder. He finally left them - all seven.
And led a happy life with the remaining four. They married when she found a doctor who freed her from her demons. Some of them. And he died. Widowing two. They tried to console each other and never married again.

Sep 19, 2012

Risk-aversion kills innovation

The number of publications and citations, possibly rescaled into more complex measures like the Hirsch-index or fashionable derivatives thereof, are widely accepted parameters to quantify scientific quality. In times of scarce financial resources, it is argued, transparency is imperative for allocating funds, and substantial investments in science are best legitimized by 'excellent and useful' research results.
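For reference, the Hirsch-index is straightforward to compute: a researcher has index h if h of her papers have been cited at least h times each. A minimal sketch, with invented citation counts:

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank      # still h papers with >= h citations
        else:
            break
    return h

# Invented citation counts for illustration:
print(h_index([42, 17, 9, 5, 3, 1]))  # -> 4
```

The simplicity of the metric is part of its appeal - and of the problem: a single number can hardly capture 'scientific quality'.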
This is led by the perception that scientific quality can somehow be objectively measured and that the whole process of 'doing science' can ultimately be subjected to some sort of controlling. While the drive for excellence and usefulness is agreed upon, their definition and measurability are at the center of many a heated debate.
At first sight, benchmarking usefulness translates into a short time-to-market of research results, general application-orientation and product-driven applied research (a term coined by the German philosopher Juergen Mittelstrass). This rather economic understanding of scientific value is bemoaned in a desperate note by Abraham Flexner: "We hear it said with tiresome iteration that ours is a materialistic age, the main concern of which should be the wider distribution of material goods and worldly opportunities" - and that was 1939 ("The Usefulness of Useless Knowledge").
If usefulness equals monetary return, it is worthwhile looking at the most fundamental and academic research endeavours of the highest quality. Scanning the Nobel prizes in physics of the last century turns up a majority of science that was predominantly curiosity-driven and of purely academic interest at the time it was undertaken. Today, however, the market value of x-rays, radioactivity, electron rays, x-ray diffraction, nuclear fission, and of course semiconductors cannot be overestimated. The annual return on investment of the semiconductor industry - including all spin-off markets - is gauged at around 10% of the world's GDP. Every one of these fundamental discoveries opened markets worth billions and billions of dollars, dwarfing the return on investment of the ubiquitous 'mp3 code' that is quoted as one of the more successful patents from applied research in Germany.
Product-driven, application-oriented research ultimately encourages iterative optimizations well within the borders of the known. Fundamental research, on the other hand, has the potential for real disruption and a leap in technology - the basis for innovation. Only together do they make technological advance achievable.
As obvious as this might be, research funding is focusing on the plannable and foreseeable - most easily spotted in applied research. The common research project demands milestones, intermediate reports and justifications if goals are not reached - driving grant applications into the mainstream. If the results are predictable, if the milestones are reachable, if the project is rather risk-free, an application looks well placed to clear the hurdles of scientific refereeing and the funding agency's grant officers.
But isn't the unpredictable and rather frightening wilderness of the unknown where innovation lingers?

Aug 15, 2012

Altruistic egoism

(It's Ferragosto. Everyone is at the beaches. The cities are deserted. The espresso machine is operated by some students from Australia. It is too hot to think, let alone write. Period.)
What do you think when you see a twitter profile like "Researcher, therapist, artist, writer..." - decorated with a lascivious-looking long-haired chick? Well: you believe that because you doodled some almond-eyed fairies on a piece of paper you are an artist? Your "dear diary..." makes you a writer? Endless chatter with your girlfriends about their messed-up relationships made you a therapist? And clicking through wikipedia warrants the title 'researcher'?
The net is full of those characters.
People seem to want to label themselves. We all want to stand for something. We want to brand ourselves: "Researcher, therapist, artist, writer...". But even if it looks like it: life, even life on twitter, is no computer game. No matter how much energy is put into self-branding, it has to be backed up by hard facts at some point. On the net you can survive a bit longer with the usual smoke screens. There are simply some billion people around. A few hundred or thousand will always follow you. It is the quantum noise of the social fabric.
A wonderful example of the opposite approach is James Altucher. He strips himself of any labels that might be interesting to the outside world, simply produces a deluge of rather egoistic thoughts - and leaves the branding to others. I have my difficulties with his blog - I actually started reading it instead of drinking 12 espressi, just to get my pulse up and flood the system with adrenalin.
That was when I had him filed in the advice-and-self-help drawer, which I, frankly, can't stand. At all. No matter who writes it. But I learned to read Altucher as an individual, ignoring what I'd rather not read and enjoying his biting and hilarious style. I understood: he is practicing altruistic egoism.
And just as I type these lines, James comes up with an article on 'un-labeling', in his very James-Altucherian style. Enjoy the last lines there: "what is left? You. You're left. I'm right"

Jul 13, 2012

No theory - no money!

A neuroscientist I was talking to recently complained that the Higgs research, even the neutrino fluke at CERN, is getting humongous funding while neuroscience is struggling for support at a much more modest level. This despite the undisputed fact that understanding our brain, and ultimately ourselves, is the most exciting challenge around.
Henry Markram of EPFL in Switzerland is one of the guys aiming for big, big funding to simulate the complete brain. After founding the brain institute and developing methods to analyze and then reconstruct elements of the brain in a supercomputer, he is now applying for 1.5 billion euros in EU funding for the 'flagship projects' of Blue Brain - and many believe his project is simply too big to fail. Some call the project daring, others audacious. It is one of the very few really expensive life-science endeavours. Why aren't there more like it around? Why do we seem to accept the bills for monstrous physics experiments more easily?
Is what Markram and friends are doing so different from the ventures of high energy physics where billions are spent to let particles collide and smash them to smithereens? Are those scientists all just crazy?
Maybe they are.
But there is one difference between accelerator experiments and projects like Blue Brain:
The particle folks have a theory. They have a pretty good working theory that has helped explain a vast number of observations - and, more important than that, the theory allows for verifiable (or falsifiable) predictions. One of those predictions is the existence of a Higgs boson. Whatever that particle is or means to physicists, it is a predicted particle with properties that can in principle be measured. It is possible to firmly argue for funding.
I fail to see such a theory in 'consciousness' research or in brain modeling. Does anybody have a theory about what consciousness is? Not just a hypothesis, not a vision - I mean a theory. If there were a theory, there would be the possibility to prove or disprove it.
Up to now *all* experiments on consciousness or the self depend on a human being *reporting* some internal states of herself. Today you could not experimentally verify whether your coworker, your cat or your computer has an individual self - unless they communicate with you about it.
Bluntly, that is not science.

Jun 11, 2012

How to kill innovation

In a talk on innovation at the Berlin-Brandenburgische Akademie der Wissenschaften, Juergen Mittelstrass, one of Germany's great living philosophers, added one more slot to the well-known classification of science, leaving us with 'fundamental research', 'application-oriented research' and 'product-driven application-oriented research'. While he was obviously trying to grant some innovative potential to research formerly known as 'applied', it soon became clear that innovation would be found in the first rather than the last box, and that the terrain for the unexpected was shrinking: innovation by nature is nothing you can plan for, and the application-pull leads to optimizations, inventions and solutions - but not to surprises.
His critique of today's ever-growing emphasis on applicability in research funding was massively amplified by Harald zur Hausen (Nobel prize in medicine, 2008), who reminded everybody that his ground-breaking results would never have been possible had he been forced to do the agonizingly predictable research that is on every politician's agenda today. Under the guise of the responsibility to 'give something back to the taxpayer', funding schemes become more and more excel-sheet-driven. Demanding fine-grained 'milestones' with clear 'expectables' and costs over the whole funding period (of typically three years) already at the time of grant application results in scared 'chicken science' without much space for the unexpected that is one hallmark of innovation. This 'accountability' (measured with static parameters such as the number of publications, patents, invited talks, ...), together with the need to get results that are common sense in a tightly knit scientific community, is killing creativity, innovation and ultimately the return on investment of taxpayers' money - eliminating the central argument for the bureaucratization of science by science accountants.

May 2, 2012

Consciousness has left the building

The neat thing with consciousness is: it is so undefined that everybody can speculate wildly about it. You could locate your conscious self in the paw of your dog, your aquarium, your pinky... anywhere - and write books about it, sell books about it - thousands! It is just so heartwarming to chat about consciousness, to bash science on the way and to patronize.
And who does it best? Right, the aggregators at BigThink.
Megan Erickson assures us (quoting Alva Noë) that "just as love does not live inside the heart, consciousness is not contained in a finite space". We should not look for it inside our brain, or even our body - but in some intricate interwovenness of our cells and the outer world. What is the proof? None. Just pure sci-fi, touchy-feely chatter. Nice, and maybe right or maybe wrong...
Do you remember the first step in explaining, proving or disproving something? Yep: have a hypothesis. Write something down. And then write how you (or anybody else) would *in principle* go ahead investigating experimentally. It is not about designing a real, feasible experiment. It is about devising a principal approach.
Look, do you have any idea how to *prove* whether your coworker is conscious? We assume that she most probably is - by analogy. But is this proof? Or your dog? He looks cute to some - is he conscious?
The debates about consciousness and free will are the big debates over the centuries. To claim that anybody even has a clue where to look is too fast a conclusion when not even the definition is clear. We could, however, check the big-think idea in reverse: if consciousness is not located in our brain but is rather the consequence of interaction between our cells and the surroundings, then a strict modification of our surroundings should modify our consciousness significantly. Is our conscious self different when we are sitting in a cafe at a plaza in a nice city compared to lying on our bed in a completely dark, small, sensory-deprived room? Well, not really. On the other hand, if we significantly modify our grey matter (by pouring in alcohol, deep brain stimulation, or even, well (don't do this at home), removing it...), I believe our consciousness is markedly altered - putting some weight on the importance of 'brain' for the existence of consciousness.
Megan Erickson concludes: "It's okay to speculate, Noë seems to be saying, even if you're not a genius." Well, this, clearly, is an approach some over there value highly. But why claim, then, that this is science?

Apr 22, 2012

Lies es nicht! A reminder of consumer-power

April 25 is the release date for a short little electro-booklet that urges us to dry out the trash by simply not supporting it. The idea is simple: consumers have the power to get what they want and to get rid of what they don't want: don't read it! If we don't read the awfully badly written advice literature, it will not be printed. If we don't click cyber-trash, it will diminish. All that stuff works because people pay - directly, or indirectly by driving ad prices through click rates. Don't buy what you don't want. The German author Sina Hawk reminds us of that simple power in her booklet "Lies es nicht!" ("Don't read it!") - it will be available at Amazon for about 1.25 euros.
One kilogram of ground meat, mixed, for 4 euros. Cheap. Dirt cheap. At that price you have to forget about quality - and you don't want to picture how this meat is 'produced', how the animal grew up. A shrill-pink doll's pram with glitter and a sugar-sweet plastic baby for 12.95 - the raw materials alone cost barely less. Taste and quality at that price? Under what conditions can that be manufactured? By children, for children. Don't buy it! The ozone hole - industry has to do something! My government's war policy - the politicians are to blame! Overproduction for my supermarket - the bosses are too greedy! Stock-market speculation that ruins whole countries - lock up the fat cats! Bloodthirsty news, papers that make money on perversion in the news. Vile, money-grubbing producers. It is so convenient when the others, the bigger ones, the unreachable ones are the guilty party. You can denounce the evil in a small, martyr-like way, like all those insufferable conspiracy theorists and sects, and remain so comfortably helpless and desperate.
But things start to move the moment YOU do something - or, in this case, don't do it! Don't buy it! Don't vote for them! Don't give this bank your money! Don't eat that! And simply start with:
"Lies es nicht!" - "Don't read it!" - as the author Sina Hawk demands, clearly and simply, in her booklet, which appears on Amazon on April 25. "Lies es nicht!" calls on us simply to boycott the trash, served up in print or electronically, that we supposedly must read and then complain about.
Her lines are a call to action, a pamphlet. Who still speaks of the 'information superhighway', as the internet was called in the nineties, looking at the flood of nonsense, garbage, redundancy and plain repulsiveness it carries today? The barroom platitudes were always the same, but the reach used to be smaller. The dullness got stuck in the swamp of the regulars' table and, with the collective blackout, was forgotten by the next morning anyway. Persistent owners of steaming brains made it as far as the letters-to-the-editor pages of the daily papers - but there, what reached the reader was edited and filtered. Today, in principle, everyone has access to the widest-reaching and nearly unfiltered communication platform imaginable - the web. You can get everything there. Everything. But you don't have to.
Because the currency of the web is clicks. The click rate matters to the ego, and it determines how expensively advertising can be placed. The click rate counts. It brings euros or vanity. Do I really have to click on an execution video? No! Do I have to follow a link, comment on nonsense, even take notice of junk? No! Don't read it!
We have the power, Sina Hawk reminds us. Don't buy a newspaper, a self-help guide or a novel beneath your level. Don't buy it, don't finance them. Read this as a quick wake-up call: "Lies es nicht!", Sina Hawk, Amazon - about 1.25 euros. From April 25.

Apr 18, 2012

Invention, innovation and carrier pigeons

We live with the bold categorization of research as being either 'fundamental' or 'applied'. The emphasis in funding - and broadly in the public understanding - is on the supposedly more valuable *applied research*.
Scientists engaged in fundamental research, on the other hand, are widely seen as geeks, as nerds in the ivory towers of academia, kind of wasting taxpayers' money on their personal entertainment, dabbling with expensive machines, finding ultrafast neutrinos and dismissing them again...
At the same time innovation is imperative. So innovate we do. All the time.
But seriously, what kind of innovation can we expect when we are asked to do research on optimizing the rubber of a tire, or are coerced to develop a better mp3? What can we expect if somebody pays for our research to make cars more fuel-efficient? Certainly there would be some neat progress. Some nifty inventions. But innovation?
Let's look back. What would we have gotten if, 30 years ago, we had researched how to make light bulbs smaller and less energy-hungry? We might have gotten smaller light bulbs, with some trickery inside. Would we have light-emitting diodes today? Probably not.
Anybody remember those tubes in old receivers? What would doing applied research on them have led to? To the invention of the transistor?
Would we have discovered and understood x-rays by looking for methods to study bones in a living organism in the early 1900s? Or take electromagnetic waves. Are you sure we would have encountered them during a quest to invent some type of long-distance communication 100 years ago? I bet carrier pigeons would have won the race.
To put it the other way around: which fundamental discovery concerning the laws of nature did *not* result in a multiple-gazillion-dollar market? (Yes, there are a few - but you get the point.) The danger of doing research in a system that demands predictability of results, as well as clear, controllable project plans with well-defined milestones and a written concept for IP and technology transfer at the end, is that you mostly get exactly that: predictable results.
Don't get me wrong: those results can be great, helpful, important. But the request to do only the predictable will yield the predictable - at best. Stubborn application-driven research yields incremental inventions and a predictable but very limited return on investment.
The really disruptive stuff cannot be planned - by definition. If you aim for real invention it is necessary to look for some deep understanding of nature - and then get somebody inspired to create an application from it. Just as industry is being told to be 'less nice' if it wishes to be innovative, science should withstand the pressure to serve as an incremental problem-solving machine.
The return on investment will be much bigger.

Mar 26, 2012

Information obesity? Don't swallow it!

Great - now they call it 'information obesity'! If you can name it, you know it. My favourite source of intellectual shallowness, BigThink, again wraps a whiff of nothing into a lengthy video message. As if watching a person read a text that barely covers up its own emptiness makes it more valuable. More expensive to produce, sure. But valuable?
It is OK that Clay Johnson does everything to sell his book. But (why) is it necessary to waste so many words, spoken or written, to debate a perceived information overflow? Is it fighting fire with fire? It is cute to pack the problem of distractions into the metaphor of 'obesity', 'diet' and so on. But the solution is the same: at the core of every diet you have 'burn more than you eat'.
If you cross a street, you don't read every licence plate, you don't talk to everybody you encounter, you don't count the number of windows in the houses across the street, you don't interpret the sounds and noises... No. Guess what: you focus.
There it is: the stuff on the internet is not information - it is data! And you just go ahead and focus. It is as easy as that.

Mar 13, 2012

Popularize science? - Dare to write what you know!

"Play what you know!" An actress who has to play a raging serial killer need not be a serial killer herself (it might even be counterproductive in some ways... maybe a bit messy on the set). Method acting tells her to get the emotional framework as close as possible to the feelings of a serial killer - by re-enacting emotions she relates to (remembering the guy her first boyfriend left her for would be an example).
"Write what you know!" is a piece of advice for authors that is all too often misunderstood, as Jason Gots, associate editor of BigThink, points out. Authors must not restrict their prose to retelling their own (very possibly boring) lives - they should map their real-life emotional experiences onto the world of their fictional characters. Never been to Mars? Well, you probably visited some decaying neighbourhood in Detroit or Bilbao. I am sure you found some Martians there. You know how your fictional character feels as he leaves the mothership. The same is true for science communication.
Science communication is not among the greatest strengths of the average scientist. Some do it brilliantly, some barely speak even in 'real life', some try but shouldn't. It is wrong to demand that 'every academic should be blogging'.
Science communication is important. The transfer of knowledge is imperative. The lack of good, adequate science communication is one of the reasons why science has a comparatively weak standing in everyday culture - despite its omnipresence. It is 'chic' to state with a smirk 'I never understood physics - and always flunked math tests at school'. In contrast, it would be cultural suicide to describe a classical concert as 'loud, but I really don't care what that fiddling is all about'.
Without any research to back this up, I would presume that the percentage of people who understand Ohm's law (of resistance) and the percentage of those who understand a musical composition are equally low.
Conventional 'high culture' gives access through emotions, feelings and opinions (making it possible for a wide public to chatter about it) - while it is rather unconventional to have an opinion on Ohm's law. That access to the arts, however, is equally incomplete and inadequate in the end. Both science and the arts have an additional level of complexity that is not accessible to the layperson. Music is done no justice if it is reduced to mere sentiment. (You might enjoy the German text "Wissenschaft ist keine Kunst".)
Real science cannot be slapped onto the untrained public. It needs real science communicators to translate it in a useful way. Science communication consequently has to be done by well-trained science communicators, while scientists have to get training to 'talk to' those communicators. The journalists in turn will be fit to ask the right questions and then use their 'method acting' to project emotions the audience can relate to onto the science they choose to communicate.
Science communication goes wrong if a substitute emotion is communicated instead. Just as it would be wrong for that method actress to impersonate her ex-boyfriend instead of transforming her rediscovered feelings for the role of the serial killer, it is wrong to write about 'time travel and warp drive' when the issue was scientific work on problems with the interpretation of relativistic effects in a neutrino experiment (... is anybody still following me?).
"Write what you know", in this context, means for scientists: when they communicate science, they should attempt to transfer their very own emotional experiences to the world of the reader. Some are really good at it! "Write what you know" tells the science journalist to adapt his storytelling power to the scientific story. As a method actor would.

Mar 8, 2012

The scent of money - the scent of sulfur - the value of art

I just returned from a brief chat with a friend. Clemens is an artist of whom you will hear a lot by the end of this year. I was sipping a beer in his crammed East Berlin soviet-era mini-flat, 'inhaling' as much of his wonderful paintings as possible, the intensity of his life beaming off every square inch of color- and text-plastered canvas. He shares his last bottle of beer with me - because he wants to celebrate the occasion: his art has just attracted the serious attention of a very, very important public figure, who already decorates his office with one of Clemens' paintings (smack between a work by Immendorf and one by Lüpertz). That guy has a plan for a major coup involving Clemens' art - and it will benefit both.
Up to now Clemens has lived off collected bottles, some paintings he sells at insanely low prices, and petty crime.
Now he is about to jump into the major league.
His paintings have the expressive power that makes collectors nervous and renders some pieces of modern art so invaluable. He just can't not paint - even if he has to steal the paint and snatch the brushes. He is driven. He often has to choose between eating and painting. And painting almost always wins. But now he gets the backing of a major player in the markets. And this will change the price-tag on his work. It would have changed the price-tag on any mediocre scribble by many a non-talented would-be artist. And this is the scary part: one shark decides to promote an artist, and the ailing artist survives. If the shark decides to drop him, he is gone again.
As I leave, he grabs a plastic-bag and stuffs three oil-paintings inside - a present for me: "you will love those! You shall have them!" He never had a sense for commerce. I hope he survives the devilish love of his new-found promoter.
And long after I left I still smell the scent of sulfur that seemed to be lingering in the flat.

Feb 27, 2012

Academics should be blogging? No.

"blogging is quite simply, one of the most important things that an academic should be doing right now", the London School of Economics and Political Science states in one of their, yes, blogs.
It is wrong.
The arguments just seem so right: "faster communication of scientific results", "rapid interaction with colleagues" "responsibility to give back results to the public".
All nice, all cuddly and warm, all good.
But wrong.
It might be true for scientoid babble. But this is not how science works.  Scientists usually follow scientific methods to obtain results. They devise, for example, experiments to measure a quantity while keeping the boundary-conditions in a defined range. They do discuss their aims, problems, techniques, preliminary results with colleagues - they talk about deviations and errors, successes and failures. But they don't do that wikipedia-style by asking anybody for an opinion. Scientific discussion needs a set of non-negotiable attributes: there has to be competence, trust and reputation. That's one reason why scientists meet at conferences. You need to know whom you are talking to. You have to be sure you can talk freely about difficulties without spilling them out into the public. You need to know the competencies of your colleague - her strengths and weaknesses.
Science is not just opening the doors for input and flooding the public with data. Science simply does not work that way.
Patrick Dunleavy and Chris Gilson complain in that blog that scientific discussions happen "in language where you need to look up every second word in Wikipedia". Dear! You cannot look up scientific language in Wikipedia. Look, the language of physics is in large parts: mathematics. And that is not meant to exclude Justin Bieber fans from tweeting about it - science uses language-extensions to express things that cannot be talked about in everyday language.

"Twitter is a huge supplementary help, in forcing academics to communicate key messages in 140 characters!" Between scientists? What a terrible idea! This does nothing more than reduce the complicated and complex findings of research to flashy, glittery nonsense. We see that all the time: a group of physicists does experiments with single atoms close to absolute zero to check on a principal law of physics. What does the public-relations guy demand? At the end there has to be a sentence about the relevance of that research for the computers of tomorrow. The relevance is zero.
The public is not stupid. It is simply not true that they want to have *anything* back - as long as it is entertaining. Look at the Nobel Prizes for physics of the last 100 years. All that basic research stuff on x-rays, electromagnetic waves, semiconductors.... Nobody would have tried to dumb that down to some catchy 140-character nonsense that 'the person on the street' understands or feels good about. It took decades - many decades - to become relevant to the public. But then it had the most impressive impact. What multi-billion dollar markets all that became! Again: x-rays, electromagnetic waves, semiconductors.
Ok, the authors of that 'five minutes with...' blogpost speak for social sciences. But I am certain social scientists disagree as well.
It is good policy, of course, to think about the dissemination of scientific results to the broader public - and any medium is good for that. But there certainly is no need for extreme speed or extreme brevity. The findings have to be translated for non-scientists, sure. But we are all aware that the amount of truth and relevance that gets lost in that process has to be kept to a minimum.
Science-communication has to be done by professional science-communicators. They live at the interface between public and science. They are translators. In no way is blogging "one of the most important things that an academic should be doing right now".

Feb 24, 2012

Understanding is an evolutionary advantage

As early as the 1930s it was observed that an injured fish can trigger a fright-reaction in the members of its school.
Nobody really understood why or how this was communicated, but it was speculated that some substance must be released that instills fear in others. That substance was aptly called "Schreckstoff" (German for 'scare-stuff'). And, indeed, injecting skin-samples of an injured fish (well, how could he then *not* be injured?) into the water scared the §$%* out of the otherwise relaxed co-fishes.
Until now, the chemistry behind that reaction has remained unclear.
Suresh Jesuthasan of the National University of Singapore and coworkers have isolated one component (the glycosaminoglycan (GAG) chondroitin) from Zebrafish that turns out to be an important messenger. While the evolutionary advantage of Schreckstoff for the survival of the school is obvious (run!), the fate of the injured fish is sealed when it is left behind - showing that evolution developed the use of this warning-signal on the receiver-side, while the agent is possibly released involuntarily and aimlessly by the prey. This is consistent with observations that predators are attracted by the same substance that triggers fear in others.
In the meantime, proponents of a commercial view on science are already discussing how this substance could be used to keep unwanted fish away from the farm or to attract others - but it is quite fortunate that you just asked what all that has to do with Brad and Angelina. Well, go figure it out!

(for the faint-hearted readers: Suresh Jesuthasan also researched what mechanisms are involved when that scared bunch of Zebrafish calms down again - envisioning new medication for fear-reduction. Anyone speculating that the military might have a benevolent eye on that type of research?)

Feb 14, 2012

If Brad Pitt is a Zebrafish then Angelina Jolie is not

Two Zebrafish on a date.
Photo from IGB, Eva-Maria Cyr
You are probably not among those who subscribe to the newsletter of the "Leibniz-Institute for Freshwater Ecology and Inland Fisheries" (IGB) - but you probably should. Their recent press-release (in german) is a real eye-opener - it has potential to completely change my bar-life.
Scientists at IGB devised an experiment code-named 'wedding-planner' in which they check which male Zebrafish gets lucky on a date.
The result is nothing short of stunning.
If the girl-fish gets to choose between a number of differently attractive guys, she does not go for the most attractive stud but for the second-best looking. Reproducibly.
The reason is, they found in a monogamous setup, that the super-guy tends to bully the female zebralette into submission, which kind of spoils the party. Quite reminiscent of what we observe amongst humanoids. Too bad that neither Zebrafish nor those brawny bar-peacocks have enough brains to read and understand the study.
(Since Brangelina are here in Berlin right now during Berlinale I might get a chance to check for fishlike features of that supernatural couple again - maybe it is not Brad, who is the fish.)

Feb 9, 2012

The software that will earn you admiration, gratitude, and real money

Electronic media once seemed destined to reduce the use of paper. This environmentally attractive idea is proven wrong daily in any office or home trash-bin. It simply doesn't work. We seem to need the paper. This impression of a paper-craving is backed up by studies that show an ever-increasing paper-production (and, accordingly, -consumption).
We are doing ok reading a book on kindle, pad or laptop. It is fine for the subway ride to work. It works for leisurely reading a finished text. But if you have to thumb through a financial report of your institution or the first draft of a thesis of one of your students, you want to have it printed out. On paper. You rummage for a pencil.
We need the haptics of paper, we want to spread the sheets all over the table, jump from page to page, change the order, scribble,…
Why can't we stop printing what we see on our screens? Correcting, annotating, highlighting seems just more natural when done with pencil on a sheet. The ever-increasing number of document-modification software that allows for quite nifty scribbles and conversions on your screen-displayed documents is nice but it has not significantly reduced the urge to print.
Ok, so we print.
Once printed we scribble on our documents and then what? The stuff has to get back into our files. Some hand it to the secretary to deal with it, others scan, convert, retype… this is not, well, ergonomic - to say the least (it is outright annoying).
One application will pop up soon - if anybody just codes it. I still wonder why it is not on the market yet. People would kill to get it!
As I don't find time or skill to code it myself I just give the idea away and hope for the best (if one of you wants to write that software, and is able to do it, contact me - we will work something out along the line of: just mention me favorably and give me a free copy, willya?).
You've heard of the input-device that traces what you write with a propped-up pencil on conventional paper and allows you to import it into standard graphics-software (Wacom is promising to have the so-called Inkling out in March)? It uses some infrared and ultrasound sensor that you clip to your paper-notepad. So when you write on that paper you end up with two versions of your scribbles: one conventional (ink on paper) and another, well, also conventional (bitmap, vector-graphic, jpg,… in your computer).
Why not combine that with your document-reader?
You read a manuscript (a pdf, maybe) on a screen-device. Then you decide to really work on it and print it out, because it just feels more natural to work on a paper-version. To that print-copy you clip the scribble-sensor, do some clicks to have orientation and size of the document right and then you start making your comments and alterations with real ink on real paper. What you do there, however, is recorded and transferred to the original file, creating a modified electronic copy. Either as hand-written annotation overlay to your pdf - or converted to text by some text-recognition software.
By moving to the archaic work style you retain all your creative energy from working on real paper and with the scribble-interface you skip the burden of transferring your work back to the electronic document. And your retro-boss, who still only reads emails when they are printed out, wouldn't even have to know that she is modifying the original word-file while she fiddles with the pen on paper. Thinking of it, she may even send the handwritten letter as email by ticking a box on her paper-form...
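If someone does want to code it: the data-plumbing at the core of the idea is tiny. Here is a minimal sketch in Python - every name and number in it is mine, invented for illustration, and no real sensor API is involved. Recorded pen-strokes get calibrated into page coordinates (the "some clicks to have orientation and size right" step) and serialized as an annotation overlay for the original file.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class Stroke:
    page: int     # page of the printed document the scribble landed on
    points: list  # (x, y) pen positions in page coordinates (e.g. mm)

def calibrate(raw_points, scale, offset):
    """Map raw sensor coordinates onto the printed page.

    scale and offset would come from the initial alignment clicks."""
    ox, oy = offset
    return [(x * scale + ox, y * scale + oy) for x, y in raw_points]

def to_overlay(strokes):
    """Serialize strokes as a JSON annotation overlay that a viewer
    (or a converter) could attach to the original document."""
    return json.dumps({"annotations": [asdict(s) for s in strokes]})

# One scribble recorded by the clip-on sensor, on page 1:
raw = [(10, 10), (12, 11), (14, 13)]
stroke = Stroke(page=1, points=calibrate(raw, scale=0.5, offset=(20.0, 30.0)))
overlay = to_overlay([stroke])
```

Turning that overlay into an actual PDF annotation, or into text via handwriting-recognition, would be the real work - this only shows the glue in between.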

Feb 2, 2012

Justin Bieber falsely correlates with Influenza

We just became aware of a scientific paper by Aron Culotta (2010) evaluating data from the U.S. Centers for Disease Control and Prevention (CDC) on Influenza-Like Illnesses (ILI) and specific influenza-related keyphrases on Twitter (flu, cough, headache, sore throat...). The correlation of Twitter-based predictions of ILI development (after a training-phase to optimize the algorithm) with real data is amazing, giving proof of the concept of data-mining from social-media streams. While for a variety of analyzed phrases the results were comparably good, there is a word of caution from the authors: "These results show extremely strong correlations for all queries except for fever, which appears frequently in figurative phrases such as 'I've got Bieber fever'."
Besides the beauty of the demonstrated algorithms the paper gives a helpful overview of fundamental literature in this young field.
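The core of such a correlation check is simple enough to sketch. Here is a minimal Python version - the numbers are entirely made up by me for illustration; the real paper uses actual CDC data and a trained regression model, this is just the flavor of it:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented illustration data: weekly counts of flu-related tweets
# next to (equally invented) weekly ILI rates.
tweets = [120, 150, 310, 480, 460, 300, 180]
ili    = [1.1, 1.4, 2.9, 4.6, 4.4, 2.8, 1.7]

r = pearson(tweets, ili)
```

For the made-up series above the coefficient lands close to 1 - the kind of agreement that makes this sort of keyword-mining worth taking seriously.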

Feb 1, 2012

Pasta - e basta!

As you keep asking: this is my pasta. Handmade by me.
Photographed by me. Eaten by me.
Let's forget what we just learned (it is the carbohydrates, not the fat that makes us (them) fat.
Wonderfully explained in the infographic of the day on fastcodesign).
Pasta is not bad for you at all!
The unrivaled Maria Popova just circulated a breathtaking review of 'Pasta by Design' - an extremely ambitious and obviously beautifully illustrated book analyzing with rigor the geometrical shapes of almost 100 different types of pasta.
(and I was proud of being able to identify eight!)
Most importantly it is stated already in the introduction that pasta is made of durum wheat flour and water. Pasta!, um, Basta! The designers would never even attempt to touch any of these egg-infested derivatives or supposedly ecological or healthy experiments with rye or spelt flour (yes, I had to look this one up).
To the trained cook and passionate gourmet it is clear that different shapes of pasta serve different purposes: the intake of sauce, the bite, the haptic ... all depends on the correct shape.
While the book then goes on to classify that amazing variety of noodles by following the science of phylogeny (building a family tree based on morphological similarities), the highlights are the mathematical descriptions of the individual species - their simulation and graphic representation side by side with food stills of simple beauty.
To some the juxtaposition of mouthwatering food and scary mathematics might be too much to bear but some could get an inkling of what mathematics is doing when employed in natural sciences: far from claiming to accurately describe nature or to even break down nature into something cold and constructed, it illustrates the desire to find words for observations that go beyond "oh nice!", the sensation of mimicking nature. And maybe one or the other non-mathematician gets the flavor of what scientists are talking about when they describe a model or a theory as 'elegant', 'beautiful' - and thereby more probably 'true' than others.
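To give a taste of what such a mathematical description looks like (this example is mine, not lifted from the book): a fusilli-like spiral is, at its core, just a helix - two numbers and a parameter.

```python
import math

def helix(t, radius=4.0, pitch=1.2):
    """A fusilli-like helix: one illustrative parametric curve.

    radius controls how fat the spiral is, pitch how fast it climbs -
    both values here are invented, not the book's measurements."""
    return (radius * math.cos(t), radius * math.sin(t), pitch * t)

# Sample the curve over three full turns, 60 points per turn:
points = [helix(2 * math.pi * k / 60) for k in range(3 * 60 + 1)]
```

From curves like this one the book's renderings follow: sample the equation, sweep a cross-section along it, and the noodle appears - which is exactly the "finding words for observations" the paragraph above is getting at.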

Jan 30, 2012

Write a book!

As the second highest authority - the pope - is now fiddling with social media and feels competent to give advice (see "you don't have to be brain-dead to give advice" and the comment "but it helps"),
it might be good to step back, look at all the soc-med-mess, take a deep breath and ask yourself:
is this what I wanted to get involved in, am I gaining something? Anything? Am I wasting my time?
There are studies in abundance showing how much intellectual potential is blocked by wading through the net in search of information, people and networks.
Since most of the air-brained blog-posts out there were written in the hope of getting attention, building a following, and getting heard - written to build a reputation - it looks a lot like a big room full of kids yelling, jumping, kicking and scratching to get noticed. But while they all scream their lungs out, this information-inferno is just numbing.

This is where intelligent filtering sets in.

We look for content. Original, 'manufactured' content. Something that is an intellectual, artistic, emotional output of a real author. Social media still have that scent of snake-oil around them, because they were hijacked early on by sales- and marketing-people. In the beginning there was all this trading of followers, the SEO of blogs, manipulation, trickery, pure magic and, yes, snake-oil, to get as huge a footprint as possible on the net.
The discussion about the future of classical publications, print-media, cinema etc. has helped us rethink content. The difference between journalism and googling becomes obvious.
I was thrilled to read the testimonial for rock-solid content by one of the seniors in the pond of social-media-sharks, James Altucher, who manages to attract and entertain a huge crowd by delivering unique content and skillful marketing. His latest entry summarizes pretty well what counts in the struggle for net-reputation: if you want to get noticed and *stay* noticed, produce real value.
Write a book! (damn! - as James would add)

Jan 10, 2012

Augmented depression

If you are about to wallow in a moment of tearful self-pity, I highly recommend enhancing the experience by this little sound-track: "Why me?! - Again!"

- enjoy.

The mining of crowd-sources

While some are wondering why scientists appear not to appreciate tools like Twitter to communicate, there is more proof of the value of the meta-information that can be plucked from the stream of micro-utterances.
Roughly two years ago we speculated about possibilities to extract (useful) crowd-information. Increasing mentions of umbrellas/rain - together with localization - could, for example, give valuable input to the weather forecast. As we put it in 'Meta Mining': "If the noise of individual utterances will be systematically analyzed for overlying macro-structures and for phase-transitions from the purely random to the organized, there will be more information gained than individually and knowingly put in. The sheer boundless chatter of Twitter and alike corresponds to the cells, the web is the organism." We were encouraging readers to step back and look at structures rather than at individual tweets.
In a recent report in "The American Journal of Tropical Medicine and Hygiene" that is reviewed in Nature, scientists show how analysis of Twitter-messages would have been a quick way to detect and track the deadly cholera outbreak in Haiti - simply by looking at the number of 'cholera' posts on Twitter. They found a stunning correlation between the official number of cases and the volume of chatter related to that.
This is only one more - scientifically proven - example for the potential of the data deluge.
It is a matter of time until publicly available analysis-tools mine crowd-sources like Twitter (or even de-personalized sms…) for real-time input to forecasting tools.
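The first step of any such tool is almost embarrassingly simple: count the chatter. A toy sketch in Python, with an invented mini-stream standing in for the real Twitter firehose:

```python
from collections import Counter

def daily_keyword_volume(messages, keyword):
    """Count messages mentioning a keyword, bucketed by day.

    messages: iterable of (date_string, text) pairs - a minimal,
    made-up stand-in for a real social-media stream."""
    counts = Counter()
    for day, text in messages:
        if keyword in text.lower():
            counts[day] += 1
    return dict(counts)

# Invented sample stream:
stream = [
    ("2010-10-20", "cholera cases reported near the river"),
    ("2010-10-20", "weather is fine today"),
    ("2010-10-21", "Cholera is spreading fast"),
    ("2010-10-21", "more cholera patients arriving"),
]
volume = daily_keyword_volume(stream, "cholera")
```

Everything interesting - localization, de-duplication, telling a real outbreak report from sarcasm - sits on top of this. But the raw signal that correlated with the Haiti case numbers really is just a count per day.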

Jan 1, 2012

Google's personalized ads kill my relationship

As I sit side-by-side with my wife, both of us working on our very own projects at our very own computers, she starts telling me about some coding-trick she just discovered. I hiss "could you, pleeze!, let me work on my stuff - I am busy!!" - and as she glances over she sees that personalized ad on the financial website I am just sifting through: some voluptuous, smiling girls and the line "looking for an exciting date?". AHA! You are busy, huh?
Damn! What can I do about the Ads google pushes there?
Oh, she is not stupid, types the same URL in her identical browser and at the position where I have that click-for-chicks-Ad she gets a cute little advertisement on health-food. :/
I reload my site. "time for nature - discover Morocco" - ha! She reloads: "investment-strategies", I: "cars"…. and on we go, fortunately diving deep into randomness. So, obviously, our privacy-settings are good enough to feed us not-so-personalized advertisements. And so we go on working on our projects … while I reload the site a few more times to get another glance at the dangerously attractive first ad.
Thinking about it, personalized advertisements are not bad after all.
When you happen to live in Berlin and have shown interest in theater, isn't it better to get some commercial suggestions on upcoming shows in your neighborhood than annoying stuff about cruise-ships and wellness-hotels? The unease results from the background data-collection and complex evaluation that Google does while you use its browser. It does not only help them target you for ad-campaigns. Who knows what else they might be interested in.
The good thing: you have a choice. You can opt-out of the personalized ad service from google - and supposedly the data-collection is stopped then.