From SciAm (via Boing Boing) comes news of an ocular implant designed to act as a telephoto system and correct macular degeneration. It doesn’t seem like a huge leap from this little series of lenses implanted in the pupil to the same sort of thing being used for cosmetic enhancement purposes.
So I made it to Cafe Scientifique again this month. I’m two for two since I started going; here’s hoping I make it to more.
The topic this month was “Nature vs Nurture Revisited: New research is changing the age-old debate.” The expert panel was Robert Gerlai, Ph.D. (Dept. of Psychology, UofT) and Christopher E. Pearson, Ph.D. (Dept. of Genetics and Genome Biology, SickKids Research Institute). I expected there to be more talk about specific issues of NvN, like sexual orientation or intelligence. Instead, the talk mostly dealt (as far as I saw) with the biological mechanisms that come into play.
IANAEB, but in a nutshell there are two processes being discussed here: Darwinian evolution and Lamarckian evolution1. Darwinian evolution is of course the process by which a gene or group of genes that produces a phenotype conducive to its own propagation will tend to be better represented in the gene pool, beating out its allele rivals. Lamarckian evolution is the theory that an organism can pass on traits it has acquired during its own lifetime.
Sounds weird, right? Our DNA sequence is fixed, so how could we possibly pass on genes we weren’t born with? Well, it turns out that cytosine (the “C” in GATTACA) can be methylated or de-methylated by the introduction of certain substances, and that this new form can have different phenotypic effects. This kind of heritable chemical modification is the subject of epigenetics. So what happens is, someone eats a diet2 containing a substance that toggles the methylation state of a certain sequence in some of their cells, including germ cells. It doesn’t affect them, or not much, because they have already developed into a human. However, the new methylation state is persistent, so when one of those germ cells develops into another person, there is a chance that the altered sequence will trigger some aberrant effect.
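The inheritance story above can be sketched in a few lines of code. This is a deliberately crude toy model, not real biology: the loci, the diet effect and the all-or-nothing inheritance are all invented for illustration.

```python
# Toy model of heritable methylation state. Sites are named loci;
# True means methylated. Real epigenetics is vastly more complex.

def apply_diet(epigenome, affected_sites):
    """A dietary substance toggles methylation at certain loci.
    The change hits somatic AND germ cells, so it is heritable."""
    return {site: (not state if site in affected_sites else state)
            for site, state in epigenome.items()}

def offspring(germline_epigenome):
    """The child starts with the parent's germ-cell methylation pattern."""
    return dict(germline_epigenome)

# The parent is born unmethylated at two hypothetical loci.
parent = {"locus_a": False, "locus_b": False}

# A substance in the parent's diet methylates locus_a ...
parent = apply_diet(parent, {"locus_a"})

# ... and the altered state persists into the next generation, where it
# may have a phenotypic effect the parent never showed.
child = offspring(parent)
print(child["locus_a"])  # True — inherited with no change to the DNA sequence
```

The point of the sketch is just that the heritable information lives in the methylation pattern, not in the sequence itself.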
This has all sorts of implications, some good and some bad. On the one hand, we might find that certain diseases are caused or exacerbated by a certain methylated state on a certain gene, and that simply making sure that the population gets a certain amount of some nutrient will reduce the incidence of that disease. On the other hand, this may increase the amount of medicating that we do. Drug companies and “alternative medicine” manufacturers may jump on the bandwagon, marketing products that claim to “de-methylate your cancer genes” or whatever, playing on people’s ignorance and fear to get them to eat more pills. What’s more, there may be even more pressure than there already is on women to treat their bodies like baby machines and to make sure that even before they start thinking about reproducing they maintain a diet that will produce the optimal methylation state in their bodies.
Just as with any new technology or discovery, the recent findings in epigenetics contain potential for a lot of good and a lot of evil. The talk today was very informative, and I’m grateful to the two panelists for taking the time to make it out. Today I learned about a facet of evolution that I would never have imagined existed.
2 This could also be the result of other environmental influences, such as atmospheric composition, but diet seems the most effective route, since we have built-in mechanisms for distributing food’s components around our bodies.
An Israeli group is raising the bar for people working with artificial neural networks. Yael Hanein of Tel Aviv University and her team have devised a way to get neuron clusters to arrange themselves in neat patterns on a sheet of quartz, by using 100-μm-thick bundles of — you guessed it — nanotubes. This greatly increases the efficiency and lifespan of these neuron clusters, and is the first step toward sophisticated biosensors, neuronal grafting and — as one of the commenters on the New Scientist article said — “Cylons that behave like mice”.
As an alternative to a flexible wearable display, how about a subdermal display? Or else one painted onto the skin? OhmyNews.com, a source for some occasionally startling tech news, reports that this may be in our future.
There was only a brief mention of my favourite option (video), but the article gives more detail on another technique, which would paint three thin layers onto the skin: two conductive matrices aligned orthogonally to each other, with a special ink solution between them. It’s a pretty cool read, and it lends hope to those of us who want digital clocks glowing through our skin before too long.
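Two orthogonal conductive grids with a light-emitting ink between them amount to a passive-matrix display: a cell emits light only where a driven row crosses a driven column. Here’s a minimal sketch of that addressing scheme; the function name and grid size are my own invention, not anything from the article.

```python
# Sketch of passive-matrix addressing for the painted-on display: two
# orthogonal conductive grids, electroluminescent ink in between. A cell
# lights up only where a driven row crosses a driven column.

def frame(rows, cols, driven_rows, driven_cols):
    """Return the on/off state of every cell for one scan instant."""
    return [[(r in driven_rows) and (c in driven_cols)
             for c in range(cols)]
            for r in range(rows)]

# Drive row 1 and column 2 of a 3x4 grid: exactly one cell lights up.
grid = frame(3, 4, driven_rows={1}, driven_cols={2})
lit = [(r, c) for r in range(3) for c in range(4) if grid[r][c]]
print(lit)  # [(1, 2)]
```

A full image would be drawn by scanning one row at a time and driving whichever columns should be lit in that row, fast enough that persistence of vision fills in the rest.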
Before you get too excited, this technology has not actually been implemented yet. But a recent article on OhmyNews.com describes the way a simple robot may be printed on a standard printer, modified to use special polymer inks. There’s a rundown of all the components required, and details on how each may be printed. Did you know it is possible to print a 1.5 volt battery? I didn’t, until I read this article.
Imagine how much fun kids are going to have: design a papercraft robot on the computer, print it out, program it to fly around and follow you down the street. Program two robots to fight each other. Make an origami rosebud and watch it bloom. Just fill in the blanks: Make a _______, program it to _______ and watch it _______.
Just about any futurist has sexy dreams about nanotechnology. Imagine: flawlessly building items on the molecular level from base elements. Solving all our scarcity problems: energy, resources, food. With the ability to create anything from the molecules up we are as gods.
Want to know more about nanotech?
- Nanoethics.org is a good place to start. It’s a mostly non-technical site that gives a good overview of nanotech, as well as a rundown of the pros and cons.
- The International Nanotechnology and Science Network is a research group that delves into the technology and implications of nanotech.
- K. Eric Drexler maintains a site primarily about nanotechnology and distributed computing. There are oodles of technical articles for you to enjoy.
- Nanotechnology Now is a high-volume news site that looks at all things nano-scale. Printable robots, fuel cell tech, nanomedicine — it’s all there.
Last month I talked about swarming robots learning to play football together. More recently, NASA and MIT sent a little satellite called a “droid” up to the ISS. They’re in the process of teaching it to navigate around the station; once that’s done, they’re going to send up some more identical droids and teach them to fly in formation.
Of course, they’re not just flying them all with one remote control — that would be cheating. Instead, the satellites will act as a robot swarm, communicating position, velocity and other relevant data to each other and operating completely autonomously. Once the programmers on the project (dubbed SPHERES) perfect the droids’ navigational capabilities, they’ll begin teaching them to do other things, such as building and repairing space-borne structures.
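Decentralized formation-keeping of this sort can be sketched very simply: each satellite hears its neighbours’ broadcast positions and steers toward its own slot relative to the group, with no central controller. Everything below — the control rule, the gain, the 2D coordinates — is my own toy illustration, not the actual SPHERES control law.

```python
# Sketch of decentralized formation-keeping: each satellite moves toward
# the swarm centroid plus its assigned offset, using only the positions
# the others broadcast. Gain and geometry are invented for illustration.

def step(positions, offsets, gain=0.5):
    """One control update for every satellite in the swarm."""
    n = len(positions)
    cx = sum(p[0] for p in positions) / n   # centroid from broadcast data
    cy = sum(p[1] for p in positions) / n
    new = []
    for (x, y), (ox, oy) in zip(positions, offsets):
        tx, ty = cx + ox, cy + oy           # this satellite's target slot
        new.append((x + gain * (tx - x),    # nudge toward the slot
                    y + gain * (ty - y)))
    return new

# Three scattered satellites converge on a line, spaced 2 units apart.
pos = [(0.0, 0.0), (5.0, 1.0), (1.0, 4.0)]
slots = [(-2.0, 0.0), (0.0, 0.0), (2.0, 0.0)]
for _ in range(20):
    pos = step(pos, slots)
# pos is now very close to the line formation around the swarm centroid
```

Because the offsets sum to zero, the swarm’s centroid never moves; each satellite just slides into its slot, and no single unit is in charge.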
Rosie. R Daneel. K-9. Data. R2-D2. I think you know what I’m getting at.
But no Buffybots.
Here’s the latest advance in tactile sensors. From the SciAm article:
The device, a so-called electroluminescent thin film, glows in response to applied pressure. The result is a finely detailed image of the texture of any object that touches the film. [...] Because the sensor produces data in the form of an optical image, the data can be quickly and easily collected by simply photographing the image. This represents a major step forward in the ease and efficiency of collecting information from tactile sensors. Quick data collection is critical to performing real-time tasks, for example grasping a tool with a robotic arm. If the tool starts to slip, the image produced by the electroluminescent film immediately shows the tool’s motion, and the robot’s grip can then be adjusted to prevent it from falling.
Of course, this technology could be applied to either robots or cyborgs, although the latter would involve translating the image received into impulses the human brain can understand.
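The slip-detection step the quote describes is easy to sketch: compare two successive frames of the film’s glow, and if the bright contact patch has shifted, the object is slipping and the grip should tighten. The frames, threshold and centroid method below are my own toy setup, not anything from the SciAm article.

```python
# Sketch of slip detection from the film's optical output: if the contact
# patch in the pressure image moves between frames, the object is slipping.

def centroid(image):
    """Intensity-weighted centre of the pressure image (row, col)."""
    total = sr = sc = 0.0
    for r, row in enumerate(image):
        for c, v in enumerate(row):
            total += v
            sr += r * v
            sc += c * v
    return (sr / total, sc / total)

def detect_slip(prev, curr, threshold=0.5):
    """True if the contact patch moved more than `threshold` pixels."""
    (r0, c0), (r1, c1) = centroid(prev), centroid(curr)
    return ((r1 - r0) ** 2 + (c1 - c0) ** 2) ** 0.5 > threshold

frame_a = [[0, 0, 0, 0],
           [0, 9, 9, 0],
           [0, 9, 9, 0],
           [0, 0, 0, 0]]
frame_b = [[0, 0, 0, 0],
           [0, 0, 9, 9],   # the patch has shifted one column to the right
           [0, 0, 9, 9],
           [0, 0, 0, 0]]

print(detect_slip(frame_a, frame_b))  # True — time to tighten the grip
```

Since the sensor’s output is already an image, the whole control loop reduces to photographing the film and running a comparison like this one, which is exactly why the quick optical readout matters for real-time grasping.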