Wednesday, July 29, 2009

Chubby Knowledge

For those of you going through Fat Knowledge withdrawal pains, check out Chubby Knowledge. It has the same great taste as Fat Knowledge with half the calories.

Long time reader Rebelfish created it as a "sequel":

Chubby Knowledge is a continuation of the Fat Knowledge blog looking at our planet's and country's most important scientific, technological, and hilarious issues.

I wish Rebelfish the best of luck with the new blog, and may all the knowledge he drops be fat chubby.

Tuesday, July 07, 2009

That's All Folks

That's it for this blog.

I want to thank all the loyal Fat Knowledge readers out there who took the time to read all the posts that I wrote. Special thanks to all those who took time to leave a thoughtful comment or send an email or link to a post on your own blog.

I never thought I would write 1,920 posts, have over 100 RSS readers, or get over 1/3 of a million visitors and over 1/2 a million page views. While those numbers are dwarfed by many other blogs out there, my goal was never to get lots of readers. Instead the blog was a way to force myself to think through interesting and important issues and share them with others. I was glad if anyone wanted to read along and amazed at how many of you did.

May all your knowledge be fat.

On Web and iPhone, a Tool to Aid Careful Shopping

These days, every skin lotion and dish detergent on store shelves gloats about how green it is. How do shoppers know which are good for them and good for the earth?

Hence GoodGuide, a Web site and iPhone application that lets consumers dig past the package’s marketing spiel by entering a product’s name and discovering its health, environmental and social impacts.

“What we’re trying to do is flip the whole marketing world on its head,” said Mr. O’Rourke. “Instead of companies telling you what to believe, customers are making the statements to the marketers about what they care about.”

The next version of the iPhone will enable people to scan bar codes to get scores, rather than type in the product’s name.

I wrote about GoodGuide before and I think giving social and environmental information to people at the point of purchase (ideally on a mobile phone via a bar-code scanner) is a great idea. I have been writing about this concept since way back in April of '06, and all the posts with the Labeling label are about it.
Although the GoodGuide Web site, which started in September, had only 110,000 unique visitors in April, Mr. O’Rourke is encouraged that it is growing about 25 percent a month. Lately, interest in GoodGuide has begun to extend beyond techies and the Whole Foods crowd to the Wal-Mart crowd, as Mr. O’Rourke put it. One sign of that broader appeal: Apple recently featured the app in its iPhone ads.

GoodGuide’s office, in San Francisco, has 12 full-time and 12 part-time employees, half scientists and half engineers. They have scored 75,000 products with data from nearly 200 sources, including government databases, studies by nonprofits and academics, and the research by scientists on the GoodGuide staff. There are still holes in the data that GoodGuide seeks to fill.

Some companies, including Clorox and SC Johnson, have agreed in recent months to reveal more about the ingredients in their products because of gathering consumer concern. That will enable GoodGuide to fill gaps in its data. Federal law does not require makers of household products to list all ingredients.

This summer, GoodGuide will add a deeper database for users who want more detail by, for example, reading the academic studies on which ratings may be based.

It is going to take a while to catch on, and only a fraction of the population will ever use such an application, but even that will be enough to make manufacturers take notice and work toward higher ratings.

The more data companies make available, the more accurate the ratings will be. And more detail on how the scores were set would be greatly appreciated by those of us who really want to dig in and understand what they mean.

The article talks about how the business model is still evolving: GoodGuide is run for profit, currently makes its money by linking to Amazon, and in the future plans to charge companies like Whole Foods that want to put the ratings in their stores. I am curious whether a non-profit model might work better here, with funding from philanthropists and much of the work done by volunteers. Or maybe a Consumer Reports model, where end users pay for a subscription.

via NY Times

Sunday, July 05, 2009

Chris Anderson on the Future of Free

Chris Anderson speaks about his latest book Free: The Future of A Radical Price:

“What it says is that anything that becomes digital will become free — not to say that everything online will be free, but that everything will be available in a free version, so that fundamentally, you’re either going to be competing with free or you’re going to be making a product free and selling something else, because the marginal cost for these products is the same for everybody — which is to say zero.”

Anderson sees many areas of digital content as obeying this law, including music, video, and video games (the big three “shiny disc” industries), news, books, and e-mail. Under Anderson’s model, people will continue to pay good money to save time (that is, those who have more money than time will), to lower their risk (such as paying to assure that their Second Life land will still be there, or that their operating system will be supported), because they love something (such as buying virtual items in free videogames), or to increase their status in a community.

I think the topic of digital economics is very interesting, and I looked at it in depth earlier in my 8 funding models to support digital goods creation post.

I agree with him about paying to save time (as I wrote a long time ago, time is more valuable than money in the attention economy), but that opens up a big loophole: people will still use iTunes, Netflix and Amazon to purchase media because it is a lot faster than trying to find a free version somewhere. You can also use this to justify in-game purchases (it saves time to buy items rather than earn them) and software (better to pay for the standard one everyone knows than to learn a new one).

I am not so sure about his other three reasons. I think Second Life and virtual item purchases show that in a closed system you can still charge for digital goods, as you have complete control over who can get what and how (iPhone apps being another example). And paying for support is not really paying for a digital good, as you are paying for someone's time.

Anderson comes up with the following rules for media companies trying to figure out how to make money online:

1. The best model is a mix of free and paid.
2. You can’t charge for an exclusive that will be repeated elsewhere.
3. Don’t charge for the most popular content on your site.
4. Content behind a pay wall should appeal to niches; the narrower the niche, the better.

This is somewhat counterintuitive because it means media sites that want to charge for content should charge for their niche stuff instead of their most popular content. But that is exactly the right way to look at it if you want to maximize your advertising revenues. Let the popular content be paid for by advertising, and the niche, exclusive content can be sold to fewer people at a higher price. Anderson, whose last book was The Long Tail, predicts in media: “The head of the curve will be free and the tail of the curve will be paid.”

Interesting take on giving the head away for free and charging for the tail (hmm, that sounds dirty). The hardcore fans, those with a great interest and those who need the information for work will be willing to pay for additional content that is very specific and isn't commonly available (I am thinking of music artists' blogs, ESPN Insider and WSJ content).

I also think you could charge for earlier access. Give those who subscribe access to content hours or days before it is released for free. Let them see the exclusive material before everyone else gets it.

I think this analysis also misses the fact that the free market might not be the best way to support digital content production. Funding from government, donations and having content created by volunteers might be the better way to go in many cases.

via Wired and TechCrunch

Saturday, July 04, 2009

Satellite Detects Red Glow To Map Global Ocean Plant Health

Researchers from Oregon State University, NASA and other organizations said today that they have succeeded for the first time in measuring the physiology of marine phytoplankton through satellite measurements of its fluorescence – an accomplishment that had been elusive for years.

With this new tool and the continued use of the MODIS Aqua satellite, scientists will now be able to gain a reasonably accurate picture of the ocean's health and productivity about every week, all over the planet.

Data such as this will be critically important in evaluating the effect on oceans of global warming, climate change, desertification and other changes, the researchers said. It will also be a key to determining which areas of the ocean are limited in their productivity by iron deficiency – as this study just showed the Indian Ocean was.

"Until now we've really struggled to make this technology work and give us the information we need," said Michael Behrenfeld, an OSU professor of botany. "The fluorescence measurements allow us to see from outer space the faint red glow of tiny marine plants, all over the world, and tell whether or not they are healthy. That's pretty cool."

To grow, however, these phytoplankton absorb energy from the sun, and then allow some of that energy to escape as red light that is called fluorescence. The new measurements of fluorescence, literally the dim glow that these plants give off, will help complete the understanding of ocean health on a much broader and more frequent basis.

Some surprises are already in.

It was known, for instance, that parts of the equatorial Pacific Ocean, some regions around Antarctica and parts of the sub-Arctic Pacific Ocean below Alaska were limited in production by the poor availability of iron. The newest data, however, show that parts of the northern Indian Ocean during the summer are also iron limited – a phenomenon that had been suggested by some ocean and climate models, but never before confirmed.

Amazing that they can measure the physiology of one of the smallest organisms on the planet from a satellite way above the Earth. This data will also help determine where fertilizing the oceans with iron would be most applicable.

via ScienceDaily

New Twitter Research: Men Follow Men and Nobody Tweets

We examined the activity of a random sample of 300,000 Twitter users in May 2009 to find out how people are using the service. We then compared our findings to activity on other social networks and online content production venues. Our findings are very surprising.

We found that an average man is almost twice as likely to follow another man as a woman. Similarly, an average woman is 25% more likely to follow a man than a woman. Finally, an average man is 40% more likely to be followed by another man than by a woman. These results cannot be explained by different tweeting activity - both men and women tweet at the same rate.

These results are stunning given what previous research has found in the context of online social networks. On a typical online social network, most of the activity is focused around women - men follow content produced by women they do and do not know, and women follow content produced by women they know. Generally, men receive comparatively little attention from other men or from women. We wonder to what extent this pattern of results arises because men and women find the content produced by other men on Twitter more compelling than on a typical social network, and men find the content produced by women less compelling (because of a lack of photo sharing, detailed biographies, etc.).

Interesting. I wonder how this compares with blogs.

Twitter's usage patterns are also very different from a typical on-line social network. A typical Twitter user contributes very rarely. Among Twitter users, the median number of lifetime tweets per user is one. This translates into over half of Twitter users tweeting less than once every 74 days.

At the same time there is a small contingent of users who are very active. Specifically, the top 10% of prolific Twitter users accounted for over 90% of tweets. On a typical online social network, the top 10% of users account for 30% of all production. To put Twitter in perspective, consider an unlikely analogue - Wikipedia. There, the top 15% of the most prolific editors account for 90% of Wikipedia's edits. In other words, the pattern of contributions on Twitter is more concentrated among the few top users than is the case on Wikipedia, even though Wikipedia is clearly not a communications tool. This implies that Twitter resembles a one-way, one-to-many publishing service more than a two-way, peer-to-peer communication network.

While all follow a long tail, some long tails are longer than others.
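
The concentration figures quoted above are easy to play with. As a toy illustration (the sample below is made up for the example, not the study's data), here is how you would compute the share of all tweets produced by the top 10% of users:

```python
def top_share(counts, fraction=0.10):
    """Share of total output produced by the top `fraction` of users."""
    ranked = sorted(counts, reverse=True)
    top_n = max(1, int(len(ranked) * fraction))
    return sum(ranked[:top_n]) / sum(ranked)

# Hypothetical sample: 90 near-silent users (1 lifetime tweet each,
# matching the median the study reports) and 10 prolific ones.
sample = [1] * 90 + [100, 120, 150, 200, 250, 300, 400, 500, 600, 700]
print(round(top_share(sample), 2))  # -> 0.97
```

With a sample this skewed, the top 10% produce about 97% of the tweets, in line with the "top 10% account for over 90%" finding.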

via Harvard Business Blog (and some more Twitter stats from TechCrunch here)

Interesting Articles of the Week

How the US surplus became a deficit.

Why e-books are stuck in a black-and-white world.

Know Thyself: Tracking Every Facet of Life, 24/7/365.

For first time over 1/2 of energy funding goes to clean energy (which has quadrupled in last 4 years).

Giving up my iPod for a Walkman.

Friday, July 03, 2009

First Hybrid Solar Power Plant Opens in Israel

Concentrating Solar Power (CSP) plants are an amazing, wonderful, renewable energy technology, as long as the sun is shining. However, solar power alone cannot provide on-demand power, especially in the case of off-grid applications. Aora Solar, out of Yavne, Israel, is nearly finished building the world's first ever solar hybrid plant, which will combine concentrated solar power with a hybrid microturbine to generate power 24 hours a day. This technology could help provide off-grid communities the necessary power without having to run miles of costly transmission lines.

Installation of the new hybrid system is less than 10 days away from being completed at the Kibbutz Sammar in Israel. Once up and running on June 24th, the plant will generate 100 kW of on-demand power plus 170 kW of thermal power. The plant consists of 30 heliostats (mirrors) that track the sun and direct its rays up to the 30-meter tall tower, where all the sunlight from the heliostats is concentrated. This concentrated sunlight heats compressed air, which drives an electric turbine. The tower itself is a welcome change from other power towers we have seen in the past - it actually looks good with its tulip flower shape.

The hybrid part of the plant allows for on-demand power due to its inline microturbine. When the sun has set for the day or if it is cloudy, biodiesel, natural gas, or bio fuels can be used to run the microturbine, which then drives the electric turbine. The hybrid system has the capacity to power 70 homes 24/7.

A hybrid system like this has the potential to provide distributed generation or off-grid power to communities, companies or factories. As production of bio fuels becomes more efficient and sustainable, we’re hoping to see more and more hybrid solar concentrating systems.

I like how it has the ability to use other fuels when the sun isn't shining, which solves the intermittency problem of solar power. I wonder how much it cost to build and what they will charge per kWh.
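
Quick arithmetic on the article's figures (assuming the 100 kW is spread evenly across homes and runs around the clock) shows the per-home numbers are plausible:

```python
# 100 kW of on-demand power shared by the 70 homes the plant supplies
kw_total, homes = 100, 70
kw_per_home = kw_total / homes            # average continuous draw
kwh_per_home_per_day = kw_per_home * 24   # daily energy per home
print(f"{kw_per_home:.2f} kW per home, about {kwh_per_home_per_day:.0f} kWh/day")
```

Roughly 1.4 kW and 34 kWh per home per day, which is in the ballpark of a typical household's average consumption.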

via Inhabitat and TreeHugger

Sylvia Earle's TED Prize Wish To Protect Our Oceans



Sylvia Earle explains her TED Prize Wish:
To bring knowledge of our oceans to a wide audience and galvanize support in favor of marine protected areas.

We invite a variety of responses from TEDsters in pursuit of this goal:

* Development of technologies that would permit deep sea exploration in order to make the invisible visible
* Supporting (or organizing) expeditions to explore proposed “hope spots”
* Helping make the scientific case for a network of MPAs
* Identifying and exploring candidate MPAs
* Creating a media campaign in support of MPAs
* Backing the upcoming Oceans documentary to ensure wide viewership

I think this is a great wish and hope she is successful in her endeavor. As loyal Fat Knowledge readers know, I am a huge fan of deep sea exploration (mapping the bottom of the ocean is #4 on my list of scientific achievements I am looking forward to) and wish all of NASA's funds would be redirected here. Her Deep Search Foundation is a worthy cause as well.

Aside: It pains me to see that another of the TED prizes went to Jill Tarter, who wants to improve SETI. If there is a bigger waste of brain power in the world than SETI, I don't know what it is. While the chance of finding strange and fascinating new life at the bottom of the ocean is almost 100%, these folks are spending their time trying to find ET, which has worse odds than the lottery.

The more MPAs the better. I believe that MPAs allow fishermen to have larger catches as well (I am sure I have read this, but can't find the link right now), which gives them an economic rationale too. The World Database on Protected Areas is a cool Google Maps mashup that shows where all the MPAs in the world are located. Hopefully we can go from 1% of the sea being protected to 5% in the next 50 years.

Deforestation Causes 'Boom-and-bust' Development In The Amazon

Clearing the Amazon rainforest increases Brazilian communities' wealth and quality of life, but these improvements are short-lived, according to new research published today (12 June) in Science. The study, by an international team including researchers at the University of Cambridge and Imperial College London, shows that levels of development revert back to well below national average levels when the loggers and land clearers move on.

Since 2000, 155 thousand square kilometres of rainforest in the Brazilian Amazon have been cut down for timber, burnt, or cleared for agricultural use. Forest clearance rates have averaged more than 1.8 million hectares per year (roughly the area of Kuwait), and the deforestation frontier is advancing into the forest at a rate of more than four football fields every minute.

The researchers' analysis revealed that the quality of local people's lives – measured through levels of income, literacy and longevity, as mentioned above – increases quickly during the early stages of deforestation. This is probably because people capitalise on newly available natural resources, including timber, minerals and land for pasture, and higher incomes and new roads lead to improved access to education and medical care, and all round better living conditions.

However, the new results suggest that these improvements are transitory, and the level of development returns to below the national average once the area's natural resources have been exploited and the deforestation frontier expands to virgin land. Quality of life pre- and post-deforestation were both substantially lower than the Brazilian national average and indistinguishable from one another.

This article suggests that greed isn't the problem here but rather short-sightedness: even from a selfish perspective, it is not in anyone's long-term interest to cut down the trees. Instead of using an environmental argument to stop those who would cut down the forest, it would be more effective to show how doing so is against their own long-term interest.
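
As a sanity check on the quoted clearance rate, you can convert 1.8 million hectares per year into area cleared per minute (assuming a full-size soccer pitch of roughly 7,140 square metres):

```python
HA_TO_M2 = 10_000
MINUTES_PER_YEAR = 365 * 24 * 60
FIELD_M2 = 7_140  # rough area of a full-size soccer pitch

m2_per_minute = 1.8e6 * HA_TO_M2 / MINUTES_PER_YEAR
fields_per_minute = m2_per_minute / FIELD_M2
print(round(fields_per_minute, 1))  # -> 4.8
```

That works out to almost five football fields a minute, consistent with the article's "more than four".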

via ScienceDaily

Thursday, July 02, 2009

Netflix Ratings and The Napoleon Dynamite Problem

Bertoni says it’s partly because of “Napoleon Dynamite,” an indie comedy from 2004 that achieved cult status and went on to become extremely popular on Netflix. It is, Bertoni and others have discovered, maddeningly hard to determine how much people will like it. When Bertoni runs his algorithms on regular hits like “Lethal Weapon” or “Miss Congeniality” and tries to predict how any given Netflix user will rate them, he’s usually within eight-tenths of a star. But with films like “Napoleon Dynamite,” he’s off by an average of 1.2 stars.

The reason, Bertoni says, is that “Napoleon Dynamite” is very weird and very polarizing. It contains a lot of arch, ironic humor, including a famously kooky dance performed by the titular teenage character to help his hapless friend win a student-council election. It’s the type of quirky entertainment that tends to be either loved or despised. The movie has been rated more than two million times in the Netflix database, and the ratings are disproportionately one or five stars.

Worse, close friends who normally share similar film aesthetics often heatedly disagree about whether “Napoleon Dynamite” is a masterpiece or an annoying bit of hipster self-indulgence. When Bertoni saw the movie himself with a group of friends, they argued for hours over it. “Half of them loved it, and half of them hated it,” he told me. “And they couldn’t really say why. It’s just a difficult movie.”

Mathematically speaking, “Napoleon Dynamite” is a very significant problem for the Netflix Prize. Amazingly, Bertoni has deduced that this single movie is causing 15 percent of his remaining error rate; or to put it another way, if Bertoni could anticipate whether you’d like “Napoleon Dynamite” as accurately as he can for other movies, this feat alone would bring him 15 percent of the way to winning the $1 million prize. And while “Napoleon Dynamite” is the worst culprit, it isn’t the only troublemaker. A small subset of other titles have caused almost as much bedevilment among the Netflix Prize competitors. When Bertoni showed me a list of his 25 most-difficult-to-predict movies, I noticed they were all similar in some way to “Napoleon Dynamite” — culturally or politically polarizing and hard to classify, including “I Heart Huckabees,” “Lost in Translation,” “Fahrenheit 9/11,” “The Life Aquatic With Steve Zissou,” “Kill Bill: Volume 1” and “Sideways.”

I wonder if the problem isn't the movies so much as the rating system itself. Instead of a single 1-5 star rating, maybe they should also include a score for the variance, or list the probabilities that you will rate the movie at each of the 1-5 star levels.

When you looked at Napoleon Dynamite, it could say you have a 30% chance of rating it 1 star, 10% for 2 stars, 20% for 3 stars, 10% for 4 stars and 30% for 5 stars. X-Men, on the other hand, might look like 5% for 1 star, 20% for 2 stars, 50% for 3 stars, 20% for 4 stars and 5% for 5 stars. While on average you are likely to give both a 3-star rating, you could choose whether to take a gamble on a movie that you might love, or stick with seeing one that you are likely to find very average.
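
A sketch of what that would look like, using the hypothetical distributions above: both movies have the same expected rating, but the spread (standard deviation) captures how much of a gamble each one is.

```python
def mean_and_spread(dist):
    """dist maps star value (1-5) to probability; returns (mean, std dev)."""
    mean = sum(star * p for star, p in dist.items())
    var = sum(p * (star - mean) ** 2 for star, p in dist.items())
    return mean, var ** 0.5

napoleon = {1: 0.30, 2: 0.10, 3: 0.20, 4: 0.10, 5: 0.30}
x_men    = {1: 0.05, 2: 0.20, 3: 0.50, 4: 0.20, 5: 0.05}

for name, dist in [("Napoleon Dynamite", napoleon), ("X-Men", x_men)]:
    mean, std = mean_and_spread(dist)
    print(f"{name}: mean {mean:.1f} stars, spread {std:.2f}")
```

Both come out to a mean of 3.0 stars, but Napoleon Dynamite's spread is about 1.6 stars versus 0.9 for X-Men, which is exactly the "love it or hate it" signal a single star rating hides.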

More interesting information on the Netflix Prize and how people are trying to solve it in the article.

I am also curious how much the recommendations for any one person differ between the various engines. Does one engine do a significantly better job of recommending movies for some people than the others? If so, Netflix should let you choose which engine to use, similar to how you could choose which staff member's picks to follow at the video store (but think twice before going with a Gene pick over a Vincent).

via NY Times

Nintendo Wii Vitality Sensor

And it looks like Nintendo's answer to Microsoft's Project Natal is... a pulse detector. Yep, Ninty's just announced the Wii Vitality Sensor, a finger sensor which attaches to the Wiimote to read your pulse. Details on how the accessory is going to be used in games are pretty vague, but it appears the idea is to check stress, help you relax, and just generally chill out and be groovy.

I am curious to see how Nintendo integrates this into games. I used the emWave before and it was kind of cool to see your heart beat show up on the computer screen; the biofeedback made it easier to relax. Other interesting takes on this include the Simmer Down Sprinter, a game by Philips Design, and the StressEraser. But I bet Nintendo will do a much better job of making it fun to use.

I am not a big fan of the wired finger clip though. I hope Nintendo makes this thing wireless and attaches it to your arm or ear instead.

via Engadget

Culture Requires a Dense Population

Mark Thomas and his colleagues at University College, London, suggest that cultural sophistication depends on more than just the evolution of intelligence. It also requires a dense population. If correct, this would explain some puzzling features of the archaeological record that have hitherto been put down to the arbitrary nature of what has survived to the present and what has not.

They are trying to explain the pattern of apparent false-starts to modern human culture. The species is now believed to have emerged 150,000-200,000 years ago in Africa and to have begun spreading to the rest of the world about 60,000 years ago. But signs of modern culture, such as shell beads for necklaces, the use of pigments and delicate, sophisticated tools like bone harpoons, do not appear until 90,000 years ago. They then disappear, before popping up again (and also sometimes disappearing), until they really get going around 35,000 years ago in Europe.

The team drew on an earlier insight that it requires a certain number of people to maintain skills and knowledge in a population. Below this level, random effects can be important. The probability of useful inventions being made is low and if only a few have the skills to fabricate the new inventions, they may die without having passed on their knowledge.

In their model, Dr Thomas and his colleagues divided a simulated world into regions with different densities of human groups. Individuals in these groups had certain “skills”, each with an associated degree of complexity. Such skills could be passed on, more or less faithfully, thus yielding an average level of skills that could vary over time. The groups could also exchange skills.

The model suggested that once more than about 50 groups were in contact with one another, the complexity of skills that could be maintained did not increase as the number of groups increased. Rather, it was population density that turned out to be the key to cultural sophistication. The more people there were, the more exchange there was between groups and the richer the culture of each group became.

Dr Thomas therefore suggests that the reason there is so little sign of culture until 90,000 years ago is that there were not enough people to support it. It is at this point that a couple of places in Africa—one in the southernmost tip of the continent and one in eastern Congo—yield signs of jewellery, art and modern weapons. But then they go away again. That, Dr Thomas suggests, corresponds with a period when human numbers shrank. Climate data provides evidence this shrinkage did happen.

According to Dr Thomas, therefore, culture was not invented once, when people had become clever enough, and then gradually built up into the edifice it is today. Rather, it came and went as the population waxed and waned.
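
The core dynamic is easy to sketch in code. This toy model (my own illustrative numbers, not the paper's) has each generation make noisy, mostly lossy copies of the best current skill; larger groups are more likely to contain at least one learner whose copy doesn't degrade, so they maintain and build skill while small groups lose it:

```python
import random

def run(pop_size, generations=50, seed=1):
    random.seed(seed)
    skill = 10.0
    for _ in range(generations):
        # Each learner copies the best practitioner, with noise biased
        # downward: most copies lose a little fidelity.
        copies = [skill + random.gauss(-1.6, 1.0) for _ in range(pop_size)]
        skill = max(0.0, max(copies))  # imitate the best copy; floor at zero
    return skill

print(f"small group: {run(5):.1f}, large group: {run(50):.1f}")
```

With these numbers the small group's skill decays toward zero while the large group's grows, which is the paper's point: below a critical population density, skills are lost faster than they accumulate.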

via The Economist

Wednesday, July 01, 2009

The Ecological Disaster That is Dolphin Safe Tuna

Is Dolphin Safe Tuna really better for the environment? Southern Fried Science makes the case that switching from following dolphins to using floating objects to catch tuna is actually worse overall for sea life.

Let’s compare the bycatch rates of floating object associated tuna and dolphin associated tuna.
“Ten thousand sets of purse seine nets around immature tuna swimming under logs and other debris will cause the deaths of 25 dolphins; 130 million small tunas; 513,870 mahi mahi; 139,580 sharks; 118,660 wahoo; 30,050 rainbow runners; 12,680 other small fish; 6540 billfish; 2980 yellowtail; 200 other large fish; 1020 sea turtles; and 50 triggerfish.”
“Ten thousand sets of purse seine nets around mature yellowfin swimming in association with dolphins, will cause the deaths of 4000 dolphins (0.04 percent of a population that replenishes itself at the rate of two to six percent per year); 70,000 small tunas; 100 mahi mahi; 3 other small fish; 520 billfish; 30 other large fish; and 100 sea turtles. No sharks, no wahoo, no rainbow runners, no yellowtail, and no triggerfish and dramatic reductions in all other species but dolphins.”
In other words… the only species that “dolphin safe” tuna is good for is dolphins! The bycatch rate for EVERY OTHER species is lower when fishing dolphin-associated tuna vs. floating object associated tuna! The reason for this is obvious: floating objects attract everything nearby, while dolphins following tuna don't attract any other species.

If you work out the math on this (and you don’t have to, because the Environmental Justice Foundation did), you find that 1 dolphin saved costs 382 mahi-mahi, 188 wahoo, 82 yellowtail and other large fish, 27 sharks, and almost 1,200 small fish.

By trying to help dolphins, groups like Greenpeace caused one of the worst marine ecological disasters of all time. Few other fisheries are as bad for groups like sharks and sea turtles as the purse seine fishery, and none are as large in scale.

There is more information in the post about how these types of fishing actually work.

Unless you have a great love for dolphins over other types of sea life, following dolphins to catch tuna is the preferable way to go. Of course, this also calls out the need for a replacement for floating object fishing. Maybe autonomous robotic submarines could find the tuna? Or high-powered satellites? Or GPS tagging? The other solution is to become a Sardinista and switch from eating tuna to smaller fish that can be caught with less bycatch.
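
Working the raw per-10,000-set numbers from the quotes gives a feel for the trade-off, though note that the EJF figures cited in the post are normalized differently (likely per ton of tuna landed), so the ratios don't match exactly:

```python
# Bycatch per 10,000 purse-seine sets, from the two quotes above
log_sets     = {"dolphins": 25,    "mahi mahi": 513_870,
                "sharks": 139_580, "wahoo": 118_660}
dolphin_sets = {"dolphins": 4_000, "mahi mahi": 100,
                "sharks": 0,       "wahoo": 0}

dolphins_saved = dolphin_sets["dolphins"] - log_sets["dolphins"]  # 3,975
for species in ("mahi mahi", "sharks", "wahoo"):
    extra = log_sets[species] - dolphin_sets[species]
    print(f"{species}: ~{extra / dolphins_saved:.0f} extra killed per dolphin saved")
```

Even on this crude per-set basis, every dolphin spared costs on the order of a hundred mahi mahi and dozens of sharks and wahoo.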

This blog's whole dolphins are actually jerks section is great reading as well.

Wearable Patch Will Count Calories Burned And Consumed

It could be a dieter's best friend or worst nightmare: technology that knows how much a person has just eaten, knows how many calories he has burned off, offers suggestions for improving resolve and success, and never lets him cheat. And it's all done by a small, stick-on monitor no bigger than a large Band-Aid.

The calorie monitor, which is being developed by biotech incubator PhiloMetron, uses a combination of sensors, electrodes, and accelerometers that--together with a unique algorithm--measure the number of calories eaten, the number of calories burned, and the net gain or loss over a 24-hour period. The patch sends this data via a Bluetooth wireless connection to a dieter's cell phone, where an application tracks the totals and provides support.

PhiloMetron won't yet reveal exactly what makes its patch tick, but the company says that it consists of a single chip surrounded by numerous sensors, electrodes, and accelerometers, embedded in a foam adhesive patch. The system, which is designed to be replaced once a week, measures a variety of things (temperature, heart rate, respiratory rate, skin conductivity, possibly even the amount of fluid in the body), then throws the data into an algorithm to calculate the number of calories consumed, the number burned, and the net yield. Caloric-intake measurements are accurate only to about 500 calories--about two Snickers candy bars. But PhiloMetron CEO Darrel Drinan says that it is much more accurate in determining net gain or loss and is most useful for measuring trends over the course of a week or a month. In fact, the system only provides users with rolling 24-hour totals and no instantaneous data.
Cool concept. I am curious how it determines the number of calories eaten. The accuracy of plus or minus 500 calories doesn't seem too good, but hopefully that can be improved in future releases.

Now that smart phones with internet access are becoming commonplace, the next big wave in mobile devices will be to interact with the body, or what I call "The Human APIs". This will be of benefit to those with chronic diseases such as diabetes but also to healthy people who are looking for ways to improve their health, fitness and concentration even more.

via Technology Review via FuturePundit

Read More...

IBM Invests in Lithium-Air Batteries

IBM Research is beginning an ambitious project that it hopes will lead to the commercialization of batteries that store 10 times as much energy as today's within the next five years. The company will partner with U.S. national labs to develop a promising but controversial technology that uses energy-dense but highly flammable lithium metal to react with oxygen in the air. The payoff, says the company, will be a lightweight, powerful, and rechargeable battery for the electrical grid and the electrification of transportation.

Lithium metal-air batteries can store a tremendous amount of energy--in theory, more than 5,000 watt-hours per kilogram. That's more than ten times as much as today's high-performance lithium-ion batteries, and more than another class of energy-storage devices: fuel cells. Instead of containing a second reactant inside the cell, these batteries react with oxygen in the air that's pulled in as needed, making them lightweight and compact.

"With all foreseeable developments, lithium-ion batteries are only going to get about two times better than they are today," he says. "To really make an impact on transportation and on the grid, you need higher energy density than that." One of the project's goals, says Narayan, is a lightweight 500-mile battery for a family car. The Chevy Volt can go 40 miles before using the gas tank, and Tesla Motors' Model S line can travel up to 300 miles without a recharge.
10 times the energy in the next 5 years sounds good to me. Best of luck to them.

In related news:
The Cleantech Group’s numbers show an uptick in venture-capital funding for batteries in the first quarter, even as overall US venture investments fell to the lowest level since 1997, according to the National Venture Capital Association.

Spurred by federal cash, electric cars, and demand for ever more powerful gadgets, investment in advanced batteries has bucked the recessionary slump and, energy analysts say, could help the economy recover.

Cleantech tracked $94 million in advanced-battery investments in the previous quarter, up substantially from a recession-affected $29 million in the last quarter of 2008 and up slightly from $90 million in the first quarter of that year.
The more battery research the better.

via Technology Review

Read More...

Global Warming Skeptics Responsible for Collapse of Economy

I just finished reading Green Hell, and the obvious conclusion is that global warming skeptics are responsible for the current collapse of the economy. Why, you ask?

First, the highest priority of global warming skeptics like Steven Milloy is to keep the economy strong. Their biggest concern is that greens will enact global warming and other environmental regulations that will "destroy the economy".

Aside: Steven Milloy not only missed what really destroys economies, he missed what destroys companies as well. In his book, he singles out Ford's CEO and the former CEO of Goldman Sachs as greens who harmed their shareholders with their beliefs. Yes, if only these CEOs had acted more like their competitors at Chrysler, GM, Bear Stearns and Lehman Brothers, their shareholders would be much better off. Oh, wait...

Second, they pride themselves on being able to find flaws in climate models.

Why then are they responsible for the current destruction of the economy? Because the same skills that can find flaws in climate models can also find flaws in financial models. Had they focused on this hockey stick graph of housing prices to household income,

rather than this hockey stick graph of rising temperatures,

they could have warned us about the housing bubble and stopped it from destroying the economy. Instead of focusing on real threats to the economy, they spent their time fighting against hypothetical environmental regulations that have never been passed, and in so doing they allowed our economy to crash.

Read More...