California Needs Texas for Cleantech Success

By Joel Serface – May 28, 2009


When I moved from Silicon Valley to Austin in 2006, many of my VC friends were left scratching their heads… Why would someone who has been leading the cleantech charge in California want to move to Texas?  After all, the conventional thinking in California was that there was no hope for Texas and that only the California way would lead to cleantech success.


I had many motivations, including helping Texas become a renewable energy state.  The rationale was this: if you want the greatest leverage in mitigating carbon emissions, start with the most carbon-polluting state in the most carbon-polluting country in the world (this was before China surpassed the US in carbon emissions).  If you make progress in Texas, then other states and countries would understand they could make the transition as well.  If you don’t show up, engage, and get the state (more importantly its people, investors, and industry) to buy in, then you cannot expedite progress or bridge the necessary gaps to accelerate the cleantech industry in Texas.


The fact is that Texas has always been a leader in energy, including renewables.  Much of the early work in solar happened in Texas at Texas Instruments under the leadership of Jack Kilby in the 1960s and 1970s (remember those early solar calculators?).  Like California, Texas had its share of early “successes”, but many of those disappeared in the 1980s as federal support for renewables waned.  It wasn’t until many leaders in Texas got together to push wind energy in the late 1990s that renewables appeared as a scalable and reasonable technology.  While California had invested in several generations of wind technologies, covering its valuable lands with poorly performing wind turbines, Texas didn’t develop a policy until around the same time 1.5 MW wind turbines became commercially viable.  With the combination of a good wind policy (first-come, first-served REC availability), competitive asset pricing, and low land lease rates, the wind industry in Texas took hold.


Since then, Texas has developed around 8 GW of wind energy with more than 15 GW planned.  To support this, Texas became a leader in transmission policy, developing Competitive Renewable Energy Zones (CREZ), which are now being copied in Western states and other parts of the country.  It has also led in building transmission to renewables, with 18.5 GW of new capacity approved for development to the strong wind and solar areas of the state.  Texas’s own transmission grid, run by the Electric Reliability Council of Texas (ERCOT), will also go live with the most advanced “nodal” market, allowing more entry points for renewables, storage, and ancillary services.  In short, Texas has had its own renewable successes even though they are not as sexy or as publicized as what has been done in California.


California’s strengths are well known and publicized.  There is no better-experienced region in the world at taking ideas from laboratories and technology entrepreneurs and turning them into products.  California has also historically been an energy policy innovator in clean air and energy efficiency, and more recently in policies for carbon (AB 32), transportation (AB 1493), fuels, and cleantech investment (Green Wave Initiative).  The scope of the technology and policy innovation in the state has allowed it to be a thought leader while seeing some of the early returns from its efforts.  California’s strengths come from its researchers, entrepreneurs, and investors, who all think they can change the world.  In short, there are no limits to what Californians think they can accomplish and therefore no limits to their scope of innovation.


Texas’ strength in energy runs deep in the veins of its people.  It starts with a “can-do,” wildcatting nature and extends to land development, project development, industrial scalability, and energy trading.  Texans have always taken energy risks and developed core competencies in scaling and optimizing massive processes for chemical and petroleum production.  They have also developed critical technologies for extracting energy and transporting it from its origin across vast areas to deliver it where it is needed.  Most of this experience is in extracting, refining, and converting hydrocarbons, but it can also be applied to all aspects of cleantech.  In short, Texas knows how to scale energy technologies, and once it is given a price or incentive it will become the leader in delivering new forms of energy.


If California represents scope and Texas represents scale, then we need both to transition cleantech ideas from lab to market at an ever-increasing pace.  So what needs to happen to achieve the scope of California and the scale of Texas? 


First, new interfaces need to be built.  If they are, we can accelerate both the early stage and the late stage to more broadly deploy renewables.  Both Texas and California need to dismiss their preconceived notions that their respective approach is best.  The nation needs policy and technology innovators as well as deployment and market innovators.  In the middle is the need for a new dialog and new interfaces, especially around how to tie ideas from California into projects in Texas.  There also has to be acknowledgement that California isn’t the only place ideas come from or can be built into companies.  It might actually be better to develop these technologies closer to the points of adoption, or at least to understand customer and integration needs from the outset.


Second, Texas needs to learn from California and develop policies that support more renewables and energy efficiency.  In the case of wind, the state waited to develop a policy until just before wind turbines were about to achieve price parity with traditional electrical generation.  We are on the precipice of this with solar and other technologies.  If Texas doesn’t adopt policies in this legislative session, it will be left on the “solar sidelines” while other states and countries continue to develop their solar industries and achieve economies of scale and geographic advantage.  This would lead Texas down the path of possibly importing solar panels rather than developing its own domestic solar industry.  If Texas indeed learns from other states and adopts policies more aggressively, then the scaled industries will take hold in Texas and grow faster.


Third, California needs to recognize the potential in developing projects in Texas.  Texas has created a favorable environment for the energy business and has been ahead of the curve in market transformation in order to do so.  This, coupled with its demonstrated success in delivering large energy projects, gives it a tremendous lead in deploying new energy technologies at massive scale.  In fact, many of its incentive approaches for wind, transmission, and transmission grid management for renewables should be examined at a national level.


Fourth, Texas cannot sit on the sidelines on carbon pricing.  It is in Texas’ best interest to have a predictable carbon target and therefore a predictable price.  This will mobilize many of the traditional energy companies and utilities to get off the sidelines and begin investing in the future energy industry and building their future business models (new financial, trading, and integration models are likely where Texas will succeed).


Finally, new investing models need to be attempted that combine early- and late-stage investing.  A great deal of attention needs to be paid to the “valley of death” between the development of new energy technologies and their delivery at large scale into integrated projects.  While federal loan guarantees and federal test and integration centers will be useful here, it will require experienced investors, developers, and corporations to step in, provide financing, and minimize risk, ultimately accelerating these implementations to market.  Texas could become the large-scale test bed for these implementations.

To make this all work, Texas needs to step forward in this legislative session to begin embracing solar energy and other forms of renewables as well as energy efficiency.  The state’s leadership also needs to announce its support for renewable energy and endorse the associated economic opportunities for the state.  If a pragmatic and immediate approach is developed in working together with industry and California (and other states), the results will be a healthy, high-growth new energy economy, more jobs, greater global competitiveness, and enhanced energy and economic security for the United States (and Texas and California).



Joel Serface served as NREL’s first Entrepreneur in Residence with Kleiner Perkins Caufield & Byers.  As an investor and entrepreneur, Joel has planted cleantech seeds in Massachusetts, California, Texas, and now Colorado.  Since 2000, Joel has started or invested into more than 20 cleantech companies with 5 liquidity events so far and has catalyzed the formation of numerous supporting cleantech institutions and regional and national policy initiatives.

 

Feed-In Tariff = Feeding at Trough?

by Richard T. Stuebi

One of the more popular policy prescriptions often made by ardent renewable energy advocates is the adoption of a “feed-in tariff” (FIT).

With a FIT, the government sets a price for electricity supplied by a qualifying renewable energy source, and the price is usually high enough to produce a good return for an investor who installs a renewable energy project. This, in turn, provides a substantial economic motivation for the growth of the renewable energy sector.

Supporters love the fact that a FIT policy provides a long-term, stable, predictable, and lucrative return on renewable energy investment. Naturally, this leads to booming markets for renewable energy where FITs are in place.

FITs are in wide use in many parts of the world – mainly in Europe, but increasingly in Canada as well. Correspondingly, these markets are experiencing exploding growth for renewables.

However, to date, traction has been slow to come for FITs in the U.S. because the policy mechanism is innately at odds with the prevailing philosophy of the American economy: to let market forces sort things out.

In the U.S., the renewable portfolio standard (RPS) has been the preferred policy mechanism to promote the penetration of renewable energy (along with the predictable potpourri of incentives and subsidies buried in the tax code). In an RPS, the government sets a target for a quantity of renewables to be adopted by a certain date – and then lets market forces dictate what mix of renewables will supply the requirement, as well as the price implications of that mix.

By contrast, a FIT explicitly puts the government in the position of price-setter, and picks technological winners by setting prices as a function of the renewable energy technology in question.

If the price of the FIT is set too high, unquestionably this pushes renewable energy adoption, but tramples competitive forces in doing so: bad (meaning, to me, highly-uneconomic) projects get done, and/or companies or investors make outrageous profits. On the other hand, if the price of the FIT is set too low, then the policy won’t have any impact at all: no incremental investment in the desired renewables will occur.

In other words, the government has to be able to set the price at exactly the right level to induce a lot of investment – but no higher, so as not to provide a free wealth grab, and no lower, so as not to discourage the market from happening at all. No government is smart enough to set the price of a FIT perfectly. So, in practice, FIT prices are very high – and the renewable energy interests profit immensely from it.
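To make that knife's edge concrete, here is a back-of-the-envelope sketch in Python. All figures are hypothetical and not drawn from any actual FIT program; it simply shows how a project's simple annual return swings with the tariff level.

```python
# Toy FIT sensitivity sketch (all figures hypothetical, for illustration only).
# A tariff just above a project's break-even level yields a modest return;
# a tariff well above it hands the owner outsized returns on the same hardware.

CAPEX_PER_KW = 5000.0        # hypothetical installed cost, $ per kW
CAPACITY_FACTOR = 0.20       # hypothetical solar capacity factor
ANNUAL_COST_FRACTION = 0.02  # O&M per year, as a fraction of capex

def simple_annual_return(fit_price_per_kwh):
    """Net annual revenue as a fraction of capex, per kW installed."""
    annual_kwh = 8760 * CAPACITY_FACTOR              # ~1,752 kWh per kW-year
    revenue = annual_kwh * fit_price_per_kwh
    cost = CAPEX_PER_KW * ANNUAL_COST_FRACTION
    return (revenue - cost) / CAPEX_PER_KW

for tariff in (0.20, 0.35, 0.50, 0.65):              # candidate tariffs, $/kWh
    print(f"FIT of ${tariff:.2f}/kWh -> simple return of {simple_annual_return(tariff):.1%} per year")
```

Under these made-up assumptions the return roughly quadruples as the tariff moves from $0.20 to $0.65 per kWh, which is the asymmetry the paragraph above describes.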

Although FIT policy has historically gone nowhere in the U.S., that may be changing, as FITs are starting to get more serious consideration. In early 2008, the California Public Utilities Commission adopted the first FIT in the U.S., to promote up to a maximum of 480 megawatts installed. Earlier this year, the city of Gainesville, Florida enacted a feed-in tariff for its municipal utility. Even in Michigan, not considered one of the leading states in pro-renewables policies, the Public Service Commission is considering a pilot feed-in tariff.

I am not sold on the FIT mechanism as good policy, because it is so heavy-handed and arbitrary. However, as the rest of the world adopts FIT policies, they extend their leadership over the U.S. – and the leadership is not just in market size, but also in technological advancement. If the U.S. doesn’t maintain technological leadership, then we’ve lost arguably our best asset. If a FIT policy is necessary to be leaders in renewable energy, then maybe it’s a necessary evil.

It wouldn’t be the first time I’d have had to swallow hard in lukewarmly supporting a policy that otherwise I find fundamentally challenging.

Some have argued that the aggregate economic subsidy associated with a national FIT policy is outweighed by the faster reduction in costs associated with renewable energy advancement promoted by the FIT, plus the avoided expenditures on fossil fuels displaced by the increased renewable energy production caused by the FIT. It’s an interesting argument, but counter-intuitive to me, and I’d like to see some quantitative support for this line of reasoning.

Richard T. Stuebi is the Fellow for Energy and Environmental Advancement at The Cleveland Foundation, and is also the Founder and President of NextWave Energy, Inc. Later in 2009, he will also become Managing Director of Early Stage Partners.

The Efficacy of Biofuels from Algae on Cleantech.org

I usually don’t do this, but a couple of days ago we had a post on Cleantech.org’s LinkedIn group around algal processes, feedstocks, and the recent DOE solicitation that engendered a lively discussion, in part taking off from the recent demise of GreenFuel Technologies.

While many of you know I am not personally a fan of algal fuels, I have posted it en masse, unedited, so enjoy, as the discussion ranges across a decent chunk of the issues facing algae processes and provides some food for thought.

Urgent – Algae Oil Production or Algae Methane Production Needed!
We are completing a DOE grant application design to meet our Notice of Intent by next Friday and need to find one or two companies with a process to make Algae Oil or Algae Methane, or either, for our process. Please email any information or contacts as our time line is running short for this grant. We believe we have lined up most all other pieces for this proposed biorefinery!

Posted 2 days ago

Walter Breidenstein
Professional Entrepreneur


Comments (24)
Poly Endrasik Jr.
Video/Web Conferencing & Teleworking Consultant
Hi Walter, Maybe you could pick up this technology for a song and take it somewhere:

http://www.greentechmedia.com/articles/read/greenfuel-technologies-closing-down-4670/

http://www.ecogeek.org/content/view/2747/70/ – both these are the same story!

Good Luck and God bless
Posted 2 days ago
Walter Breidenstein
Professional Entrepreneur
Hi Poly,

That is why we turn down all VC investments into our company. They are best left to Universities and University students who manage a lot of deals that once one folds they can jump to the next one without a lot of pain. Where I come from we don’t throw other people’s money at deals…unless those investors who came in early can support those who come in later. Most VC deals are so ugly after the first and second round that who would ever want to support a technology with those types of “investors”. Not me!

Walt.
Posted 2 days ago
Neil Farbstein
President of Vulvox Nanobiotechnology Corporation
Algae have several problems that make them untenable. Algal production systems use so much water that they will damage the environment, competing with municipal reservoirs and agricultural water, and they will drain rivers that support wildlife. CSP solar thermal uses a lot less water, and some designs use no water, to generate clean cheap electricity.
Posted 2 days ago
Leif Johnston
Technology Consultant and Serial Entrepreneur
Neil – I would suggest that is myopic. There are many alternatives and many end products. Saying that electricity is the only solution is impractical, since electricity does not give us any near-term solutions for the vast network of spark- and compression-ignition systems.

Walter asked for support on algae, stick to the topic. Walter, I dropped you a parallel note… As one of the areas I am working on is a non-proprietary solution to put algae farming in the hands of who better? Farmers. My part in the process is the development of a low cost photo bioreactor and trying to engage the agricultural extension service in the mix. If that is a help to you or others, please connect.

There are still realistic challenges like the best lipid extraction mechanism. Final protocols for maximizing lipid production are also in order. Some parallel gadgets to be built include the PBR, a low cost easy operation lipid fraction meter, an oil/lipid extraction gizmo, etc.

The more we share the more likely we are to win/win…

Leif
Posted 2 days ago
Walter Breidenstein
Professional Entrepreneur
Leif/Neil,

Our process produces water from the production of the methane. We could use that excess water for the algae systems if that would be helpful. We also produce near-pure CO2, and we understand this could also be helpful. At this stage we just want methane sources without the algae oil if feasible. My background is oil & gas, so I know methane, ethane and propane down the chain. Neither I nor my engineers are experts in the bio/algae world. We have lots of engineering firms contacting us to help us, but we really are just looking for designers who understand the algae space to complete this DOE grant. We have until next Friday for the Intent and our budget is around $25 million. We think we have a very strong chance to win this grant…but we need the CO2-algae-methane piece…or part of it to be proven. I know, contact Bill Gates and Sapphire Energy, but it appears DOE grants are not going to impact their last $100 million funding round! :)

Walt.
Posted 2 days ago
Leif Johnston
Technology Consultant and Serial Entrepreneur
Not sure I follow the direction of your need. I am not clear whether you are making methane, or consuming methane. I take it is making… I would assume you could decompose the algae to create a methane source, but like most methane sources, it wouldn’t be clear. Temp conversion/pyrolysis could be an option but certainly you know that.

Given that I am not tracking where you are heading, I am unlikely to be of help.

And I thought that while DOE expected to award some large grants, the easy high end was $5M and 24 months…

Leif
Posted 2 days ago
Walter Breidenstein
Professional Entrepreneur
Leif,

Sorry I was not clear. We need methane for our process to make methanol.

Walt.
Posted 2 days ago
Leif Johnston
Technology Consultant and Serial Entrepreneur
Yep I am of no help to you. I don’t have a good way for a clean source of methane. Lots out there, but not sure of metabolic pathway from the algae I work with.

Good luck.

Leif
Posted 2 days ago
Leif Johnston
Technology Consultant and Serial Entrepreneur
I assume you are doing this against ARPA-E – did you catch the updated amendment on that?

Leif
Posted 1 day ago
Lubo Morhac
Technology Management Consultant
Hi Walter,

I have several links for you to research relating to algae to fuel. I don’t think the following outfits have algae cultures that are capable of CH4 production, but fatty acids for sure:
This one is my favourite in terms of equipment:
http://www.algaelink.com/

also check these:
http://www.solixbiofuels.com/html/company.html
http://www.petroalgae.com/
http://www.greenfuelonline.com/
http://www.livefuels.com/

Landfill sites are an excellent source of CH4.
Some gasification systems may be of interest with Methanization back end.
but of course, best of luck with algae,

Lubo
Posted 1 day ago
Lubo Morhac
Technology Management Consultant
Walter, I re-read the thread and I think this may be of interest as an alternative for turning CO2 into energy:
http://www.uafsunstar.com/20090317/sandia-technology-turns-sunshine-petrol
http://www.carbonsciences.com/

L.
Posted 1 day ago
Walter Breidenstein
Professional Entrepreneur
Wow, thanks for the information guys. We need methane…that is what we need. We can work with Algae oil to make biodiesel since methanol is used in the biodiesel, but right now we want the most simple system. CO2-Algae-Methane-Methanol…we will recycle our water and CO2 nicely.

Poly, I spoke to my licensing friend at MIT, and the article you posted, “GreenFuel Technologies Closing Down”, was just searched and there is no reference to that project at MIT. He called Harvard for me and they have no mention of it, but they have heard of it. They believe it was something a student started on the roof, and MIT says that any student who develops anything at their University is the owner of the IP. Thus, the article says it is an MIT-Harvard algae project that crashed, but my friend said there is no record of the project he could find, nor at Harvard…so maybe the author was mistaken…

Yes, finding Algae to Methane is not so easy!
Posted 1 day ago
Leif Johnston
Technology Consultant and Serial Entrepreneur
The problem in part is that your target there is “swamp gas” from algae rather than the oil output. My challenge is that is a different species, in fact I have no clue what species that might be, vs the standard oil-rich species e.g. chlorella …
Posted 1 day ago
Walter Breidenstein
Professional Entrepreneur
Leif,

Here are the acceptable feedstocks from the grant…consider that we need methane:

“Using the definitions of “renewable biomass” as stated in the Energy Policy Act of 2005 (EPAct 2005), the Energy Independence and Security Act of 2007 (EISA 2007), and the Food, Conservation, and Energy Act of 2008, Title IX, Sec. 9001, as guidance, for the purpose of this FOA, the acceptable feedstocks will be those listed below:
(A) materials, pre-commercial thinnings, or invasive species from National Forest System land and public lands (as defined in section 103 of the Federal Land Policy and Management Act of 1976 (43 U.S.C. 1702)) that –
(i) are byproducts of preventive treatments that are removed –
(I) to reduce hazardous fuels;
(II) to reduce or contain disease or insect infestation; or
(III) to restore ecosystem health;
(ii) would not otherwise be used for higher-value products; and
(iii) are harvested in accordance with –
(I) applicable law and land management plans; and
(II) the requirements for
i. old-growth maintenance, restoration, and management direction of paragraphs (2), (3), and (4) of subsection (e) of section 102 of the Healthy Forests Restoration Act of 2003 (16 U.S.C. 6512); and
ii. large-tree retention of subsection (f) of that section; or
(B) organic matter that is available on a renewable or recurring basis from non-Federal land or land belonging to an Indian or Indian tribe that is held in trust by the United States or subject to a restriction against alienation imposed by the United States, including –
(i) renewable plant material, including –
(I) organic material grown for the purposes of being converted to energy; and
(II) algae; and
(ii) waste material, including –
(I) crop residue (including cobs, stover, bagasse and other residues);
(II) other vegetative waste material (including wood waste and wood residues);
(III) food waste and yard waste.

No plant based material that is generally intended for use as food can be employed as a feedstock except as noted below under “Additional Feedstocks Acceptable For Topic Areas 5 and 6.” Hence, sugars derived from sugarcane or beets and oils derived from soy, canola, sunflower, peanut, etc. normally recovered using conventional food processing methods will be excluded from eligibility for this FOA. The determining factor will be the typical use of the material in commerce. Use of excess oil production of food-grade oil also does not constitute an acceptable feedstock. Distillers Dried Grains with Soluble (DDGS) is also excluded. Additional information regarding the use of algae as a feedstock is included in Appendix J.

Municipal Solid Waste (MSW) is not an acceptable feedstock. However, biomass as defined in EPAct 2005 (Public Law 109-58) Section 932(a)(1-2) that is segregated from the MSW as a separate stream, could be employed as a feedstock with appropriate considerations for the costs of such segregation, collection, processing, and transportation. Hence, post-sorted MSW, where all recyclables and non-biomass components have been removed, would qualify, but only the remaining dry material that meets the above requirements would qualify as a feedstock for purposes of this FOA. Allowable costs include processing (such as, chipping or grinding) the feedstock into a form that can be fed into the reactor. Processing costs for MSW are restricted to post-sorted materials.”

That is not an easy list to find methane…except here:

“A new method for converting algae into natural gas for use in pipelines and power generation has been transferred to the marketplace under a license between Genifuel Corp. and Battelle. Genefuel is based in Salt Lake City, and has an exclusive license for the technology.”

http://www.genifuel.com/ – maybe this is the only one?
Posted 1 day ago
Karel Beelaerts van Blokland
Dutchmen Absolute Return F: 07-37% /08-100% /09- 5,4% – dutchmencapital.web-log.nl / kacobeelaerts@zonnet.nl
AlgaeLink N.V. is a Dutch company that designs and manufactures algae growing equipment. AlgaeLink is building a world-wide supply chain and network that is sustainable and delivers value to our global customers. Our operations cover algae production, equipment, consultancy, installation support and training.

Fuel Green energy, biodiesel, bio-ethanol, bio-gas, bio-oil, and jet fuel (JV with AirFrance-KLM).
www.algaeLink.com
Posted 1 day ago
Leif Johnston
Technology Consultant and Serial Entrepreneur
Walt – my point is in part to explain the tangential answers. Most of us (with all the negative broad-brush implications that implies) are focused on the extraction of the large lipid fraction from algae and are therefore focused on microalgae – commonly Chlorella and other variants of the small motile buggers, since lipid fractions can reach 50% in some claims. That oil then becomes the feedstock for a biodiesel process.

The algae you are after are just different. You are looking for a swamp/marsh algae (or pond scum), likely long-strand clumpy stuff most people try to kill. A source would be https://ccmp.bigelow.org/ which is a national repository for many such things.

My issue is I just haven’t focused on it. You might be able to find help and support in the reverse from your local agricultural extension agent.

I think I had misread the feedstocks grant to assume it precluded algae – not 100% sure which one you are pursuing.

Given the time and the inclination, you or I could come up with the right kind of algae and the people involved. You are looking for the swamp biology professor – not anyone talking about algae for biofuels. Not a bad thing, just a different thing.

You are welcome to call me if it would help – 540 847 5343.

Leif
Posted 1 day ago
Walter Breidenstein
Professional Entrepreneur
Leif,
I will see if I can get my engineer to call you as he is just now getting started on all these calculations. We know how much methane we need to produce methanol. We know how much methanol is needed to produce biodiesel. We know how much oil is needed to produce biodiesel. We will likely need 5-10 times more oil-algae than methane-algae to have a tight, packaged CO2-to-biodiesel system. We wonder if that amount exists already in stable systems (i.e. before they go in and kill off the “bad” algae)? Interesting dilemma…I’m sure the answer is out there at some of these Universities and DOE labs who get all the “fun money” to do the R&D.
Walt.
Posted 1 day ago
Leif Johnston
Technology Consultant and Serial Entrepreneur
That is part of the dilemma – there is much talk and speculation, but other than a hexane oil extraction standard, the only thing that is talked about is pyrolysis to derive a clean oil residue, and that is a piss awful waste of energy. Ultimately that is why I think some folks are tanking, because without extraction mechanisms, algae is a tough nut.

The one I am holding out for is algae “milking” to extract the oil while the algae is still alive. But I fear that may turn our processes from open to proprietary.

To be fair, from your earlier post, you can decompose algae, food, and other wastes that aren’t muni solid – so you should be able to leverage sewage or other feedstocks. I think those folks are really your targets, and the organisms in the Archaea group are the methanogens you seek…

Leif
Posted 1 day ago
Matt Sloustcher
Account Executive at Peppercom Strategic Communications
Walter,

Nobody has mentioned the heterotrophic “in the dark” method of algae oil production Solazyme employs. I suggest you review the following blog post, and check out Greentech Media’s analysis of the industry.

http://www.oilgae.com/blog/2009/02/advantages-of-heterotropic-algae-for.html
Posted 1 day ago
Christine Harmel
PR
I would suggest OriginOil http://www.originoil.com/
Posted 22 hours ago

Walter Breidenstein
Professional Entrepreneur
Has anyone studied the cost accuracy associated with this Algae-methane process? Everything boils down to CAPEX and OPEX in these models, and this looks interesting.

http://www.unh.edu/p2/biodiesel/pdf/algae_salton_sea.pdf
Posted 20 hours ago
Leif Johnston
Technology Consultant and Serial Entrepreneur
Big picture, you are still decomposing the algae as the methane creation process, with techniques not 100% clear to me, combined with complicating compounds in the decomposition gases, sulfur-containing mercaptans, etc. Which still leaves you with the need for a decomposition specialist…
Posted 17 hours ago
Frédéric Vogel
Research group leader at Paul Scherrer Institut
Dear Walter,

I know that I’m too late for your grant application. Nevertheless, you might be interested to know that we are developing a process similar to the one Genifuel has licensed from PNNL. The strong feature of our process is the recovery of all nutrients in a concentrated brine, besides the efficient production of methane. We have recently published a paper accessible to anyone:

http://www.rsc.org/Publishing/Journals/EE/article.asp?doi=b819874h

Feel free to connect if you think some further discussion might be of interest.

Frédéric
Posted 1 hour ago
Walter Breidenstein
Professional Entrepreneur
Frederic,
Thank you for the very interesting information. We have not reached any agreement with Genifuel yet, but I have had one brief discussion and a couple of email exchanges. I get the feeling they are at the top of their game and, from their website, have their own uses for methane. I’m not convinced, as I know the methane markets extremely well and not a day passes that I don’t hear of another methane technology that will be “easily converted to liquids”. I’ve traveled the world researching and studying methane conversion, and it just is not as easy as some would have you believe. Therefore, I would be most interested in your technology. We remain open and ready to do business with anyone that can integrate their value chain into ours. Further, the grant is not due until June 30 (so you are not too late) while the Notice of Intent is due by next Friday. We remain committed to finding some help in Algae to Methane technologies. We think we can add value to whatever is the methane source.
Walt.
Posted 42 minutes ago

Neal Dikeman is a partner at cleantech merchant bank Jane Capital Partners and Chairman of Carbonflow, Inc. and Cleantech.org.

New Cars that Already Meet 2016 Fuel Economy Standards

By John Addison. President Barack Obama announced that automakers must meet average U.S. fuel-economy standards of 35.5 miles per gallon by 2016. This will be an exciting opportunity for automakers that already deliver vehicles that beat 35.5 mpg, such as the Ford (F) Fusion Hybrid, Mercury Milan Hybrid, Toyota (TM) Prius, Honda (HMC) Insight, Honda Civic Hybrid, and the Mercedes Smart Fortwo. You can buy these gas misers today. A number of other vehicles offered in the U.S. now come close to the 2016 standard and will see mileage improvements next year.

In Europe, over 100 models can be purchased that meet the 2016 standards, thanks to the popularity of cars that are smaller, lighter weight, and often use efficient turbo diesel engines.

Over the next three years, dozens of exciting cars will be introduced in the United States. Here are some offerings that we are likely to see in the next one to three years from major auto makers.

Ford (F) will extend its current hybrid success with added models. During my recent test drive of several vehicles that meet the 2016 requirement, the midsized Ford Fusion Hybrid demonstrated that you can enjoy fuel economy in a larger car with comfort and safety. The Ford Fusion Hybrid has an EPA-certified rating of 41 mpg in the city and 36 mpg on the highway. The car can be driven at up to 47 mph in electric mode with no gasoline being consumed. Ford will start selling pure battery electric vehicles next year, which will improve its fleet mileage average.
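As a sanity check on the city and highway figures above, the sketch below applies the 55/45 city/highway harmonic weighting EPA uses for its combined rating; treat it as an illustrative calculation, not an official rating.

```python
# Combined-mpg estimate using EPA's 55% city / 45% highway harmonic weighting.
def combined_mpg(city_mpg, highway_mpg):
    """Harmonic combination of city and highway ratings (illustrative only)."""
    return 1.0 / (0.55 / city_mpg + 0.45 / highway_mpg)

# Ford Fusion Hybrid figures cited above: 41 mpg city, 36 mpg highway
print(f"Estimated combined rating: {combined_mpg(41, 36):.0f} mpg")  # roughly 39 mpg
```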

The best-mileage SUV on the market is the Ford Escape Hybrid at 32 mpg. In 2012, Ford will also offer a plug-in version of the Escape Hybrid that will blow away the 35.5 mpg standard. Bringing the popular Fiesta to the U.S. with a 1.6L gasoline engine will also attract budget-minded buyers looking for good mileage.

In discussing the new standards, Ford CEO Alan Mulally stated, “We are pleased President Obama is taking decisive and positive action as we work together toward one national standard for vehicle fuel economy and greenhouse gas emissions that will benefit the environment and the economy.”

General Motors (GM) plans to be the leader in plug-in hybrids starting with the Chevy Volt. It has a major opportunity to extend its E-Flex architecture to SUVs and trucks by 2016. For the price conscious buyer, the Chevy Spark hatchback with a 1.2L gasoline engine should deliver over 40 mpg.

There are almost 40,000 Chrysler GEM electric vehicles in use today. The GEM’s 25 mph top speed limits its popularity to fleets, university towns, and retirement communities. Chrysler will extend its early U.S. electric vehicle leadership in 2010 with new freeway-speed plug-in hybrids that can be driven 40 miles in electric mode before engaging the gasoline engine – the Jeep Wrangler, an SUV, and the Town and Country minivan. Over time, Chrysler can expand its ENVI family. Chrysler’s new stockholder Fiat will bring in exciting smaller cars and help expand the EV success.

Toyota (TM) will expand on the success of the Prius with more new hybrids. Since 2002, I have been driving a Prius that has averaged 41 mpg in real world driving that has included climbing hills with bikes on a roof rack and driving through snow with skis on the roof rack. The Prius will also be made available as a plug-in hybrid – hundreds of these PHEVs are now being tested by fleets. The modestly priced Yaris, which gets 32 mpg, is likely to also be offered as a hybrid that delivers over 40 mpg.

Honda (HMC) is likely to be the first maker to meet 2016 CAFE requirements, building on its historical leadership in fuel economy. My mother has easily achieved over 45 mpg with her Honda Civic Hybrid. Now Honda is going after the Toyota Prius with the Honda Insight. The popular Fit, which gets 31 mpg, is likely to also be offered as a hybrid delivering over 40 mpg. Look for more high-mileage offerings from both Honda and Toyota as they compete for hybrid leadership.

Nissan’s (NSANY) Altima Hybrid delivers an impressive 34 mpg. Beyond hybrids, Nissan is determined to be the leader in battery electric vehicles. Working with fleet consortiums and major electric utilities, next year Nissan may seed the market with thousands of freeway-speed electric vehicles. The Nissan EVs have ranges of at least 100 miles per charge (see the Clean Fleet Report EV test drive).

This article does not pretend to be a complete review of what is coming, rather a taste of what is here and what will soon be here from six major automakers. Given economic challenges, not all forecasts will happen. There will be surprises, more new models, and new model names. Not all plans will be executed as Chrysler deals with bankruptcy reorganization and as GM considers one.

Meeting the CAFE standards by 2016 will not be a slam dunk for all of the automakers, but they will make it. Historically, CAFE standards have not aligned with the EPA fuel economy determinations used in this article. For better and worse, flexfuel vehicles get artificially high numbers, making it easier for GM, Ford, and Chrysler to meet CAFE targets. Plug-in hybrid and EV ratings need to be finalized. To meet fleet average requirements, cars will need to average higher than 35.5 mpg; light trucks and SUVs, lower.
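One reason fleet averaging is tricky: CAFE fleet averages are computed as a sales-weighted harmonic mean of model fuel economy, so a few very efficient models lift the fleet number less than a simple average would suggest. A minimal sketch, with hypothetical models and sales volumes:

```python
# CAFE-style fleet average: a sales-weighted harmonic mean of model mpg.
# Model names and sales volumes below are hypothetical, for illustration only.

def fleet_average_mpg(fleet):
    """fleet: list of (sales, mpg) tuples -> harmonic-mean fleet mpg."""
    total_sales = sum(sales for sales, _ in fleet)
    return total_sales / sum(sales / mpg for sales, mpg in fleet)

hypothetical_fleet = [
    (300_000, 22.0),  # full-size truck
    (250_000, 28.0),  # midsize sedan
    (100_000, 41.0),  # hybrid sedan
]
print(f"Fleet average: {fleet_average_mpg(hypothetical_fleet):.1f} mpg")  # ~26 mpg
```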

Trends to more efficient drive systems are a certainty. With oil prices now close to double the recent lows of earlier this year, these new vehicles bring important relief to every driver who wants to save at the pump.

John Addison publishes the Clean Fleet Report and details the future of transportation in his new book Save Gas, Save the Planet.

If Larry King Wrote My Column….

by Richard T. Stuebi

You heard it here first: the energy consultancy Douglas-Westwood is claiming in a May 11 white paper that “peak oil” may have already happened, as far back as October 2004, and that the oil price boom followed by economic collapse is indicative of how things will play out over the decades to come as oil supplies are unable to expand in the face of increasing demands. Stay tuned….

The American Wind Energy Association (AWEA) exposition WINDPOWER 2009 attracted 23,000 attendees to Chicago earlier this month. Glad AWEA didn’t ask me to do the headcount!….

Your stock portfolio isn’t the only thing that’s plummeted. According to a snippet in the March 2009 issue of Power, so too have PV prices fallen, by an estimated 10% since last October, with a further 15-20% decline expected in the coming year. Seems that, after several years of tight supplies, there’s now a glut in the market, due to collapsing demand in Europe….

Lots happening in DC these days. Looks like cap-and-trade requirements for carbon dioxide emissions are making real progress, embodied in the grandiosely titled “The American Clean Energy and Security Act” (H.R. 2454) — better known as the Waxman-Markey bill. Cap-and-trade might even pass the House sometime this summer. Don’t think it’s going to be so easy in the Senate, though….

The U.S. Department of Energy (DOE) has created ARPA-E, to fund the initial evaluation of new whiz-bang ideas for energy, just like DARPA’s been doing for out-of-the-box defense gizmos for decades. One can only imagine what’s going to come out of that shop in years to come….

It also appears that the e-DII concept floated by Brookings earlier this year, to create Clean Energy Innovation Centers mainly affiliated with universities, is gaining traction, now having been tucked into the Waxman-Markey bill. Wonder what the national research labs, such as NREL, NETL, ORNL, LBNL and other alphabet soupers, think of this?….

Speaking of NREL, hats off to Joel Serface, who just completed a year’s residence there on behalf of uber-VC firm Kleiner Perkins to help accelerate technology commercialization and spin-outs from the lab. A year in Golden/Boulder is hardly hardship duty, but as Joel indicates in a recent post at this very CleanTechBlog site, it wasn’t enough time to make much of a dent in the bureaucracy….

Congratulations to my former colleague Cathy Zoi, who’s been tabbed by President Obama to lead the Office of Energy Efficiency and Renewable Energy at DOE. Wish her good luck: she’ll need it!….

Let’s hear it for Joseph Romm, now a Senior Fellow at the Center for American Progress. He calls ‘em like he sees ‘em. In a note in the May/June Technology Review, Romm claims “it’s not possible to have a sustained economic recovery that isn’t green” and calls our economic system a “global Ponzi scheme: investors (i.e., current generations) are paying themselves (i.e., you and me) by taking from future generations.” Whew!….

The U.S. Chamber of Commerce just released a study performed by Charles River Associates estimating 3 million jobs to be lost in the U.S. by 2030 as a result of climate change legislation. Last year, the Chamber commissioned a similar study announcing a similar doom-and-gloom result. I’m not saying there won’t be job losses as a result of cap-and-trade – there certainly will – but I don’t think it’s going to be apocalyptic either….

Gotta hand it to Bob Galvin, former Chairman of Motorola. Not content to be retired, he has launched the Galvin Electricity Initiative to promote a “Perfect Power System” to help prevent future blackouts. In a sense, he’s trying to Galvinize the grid….

Last Wednesday evening, the Cleveland Chapter of the American Jewish Committee honored The Cleveland Foundation for its advanced energy initiative. Accepting the award on behalf of the Foundation was President and CEO Ronn Richard. A good time was had by all….

Richard T. Stuebi is the Fellow for Energy and Environmental Advancement at The Cleveland Foundation, and is also the Founder and President of NextWave Energy, Inc. Later in 2009, he will also become Managing Director of Early Stage Partners.

Biofuel Industry Hopes to Recover with Next Generation Fuels

By John Addison. Scientists know how to make fuel from prairie grasses growing on marginal land. They know how to make fuel from fast growing trees with root systems that extend 25 feet into the ground, sequestering carbon emissions and enriching the soil. They even know how to make fuel from algae. They do all this in their labs every day. The problem is making cellulosic and algal fuel in large quantities at costs that compete with fuels from petroleum such as gasoline, diesel, and jet fuel.

This is my second article (previous article) from the 31st Symposium on Biotechnology for Fuels and Chemicals sponsored by NREL. 800 global bioscientists gathered in San Francisco to share their research and showcase their progress.

Their progress with biofuels from cellulosic sources is important. Some corn ethanol plants have closed. Once promising corporations, such as VeraSun, are now bankrupt. Lifecycle greenhouse gas emissions for fuel-from-food are being scrutinized. Industry would benefit from biomass that can be grown at much higher yields per acre than corn. Industries such as agriculture, wood, and paper would benefit from making money from waste and from having added revenue sources.

At the conference, Verenium (VRNM) shared its progress. In Jennings, Louisiana, it is producing 1.4 million gallons per year of cellulosic ethanol. The fuel can be blended at up to 10 percent with our current gasoline, saving us from needing almost 1.4 million gallons of foreign oil each year. Some might be delivered as E85. Instead of using corn, which requires high inputs of energy, nitrogen fertilizer, and water to produce, Verenium is using a crop that produces eight times the energy required to process it – energy cane, a hybrid of sugar cane optimized as a fuel source, not a food source.

Sugarcane and energy cane are part of Brazil’s energy independence, being the source of over 40 percent of their fuel. Now energy cane is being grown in some of the more tropical places in the United States. At a time when project financing is difficult, major partners are critical to financing larger commercial plants. In a joint-venture with BP, Verenium plans to build a 36 million gallon per year plant in Florida.

Dr. Stuart Thomas with DuPont Danisco Cellulosic Ethanol (DD, DNSCY.PK) outlined its plans to bring a 20 million gallon per year plant online in 2012. The company is evaluating non-food feedstocks with much higher yields per acre than corn, including switchgrass and sorghum. DuPont Danisco anticipates reaching parity with $60 to $100/barrel oil by 2015. The pilot plant will be in Tennessee, which is providing $70 million in funding for ethanol from switchgrass.

The long-term potential for biofuels may not be in ethanol, but in renewable gasoline, biodiesel, bio-jet fuel, and biocrude. All contain more energy than ethanol, which only delivers 84,000 BTU/gallon. Gasoline delivers 114,000; biodiesel 120,000.
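Using the energy contents cited above, the quick calculation below shows why a gallon of ethanol is not a one-for-one replacement for a gallon of gasoline.

```python
# Energy-equivalence check using the BTU-per-gallon figures cited above.
BTU_PER_GALLON = {"ethanol": 84_000, "gasoline": 114_000, "biodiesel": 120_000}

ratio = BTU_PER_GALLON["gasoline"] / BTU_PER_GALLON["ethanol"]
print(f"About {ratio:.2f} gallons of ethanol carry the energy of 1 gallon of gasoline")  # ~1.36
```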

Dr. Steve del Cardayre, chief scientist at LS9, presented plans to produce industry-standard biodiesel from energy cane with better microbes and fewer process steps. The plant should be able to compete with oil at today’s prices by also producing other valuable outputs, such as chemicals which can be used to make detergents. Synthetic biology competitor Amyris is moving even faster in building process plants to convert energy cane into renewable hydrocarbons and bio-jet fuel.

Indeed, creating multiple products from a process plant is likely to be critical to having a profitable industry. Oil refining is profitable because fractional distillation creates many valuable products at one refinery:

· Naphtha which can be processed into chemicals and plastics
· Gasoline
· Jet fuel
· Diesel
· Heavy oils which can be processed into lubricants and asphalt

Gevo will build plants with mass efficiency of over 40 percent that can produce multiple products including:
· Bio-jet fuel
· Bio-diesel
· Isobutanol for other products

Gevo sees opportunities to buy existing mothballed ethanol plants and retrofit them for $30 million per plant, a fraction of the cost of building a cellulosic plant from scratch. Gevo’s yeast fermentation process produces heat and steam, which would be valuable if co-located with industrial processes that benefit from combined heat and power.

By converting wood waste to next generation fuel, Mascoma has a significant potential to co-locate with existing paper mills and wood processing operations. The same is true for Range Fuels.

Enerkem is being paid to convert municipal solid waste into fuel as it targets 2011 to bring online a 9.6 million gallon per year plant in Edmonton, Canada, and a 20 million gallon per year plant in Pontotoc, Mississippi.

Beyond the cellulosic sources for fuel covered in this article is the potential for fuel from algae. A future article will examine the near-term challenges and long-term potential of algal fuel.

While this Symposium took place in California, Greenpeace protesters in Copenhagen stopped all buses because they use biofuel from food sources. In the future, they may welcome biofuel from wood and waste sources as an alternative to gasoline from tar sands and jet fuel from coal.

This December, the leaders of the world will gather in Copenhagen, Denmark, to develop a framework for a more promising sustainable future. In Denmark they will be able to visit a new cellulosic ethanol plant developed by Inbicon. The feedstock will be an agricultural waste product – wheat straw. The plant will process 24 metric tons per day of wheat straw, ten times more than the demonstration plant that Inbicon built only a few years ago. The plant will be more efficient and come closer to competing with refined oil because the operation will have three products creating three revenue streams:

1. 5.4 million liters of ethanol per year
2. 8,250 metric tons of biofuel, which will displace some coal used by a power plant
3. 11,250 metric tons of molasses, which will be used to feed cattle

Can such operations displace all our need for petroleum? No, but in five years we will see commercial-scale next generation biofuel operations. If oil is selling for $100 per barrel, then cellulosic biofuels may lower our cost of fuel. In ten years, all such operations could displace 20 percent of our petroleum use and represent an important step towards energy independence.

Cellulosic ethanol is not the only sustainable solution that world leaders will see in Copenhagen. They will see at least 40 percent of the population commuting on bicycles, demonstrating an immediate and very cost-effective way to reduce our need for oil. Many delegates will ride on electric light-rail from the airport and notice the wind farms that deliver the electricity. Some will ride in electric cars that further demonstrate transportation that uses renewable energy.

Next generation biofuels promise to be part of a portfolio of solutions to our current climate and energy problems.

John Addison publishes the Clean Fleet Report and speaks at conferences. He is the author of the new book – Save Gas, Save the Planet – now selling at Amazon and other booksellers.

My Year as NREL’s Entrepreneur in Residence

by Joel Serface

I just spent an amazing year at the National Renewable Energy Laboratory (NREL), but have no start-ups to show for it (yet).

A year ago, I was asked by Kleiner Perkins to be the first Entrepreneur in Residence (EIR) at NREL. As a person who has been into energy and environmental technologies since grade school and as an early cleantech investor, it was the opportunity of a lifetime to become the first NREL EIR. It was a fantastic time spent with some of the best cleantech researchers in the world. I felt like a kid in a candy store. I added tremendously to my depth and breadth of cleantech history and knowledge.

The program itself was a grand experiment that I commend the Department of Energy for attempting. DOE’s calculus was that if it inserted a serial entrepreneur/investor backed by a brand-name VC firm into a lab, magic would happen and an innovation would turn immediately into a company. At worst, DOE would learn a lot about what it and its labs need to do better in order to accelerate ideas to market.

In the 11 months that I had the privilege to work inside NREL, I met with more than 300 researchers, identified around 30 promising technologies that I thought could reach commercial potential over the next several years, and honed in on 3 technologies that showed imminent promise. Unfortunately, the EIR program was too short to reach its full potential and to get the first of these ideas set up as a company.


When building a new program into a research institution, timing is critically important. Based on my experience running the Austin Clean Energy Incubator at The University of Texas, it took almost 11 months to start my first company. In 18 months, I had helped start 5 companies. In total, these companies raised more than $200 million, but none surpassed KP’s investment hurdle.

When I agreed to become NREL’s EIR, I set the expectation with DOE, NREL, and KP that starting a company that KP would back within one year should not be expected. While there are a tremendous number of opportunities for commercialization at NREL, they need to temporally match a VC firm’s thesis, meet its perceived portfolio needs, or surpass its hurdle for innovation. Given enough time, many of the 30 technologies described above could be built into companies, but not necessarily into ones KP would fund over the period of the EIR Program.


A more reasonable expectation for all was to use this program to begin developing long-term relationships with VCs and start-ups that helped the lab and DOE develop better tools and processes. If successful, this could help NREL deliver more companies or successful collaborations for the entire industry. With this approach in mind, there were many things learned by all parties that could benefit the entire venture capital and start-up industry. Here is what I learned…


First, NREL truly is “The National Renewable Energy Lab”. There is more breadth and depth of renewable energy and energy efficiency knowledge at NREL than any other institution on the planet. This alone is worth the price of admission. Unfortunately, the admission price has never been posted and there have only been secret alley entrances with secured doors to gain access to the lab. The lesson here is that new interfaces need to be developed by the lab to better expose its collective knowledge and translate it to the marketplace more effectively (thus EIR and other programs).

Second, the value in NREL is not just in its innovation, but more importantly in the value it can deliver across the life cycle of a technology…

  • Innovation – Yes, NREL has a great pool of researchers and ideas. They also have a network of other labs and universities they collaborate with (MIT, Stanford, University of Colorado, etc.). They will also soon be the hub of all DOE renewable energy intellectual property by managing DOE’s IP Portal.
  • Acceleration – NREL’s experience allows them to solve critical issues for external technologies and companies. Success stories abound from NREL helping First Solar, Uni-Solar, Clipper Wind, and many others. Identifying new ways to open up NREL to solve critical issues in start-ups is critical to the VC industry.
  • Analysis – NREL has a large division that does market, techno-economic, scaling, integration, policy, and plant design analysis. This primarily is developed for DOE and Congress (which really does not take advantage of this tremendous asset), but needs to be exposed to the financial services and venture capital sectors. I would encourage any thesis-driven VC firm or investment bank to review the work that has already been delivered by NREL.
  • Testing / Validation – NREL provides the service of testing all flavors of renewable energy, storage, transportation, building, and energy efficiency technologies. They even integrate multiple technologies as systems and perform accelerated testing. NREL’s validation not only helps get products designed into projects, it also provides critical feedback for future development.
  • Deployment – NREL has a cities and states program that helps advise on local policies, design parameters, and integrated solutions. NREL will increasingly be involved in regional test and implementation centers that will help scale technologies into cities and integrated pilot facilities.


Finally, NREL will only get better; now is the time to begin forging long-term relationships with them. With additional funding, increased DOE support, stronger linkage to national priorities, and new management focused on commercialization and market needs, NREL will deliver increasing value to the cleantech community. By becoming more intertwined with our imminent national priorities and community needs, the lab will increase its “NRELevance” in our nation’s day-to-day existence.


So, what’s next for the NREL EIR? Over the short run, I will help deliver a national energy efficiency initiative focused on schools with the help of NREL. I will also continue supporting NREL as an entrepreneur/investor and as an advocate of the lab’s potential. I will also continue nurturing the many wonderful relationships I began forging through this program. And, yes, there will be start-ups forthcoming, unfortunately not within the short period of the EIR Program.


Thanks again to DOE, NREL, and KP for inviting me into this unique and invaluable experience. I hope that my time at NREL has made a difference there. If NREL is successful with its new management team and tools, then the entire cleantech community and nation will benefit.

Joel Serface served as NREL’s first Entrepreneur in Residence with Kleiner Perkins Caufield & Byers. As an investor and entrepreneur, Joel has planted cleantech seeds in Massachusetts, California, Texas, and now Colorado. Since 2000, Joel has started or invested into more than 20 cleantech companies with 5 liquidity events so far and has catalyzed the formation of numerous supporting cleantech institutions and regional and national policy initiatives.

Blogroll Review: Corny Carpet, Cocoa Car, and Carbon Consolidation

Pretty much everything you eat these days contains corn, whether in the form of corn syrup, sauces, starch, or other food additives. Pretty soon, we will also get upholstery made from this plant. Already being used for biofuels, corn is also a chemical feedstock.

Joel Makower shared this story from his attendance at a gathering of investors and entrepreneurs in cleantech:

For example, there’s a carpeting fiber made from corn instead of petro-based nylon that requires nearly a third less energy and emits nearly two-thirds fewer greenhouse gases. It is being manufactured at a repurposed polyester factory.

This is just one example of many where businesses see an opportunity to build sustainability goals into their plans.

Imagine eating your furniture once it’s ready to be disposed of! :)

And speaking of food, Megan Treacy at EcoGeek reports on a racecar that runs on the waste products of chocolate manufacturing. Even more remarkable is that the steering wheel, seat, and car body are made from plant fibers including carrots, flax, soy, and other vegetables.

In other news…

* Greentech Media says a shopping spree has begun for carbon accounting software.

* Karla says that the Waxman bill is flawed.

* At VentureBeat, Matt says funding is falling except for energy storage.

* Maria has some cool pictures from the American Wind Energy Association meeting. Check out the small wind turbines!

Thank Goodness for Contrarians

by Richard T. Stuebi

One of my favorite bumper-stickers of all-time reads “My Karma Ran Over Your Dogma”.

In addition to being a wonderful word-play, the one-liner reflects my deep disdain for those who are far too certain of their positions — whatever their positions may be. I haven’t done any statistical analysis, but I often find that the strength of people’s opinions is inversely correlated with their knowledge of the subject.

So, it’s actually a service to be reminded by intelligent people offering alternative views with substantial supporting evidence that what we think we really know may not actually be truth.

In the energy realm, I’ve encountered a number of articles by or about very accomplished and expert individuals who don’t subscribe to conventional wisdom.

For instance, in late March, the Sunday New York Times Magazine ran a provocative article called “The Civil Heretic”, profiling the Princeton mathematician Freeman Dyson, who has been the subject of significant and hostile criticism for suggesting (as has Bjorn Lomborg, author of The Skeptical Environmentalist) that too much concern is being paid to the phenomenon of climate change.

On the oil front, Ruchir Sharma, the head of emerging market research at Morgan Stanley (NYSE: MS) wrote an article in the April 20 Newsweek entitled “If It’s In the Ground, It Can Only Go Down”. Sharma doesn’t buy the peak oil theory, and argues that the long-term trend of declining oil prices will re-emerge.

Even if you disagree with their positions, you can’t say that they are stupid people. There are grains of truth in their arguments that we are all well-served to recognize and embrace.

As stated so beautifully in The Tree of Knowledge by Humberto Maturana and Francisco Varela:

“The knowledge of knowledge compels. It compels us to adopt an attitude of permanent vigilance against the temptation of certainty. It compels us to recognize that certainty is not a proof of truth. It compels us to realize that the world everyone sees is not the world but a world.”

We must be honest with ourselves in admitting that the future is not knowable with certainty in advance, and that all projections can at best be only grounded speculations. Being confronted by obviously smart and wise people who hold different views than ours about the future is a good exercise in humility for all of us. If we respond thoughtfully to considered alternative views, we are driven to re-examine our own thinking and logic, and strengthen or alter it accordingly.

Richard T. Stuebi is the Fellow for Energy and Environmental Advancement at The Cleveland Foundation, and is also the Founder and President of NextWave Energy, Inc. Later in 2009, he will also become Managing Director at Early Stage Partners.

Biofuel Industry – No Money, No Respect

For the moment, the price at the pump is reasonable. A spike in demand or a terrorist disruption, however, will quickly remind us that we are desperately dependent on oil as we continue to consume 140 billion gallons of gasoline per year. Even in these recessionary times of moderate demand, we are running out of easy-to-extract oil from desert sands. We are turning to sources of unconventional oil, such as tar sands in Canada, to produce oil with ever-increasing greenhouse gas emissions.

For a while corn ethanol looked like a promising way to end our addiction to oil. Now we are like the character in a Woody Allen comedy who explains, “I used to be a heroin addict; now I’m a methadone addict.” At a time when a billion people go hungry, many as a result of disappearing water on this heating planet, fuel from food is not the answer.

Needed is fuel from wood and waste, not food and haste. Some of the world’s best minds are focused on fuel from cellulosic and waste sources, in some cases from biological sources that remove CO2 from the air and enrich depleted soil. I am writing this article from the 31st Symposium on Biotechnology for Fuels and Chemicals sponsored by NREL. 800 global bioscientists have gathered in San Francisco to share their research and showcase their progress.

Many at the conference expressed concern and discouragement. Companies that were once darlings of Wall Street have gone bankrupt. Dozens of ethanol plants have closed as oil prices dropped. Many promising second generation plants cannot get built due to lack of project financing. People with the money see the risk as too high.

There continue to be zero commercial scale (20-million gallon per year and bigger) cellulosic ethanol plants, despite past glowing press releases that declared that they would now be running.

The biofuels industry is also under attack due to fuel-from-food and land use issues. Over one billion people are hungry or starving. Agricultural expert Lester Brown reports, “The grain required to fill an SUV’s 25-gallon tank with ethanol just once will feed one person for a whole year.” Scientific American: Could Food Shortages Bring Down Civilization?

Europe, California, and soon many other U.S. states now insist that land use must be considered in evaluating biofuels.

During the middle of the conference, a workshop for the media was held. The theme of the workshop quickly became clear – the industry’s problems were the fault of regulators and of us, the press.

Professor Bruce Dale, Michigan State University, dismissed corn/soy land use change as an “emotional issue.” He continued, “The California Low Carbon Fuel Standard is intellectually bankrupt.” To demonstrate the flaw of land use, he stated that replacing a gasoline powered vehicle with an electric vehicle would only increase the demand for coal power and therefore do nothing to reduce greenhouse gases.

The example is quite flawed. Automakers consistently tell me that their gasoline powered vehicles are about 15 percent efficient and their electric vehicles are 60 to 70 percent efficient. EVs need much less energy. Even if you could find an EV powered purely with coal, it would produce lower lifecycle emissions than a comparable gasoline or corn ethanol fueled vehicle. According to the latest figures published by the U.S. Energy Information Administration (EIA), non-hydro renewable sources of electricity enjoyed double-digit growth during the past year while coal was down by 1.1 percent. Incremental demand for electricity is bringing more renewable energy on-line.

In fact, the California Low Carbon Fuel Standard (LCFS) is based on the peer-reviewed work of scientists using Argonne National Labs GREET model. The work, industry comments, and findings are all available at http://www.arb.ca.gov/fuels/lcfs/lcfs.htm

The LCFS encourages the reduction of greenhouse gas emissions per unit of energy delivered to the wheels of vehicles. The scientific analysis behind the LCFS includes these examples of grams of CO2e emissions per megajoule of energy:

  • Gasoline (refined from oil): 92
  • Diesel (ULSD, refined): 71
  • Diesel (coal-to-liquid): 167
  • Biodiesel (Midwest soy): 30
  • Ethanol (corn, with coal electricity): 114
  • Ethanol (cellulosic, from poplar trees): -12
  • Electricity (California average): 27

If the biofuels industry sees a future in biodiesel and cellulosic ethanol, the industry should be encouraged by the findings of the scientists contributing to the LCFS. On the other hand, if the industry is only betting its future on corn ethanol, then the regulation is a threat.
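To make the comparison with electric drive concrete, here is a back-of-the-envelope sketch of mine (not CARB’s) that combines the LCFS fuel figures above with the drivetrain efficiencies automakers quote – roughly 15 percent for gasoline vehicles and 60 to 70 percent wall-to-wheels for EVs. The 65 percent midpoint and the flexfuel engine efficiency are my assumptions; the numbers are purely illustrative.

    # Carbon intensity per MJ actually delivered to the wheels:
    # LCFS gCO2e per MJ of fuel, divided by an assumed drivetrain efficiency.
    fuels = {
        "Gasoline (refined oil)":     (92, 0.15),   # gCO2e/MJ fuel, drivetrain efficiency
        "Ethanol (corn, coal power)": (114, 0.15),  # assumption: same engine efficiency as gasoline
        "Electricity (CA average)":   (27, 0.65),   # EV, assumed 65% wall-to-wheels midpoint
    }

    for name, (g_per_mj_fuel, efficiency) in fuels.items():
        print(f"{name}: {g_per_mj_fuel / efficiency:.0f} gCO2e per MJ at the wheels")

    # Roughly: gasoline ~613, corn ethanol on coal power ~760, CA-grid EV ~42.
    # Even a grid mix several times dirtier than California's leaves the EV well ahead.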

LCFS will not help the expansion of E85 stations for flexfuel vehicles. For the 2009 model year, the best rated car running on E85 in the United States was the Chevrolet HHR, with a United States EPA gasoline mileage rating of 26 miles per gallon, and an E85 rating of only 19 miles per gallon – and that’s the best from Detroit with mileage on all other U.S. flexfuel vehicles being worse. In other words, if you passed on using E85 and drove a hybrid with good mileage, you would double miles per gallon and produce far less greenhouse gas emissions than any U.S. flexfuel offering. Top 10 Low Carbon Footprint Four-Door Sedans for 2009

While the press was being scolded and air regulators were being metaphorically burned at the stake, most conference attendees had an afternoon to enjoy San Francisco. Many traveled using electric-powered buses and the hydro-powered BART rapid transit system that carries 100 million riders annually. So much for the press conference dismissing electric-powered transportation as not being feasible.

Although attacking regulators, environmentalists, and advocates for the hungry will not save the biofuel industry, the federal government may. As the conference unfolded in California, U.S. Secretary of Energy Steven Chu announced in Washington, DC, that $786.5 million would be made available to accelerate advanced biofuels research and to help fund commercial-scale biorefinery demonstration projects.

One irony for the biofuel industry is that as oil prices increase, its economic model improves, but consumer demand for fuel moderates as consumers drive fewer miles, use more public transportation, and soon switch in growing numbers to electric vehicles. For decades, however, fuel will be in demand for many passenger vehicles, heavy vehicles, long-distance goods movement, ships, and airplanes. The opportunity is ripe for delivering fuel with lower lifecycle emissions. Promising cellulosic biofuel companies will be covered in my next article.

John Addison publishes the Clean Fleet Report. He is the author of a new book about the future of transportation – Save Gas, Save the Planet.

Director of Congressional Budget Office on Cap and Trade

Congressional Budget Office Director Douglas Elmendorf wrote on his blog late last week about his Senate testimony on cap and trade revenue redistribution. It is worth a quick read; the main text is below, and the full 28-page testimony is linked in his note. It’s also worth noting that the homepage of the CBO has a climate temperature chart front and center this week.

“Testimony: The Distribution of Revenues from a Cap-and-Trade Program for Carbon Dioxide Emissions

I testified this morning before the Senate Finance Committee on the distribution of revenues from a cap-and-trade program for carbon dioxide emissions. My comments emphasized these points:

A cap-and-trade program would lead to higher prices for energy and energy-intensive goods, which would provide incentives for households and businesses to use less energy and to develop energy sources that emit less carbon dioxide. Higher relative prices for energy would also shift income among households at different points in the income distribution and across industries and regions of the country. Policymakers could counteract those income shifts by using the revenues from selling emission allowances to compensate certain households or businesses, or by giving allowances away.

In distributing the value of the allowances, policymakers have a wide range of options but face trade-offs. For example:

  • If allowances were auctioned, some of the revenue could be used to fund climate-related research and development. This approach might reduce the cost of transitioning from a high carbon emissions economy, but it would not provide any immediate help to affected industries or households.
  • Instead, auction revenue could be used to reduce existing taxes on capital or labor. This could lessen the overall economic cost of restricting emissions but would do little to offset the burden that higher prices would impose on certain industries or households.
  • A different approach is to use the revenue to give rebates to low-income households, perhaps using the tax system. This would lessen the burden on these households but not trim economy-wide costs.
  • Alternatively, allowances could be given away for free to certain industries. Giving away allowances is generally equivalent to auctioning the allowances and giving the proceeds to the same firms. Giving allowances to energy-intensive manufacturers would not, by itself, hold down the price of their output, which would rise to reflect the private market value of the allowances. The result could be windfall profits for these firms, which would tend to benefit higher-income households who own most stocks. However, if receipt of free allowances was tied to future production or employment, then prices would not rise as much as otherwise. At the same time, because these firms would not reduce emissions as much as they would have without free allowances, other sectors of the economy would have to reduce emissions by a larger amount to meet the overall cap.”

Neal Dikeman is a partner at cleantech merchant bank Jane Capital Partners and Chairman of Carbonflow, Inc. and Cleantech.org.

REDD – The Basis of a “Carbon Federal Reserve”?

Avoiding tropical deforestation – or REDD (Reducing Emissions from Deforestation and Degradation) in the parlance of the emerging policy dynamic – is the most mind-twistingly complex endeavor in the carbon game. REDD involves scientific uncertainties, technical challenges, heterogeneous non-contiguous asset classes, multi-decade performance guarantees, local land tenure issues, brutal potential for gaming, and the fact that getting it wrong means scam artists will get unimaginably rich while emissions don’t change a bit. You can understand why, back in 1997 in Kyoto, everybody threw their hands up and decided this was just too hard to try.

But the unfortunate failure to ascribe any economic value to living carbon storage means that deforestation – mainly tropical – still accounts for 20% of the world’s emissions annually, about the same as either the US or China. In other words, since Kyoto, tropical deforestation has contributed the equivalent of roughly 2.5 years of total global emissions. That’s a tragedy of unspeakable dimension – and right now it seems the only thing that will slow it is when we actually run out of trees to cut down. Which is apparently not out of the question.
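A quick back-of-the-envelope check on that figure, using only the numbers above (the arithmetic is mine):

    # ~20% of annual global emissions, over the ~12 years since Kyoto (1997-2009),
    # sums to roughly 2.4 "years' worth" of total global emissions.
    deforestation_share = 0.20
    years_since_kyoto = 2009 - 1997
    print(deforestation_share * years_since_kyoto)   # ~2.4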

I’ve had the opportunity to think about REDD a lot in the last week. Last weekend, I got invited to the UN to participate in the Forest Dialogue’s (http://research.yale.edu/gisf/tfd) two-day session on REDD financing mechanisms, together with the breadth of interests that define the immensely complex issues around tropical forest resources. Sitting around a table with everybody from indigenous peoples’ groups, the World Bank, industrial foresters, and Conservation International to some governments gives you a good view of how complex the issues and different perspectives really are.

At the Tribeca Film Festival a day later, I saw “The Burning Season” – www.theburningseasonmovie.com – a documentary that followed my friend Dorjee Sun and his start-up company Carbon Conservation on a year-long quixotic journey around an endless planet of boardrooms, plane rides and hotel banquets, all to save crucial forestlands in Aceh, Indonesia, through the sale of avoided deforestation carbon credits, which are currently unrecognized by the Kyoto world. It’s a moving and challenging piece, juxtaposed against scenes from an orangutan rescue center overwhelmed with orphans from the forest carnage and the struggles of a local farmer seeking to feed and educate his family at ground zero of the controversy. I highly recommend it.

And then in Washington on Thursday, I was invited to celebrate the 10 year anniversary of one of the most effective and under-the-radar organizations you’ll ever come across – Forest Trends (www.forest-trends.org). If you don’t know who they are, you should. When Jonathan Lash and Al Gore drop by to give the keynote of thanks at your celebration dinner, you’re certainly doing something right.

When you look at the McKinsey climate wedges or the Stern Report on the needed forward curve for atmospheric stabilization, it’s blindingly obvious that REDD’s 20% of current GHG emissions has to be part of the solution. And the sooner the better, because it’s an asset that is diminishing right in front of our eyes. At the same time, despite its immediate potential, REDD is not a panacea for the climate issue; getting to the required 80% global emissions reduction by 2050 is going to take multiple technology step-downs of heroic proportions. We need to transition major chunks of the global economy to CCS, hydrogen, plug-in hybrids running on next generation biofuels, hyper efficiency, new waves of renewables, nuclear and maybe even fusion. Who knows – but no matter what, it’s a big nut to crack. At best, REDD is only a fraction of that need. Which means we have to weigh our desire for immediate REDD and its ancillary benefits against our desire to accelerate technology development and uptake.

This is where REDD is potentially problematic. Given REDD’s 7 billion tonne per year current emission baseline, a one percent shift in REDD emissions per year potentially puts 70 million tonnes of emission credits into the system. To give some perspective, in the current supply and demand balance driven by the EU Emissions Trading Scheme, that 70 million tonnes per year alone would have dramatically impacted the price for a tonne of emission rights. Approximately 4% per year of REDD would have equaled the entire production of the CDM in the Kyoto period. Now contrast that against the goals of the Stern Report, which sets out a target of reducing REDD by 50% within a decade.
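The supply-side arithmetic, sketched out (the figures follow the text above; the five-year Kyoto commitment period is my assumption, and everything is order-of-magnitude only):

    # Order-of-magnitude sketch of potential REDD credit supply.
    redd_baseline_tonnes = 7_000_000_000           # ~7 billion tonnes CO2e per year

    print(0.01 * redd_baseline_tonnes)             # 1% shift: 70 million tonnes of credits per year
    print(0.04 * redd_baseline_tonnes * 5)         # 4% per year over an assumed 5-year commitment
                                                   # period: ~1.4 billion tonnes, roughly CDM-scale
    print(0.50 * redd_baseline_tonnes)             # Stern-style 50% cut: 3.5 billion tonnes per year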

If we were even fractionally successful with that goal, an enormous supply of emission rights might enter the market. If demand were not precisely calibrated to absorb that supply at the right time, the value of emissions would plummet, meaning that a fundamental driver for developing and implementing crucial low carbon technology would disappear.

The problem is that while there are long term aspirational goals for emission reduction (80% reduction by 2050 being a generally accepted target), the transition to that point involves a continually complex calculus of political will and gamesmanship. And with REDD’s potential range of supply running from zero to 3.5 billion tonnes per year, setting the short term demand curve exactly right is virtually impossible. Set it too high and, if REDD underdelivers, we crater the economy. Set it too low and, if REDD performs, we set back the economic drivers for emission technologies a decade or two.

Now, admittedly, that kind of runaway success is unlikely and if we truly succeed in rapidly halting deforestation’s advance, well, that is not a bad problem to have. However, one must always beware of policies with unintended consequences and this is one where I certainly see that potential, whatever its likelihood may be.

So, the irony is, we can neither afford to do – nor not do – REDD under current thinking and parameters. But we need to be thinking about the future. Even after the US joins the carbon-constrained world, emissions will be managed across fewer than 1 billion people. By 2050, they need to be managed across virtually the entire global economy. Those transitions to greater carbon engagement will be an immense challenge. Every time a new country agrees to be capped – or a capped country ratchets its commitment downward – there is potential for market demand dislocation.

And that’s where I think near-term REDD may play a role. What if industrial governments and the private sector aggregated TARP-like funds – tens or hundreds of billions of dollars – to compensate developing countries and/or private groups within them for immediate and sustainable REDD on a cost-plus basis, as derived from the tonnage of carbon kept out of the atmosphere? We’d pay a fixed, below-market rate today, but rather than dropping all that tonnage into the market immediately, it would be held in a global reserve that would enter the market at various points in the future (via a Board of Governors?) when demand/price for emission rights is undergoing a spike, due to new emitters joining the cap or to major emission step-downs (say, when the US goes from 20% to 30% reductions). Private investors in such a fund would get a bond-like return – preferably tax free – and the differential between the price paid to the developing country per tonne at the outset and its eventual price at release (after interest) into the market would be split in some fashion between the providers of the carbon and the providers of the capital. Seller countries might even get some kind of preferential access to their own credits, to incent them to come under a cap sooner rather than later.
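A minimal sketch of how such a reserve might keep its books, just to make the mechanics concrete. Every name, price, and the 50/50 split below is a hypothetical placeholder of mine, not part of any actual proposal beyond the general outline above:

    # Hypothetical "carbon Federal Reserve": buy REDD tonnage at a fixed,
    # below-market cost-plus price, hold it off-market, release it during a
    # price spike, and split the spread between carbon and capital providers.
    class CarbonReserve:
        def __init__(self):
            self.holdings = []          # list of (tonnes, price_paid_per_tonne)

        def buy_redd_lot(self, tonnes, cost_plus_price):
            self.holdings.append((tonnes, cost_plus_price))

        def release_lot(self, market_price, capital_share=0.5):
            tonnes, price_paid = self.holdings.pop(0)
            spread = (market_price - price_paid) * tonnes
            return {
                "to_capital_providers": spread * capital_share,
                "to_carbon_providers": spread * (1 - capital_share),
            }

    reserve = CarbonReserve()
    reserve.buy_redd_lot(70_000_000, cost_plus_price=4.0)    # hypothetical $/tonne paid today
    print(reserve.release_lot(market_price=20.0))            # split a $16/tonne spread at release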

The challenges around executing this are immense, and it’s clearly not necessary if REDD only achieves a fraction of its potential. But if REDD does not achieve that potential rapidly, we will almost certainly have lost the remainder of the world’s forests. It does seem to me that a “Federal Reserve” is one way to solve the conundrum of keeping as much forest carbon on the ground as possible while not allowing its potential market overhang to disincentivize technology development and implementation. But having started fifteen years ago in the forest carbon space, and after seeing the same arguments reiterated again and again while the forests of the world have been felled and burned, the honest truth is that we have no time to waste.

Marc Stuart is the Co-Founder and Director of New Business Development for EcoSecurities, a global carbon trading firm. The views expressed in this blog are his own and do not necessarily represent the views of EcoSecurities.


What the FERC?

by Richard T. Stuebi

The Federal government is a mighty bureaucracy, so it’s impossible to keep track of all the parts. Still, few areas are as unknown to the general public as the Federal Energy Regulatory Commission (FERC).

The FERC (it’s always referred to as “The FERC”) is responsible for interstate regulation of energy markets, which in practice means the transmission or transportation of electricity and natural gas. As a result, the FERC is going to be a key player in all Smart Grid developments, which in turn will be a key driver of a variety of new energy technologies — renewable energy, energy storage, advanced meters, and so on.

President Obama recently appointed Jon Wellinghoff to be Chairman of the Commission. Wellinghoff is a long-time proponent of environmental protection, so it’s no surprise that he’s rapidly making moves to promote renewable energy and energy efficiency. For instance, Wellinghoff recently announced the formation of the Office of Energy Policy and Innovation, to be effective today. (Innovation in a Federal agency? Hmmmmm.)

Wellinghoff has already demonstrated the gall to radically challenge conventional wisdom — which is always a risky and courageous thing to do in the electricity sector. In late April, as noted in the New York Times, Wellinghoff told reporters following a United States Energy Association forum that baseload generation options may not be necessary in the future, thereby undercutting one of the key selling-points for the construction or continued operation of nuclear and coal-fired powerplants.

Quoting Wellinghoff: “I think baseload capacity is going to become an anachronism…People talk about ‘Oh, we need baseload.’ It’s like people saying we need more computing power, we need mainframes. We don’t need mainframes; we have distributed computing.”

Of course, Wellinghoff’s seductive vision depends on a major and costly overhaul of the national power grid, which seems light-years away to me. In his seminal New York Times editorial last November, Al Gore projected the cost of a Smart Grid at $400 billion — whereas the American Recovery and Reinvestment Act of 2009 (a.k.a., Stimulus Bill) allocates a seemingly large but comparatively paltry $4.5 billion to Smart Grid projects.

To get over the formidable humps we face in Washington, we’re going to need leaders who are willing to rattle the china on the dinner table. In Wellinghoff, it looks like we have one. His comments no doubt have a lot of people in the energy sector muttering, “What the FERC?”

Richard T. Stuebi is the Fellow of Energy and Environmental Advancement at The Cleveland Foundation, and is also the Founder and President of NextWave Energy, Inc. Later in 2009, he will also become a Managing Director of Early Stage Partners.

In the Beginning … All Costs Were External

By Ed Beardsworth

Are we in just another cycle, where we charge ahead with renewables and care for the environment, but then forget all about it when oil prices drop? The saga is all too familiar, and cynics can’t be blamed for seeing deja-vu all over again.

This time, however, it feels different. Reality seems to have penetrated so many layers and segments of society, government and business. What’s more, there is a very long-standing historical trend that lends hope to the notion that we’re really doing it this time – the process of internalizing externalities.

Garrett Hardin, famous for popularizing the concept of the “tragedy of the commons”, published a (now out-of-print) book, “Exploring New Ethics for Survival – The Voyage of the Spaceship Beagle” (1972). Reaching far back into prehistory, to the time of the cave-man, he outlines the evolution of the process of “the internalization of so-called external costs”.

The first “cost” to be “internalized” was probably the fruit on a tree. Someone said “mine”. Then, the red dirt you need to make iron. And on from there.

Cost of / When internalized (approx.)

Raw materials – B.C.
Labor – A.D. 1000-1862 (ending slavery)
Raising & educating labor – 1800-1900
Industrial accidents – 1875-1925
Industrial diseases – 1900 onward
Pollution cleanup – yet to be internalized
Pollution prevention – yet to be internalized

Each episode required a fundamental cultural “value shift”, as the established order fought the change bitterly, claiming bankruptcy and ruination would ensue. Each time, the fight was long, often lasting until the old order simply died out or was forced aside, unable to see the light or admit the errors of their ways.

Hardin’s development of these ideas is worth reading. Drastically summarizing, he argues that the “right to throw away” into the air, water or onto the land is perhaps the last major externality yet to be fully internalized, noting that on the “spaceship earth” there is no place that is truly “away”. The struggle parallels exactly the process of change that took place in every previous episode of internalization.

Perhaps he would be somewhat optimistic now, all these years farther along in the struggle, that progress is being made.

I wrote these words in 1996. OK, probably a bit overly optimistic then, but an understatement if anything of what we’re seeing today:

Perspectives on Externalities

There is a worldwide movement underway to begin thinking about externalities in new and, many would say, more enlightened ways – as an aspect of industrial, business and social activity that is no longer just the province of environmental idealists and ideologues. Major corporations are starting to realize the profound economic implications (e.g. higher profits!) of taking a more comprehensive (holistic) view of production systems, and are adopting strategies that take into account, for example, the “cradle to grave” aspects of their products, from what resources are used to make them, to how they are used, to their ultimate disposal.

Ed Beardsworth is a long time fixture in the cleantech sector, is the Research Director of Cleantech.org and the Director of the Hub Lab. He was formerly with EPRI and Brookhaven, and has a PhD in Physics from Rutgers.

2010 Cars Deliver Performance and Fuel Economy

This is my first time driving on a race track, and I’m wondering if these are my final moments on planet Earth. Here at the Mazda Raceway Laguna Seca I take the Andretti Hairpin and learn to accelerate in successive turns. After accelerating uphill, I enter “The Corkscrew,” where I cannot see the sharp downhill turn to the left until I am in the middle of it. As I get into this sharp turn, I need to prepare for the sequence of curves that immediately follow. Yes, it’s a corkscrew.

I try to remember the coaching that I received. Hold the steering wheel with something less than a death grip. Breathe. Look ahead – but looking ahead at the top of the Corkscrew I only see blue sky. Looking ahead to my future, I only see darkness.

The 2009 BMW 335d that I am driving handles beautifully, offers more turbodiesel acceleration than I care to try, and I guarantee you that the brakes work.

After three laps, I exit the track, park the BMW, remove my helmet as I leave the car, and resist kissing the ground in front of real drivers. I have been invited to test drive new vehicles with the Western Automotive Journalists, even though I write about green cars and clean transportation. I long for yesterday.

Yesterday, I tested cars with good fuel economy on streets with posted speed limits. Drives included three cars that made the list of Top 10 Low Carbon Footprint Cars. Yesterday, the 20 mile test drives were along the ocean in Monterey and on beautiful tree lined roads where I could easily see the next turn.

The 2010 Ford (F) Fusion Hybrid easily seats five, has plenty of trunk storage, and actually delivers better mileage than the MINI due to Ford’s impressive hybrid drive system. The new Ford midsized sedan that I drove has an EPA certified 41 mpg rating in the city and 36 mpg on the highway. The base suggested price is $27,995.

It may prove to be popular with anyone considering the Toyota (TM) Camry Hybrid; Ford delivers equal room, safety, and comfort with better rated mileage. Although the Fusion Hybrid has a better mileage rating than the Camry Hybrid, that advantage is not always delivered in real world driving. Edmunds Test Drive

In theory, the Ford Fusion Hybrid can travel up to 47 miles per hour in electric mode; I could only sustain the engine-off mode when gliding downhill. Even on flat roads driving 25 mph, the engine would engage.

Ford does a nice job of encouraging drivers to get better fuel economy. The SmartGauge had a display section that filled with green leaves as I drove with a light touch that reduced demands on the 2.5L engine. The Ford Fusion Hybrid delivered the smoothest driving experience of any hybrid that I have driven; I did not notice the transitions from gas to electric mode. They were seamless.

Even better mileage was delivered by the 2010 Honda (HMC) Insight EX which I drove in Monterey. It is rated 43 mpg highway and 40 mpg city. The Insight’s combined EPA rating of 41 contrasts with the 2010 Prius expected rating of at least 50 mpg. The Honda Insight has an aerodynamic body similar to the Prius. Although the two five-door hatchbacks look similar, the Prius is a longer midsized car. In theory, the Honda Insight pricing starts at $19,800 which has pressured Toyota to offer a Prius with a base price only $2,000 higher. The 2010 Insight that I drove included upgrades such as a navigation system and six speaker audio system. The vehicle price, including pre-delivery service, was $23,770.

I started the Insight, and then touched the ECO button. Even in that mode, I had enough acceleration to get on any freeway in a hurry. The ECO mode helped me minimize demands on the 1.3L gasoline engine as I navigated the roads hugging Monterey’s dramatic coast. Like the Ford Fusion Hybrid, I was rewarded with a display of green leaves for my eco-driving behavior. Handling was smooth and a bit sporty.

Driving the Honda Insight was smooth and quiet even when I went up a sustained 16 percent grade, demonstrating that its electric motor is quite effective in blending power with the 98 hp engine.

Price will definitely be a factor in buyers deciding between the Honda Insight and the Toyota Prius. In some markets, such as California, another factor may be the ability to get an HOV sticker with the Insight. For my money, if I could get a larger, more fuel-efficient Prius for only $2,000 more, then I would get the Prius. On the other hand, if there were a $5,000 price differential at the dealer, then I would go with the Insight. All in all, both are wonderful cars.

If you want great fuel economy, few compromises, and driving pleasure, test drive the latest hybrids from automakers like Toyota, Honda, and Ford. The intensified competition between them is bringing better performance, safety, and economy.

Complete Article including MINI Cooper test drive.

John Addison publishes the Clean Fleet Report and is the author of Save Gas, Save the Planet.

Gridlock Windblock

by Richard T. Stuebi

I don’t know if it’s a myth, but I’ve heard it said that a city’s suicide rates and average wind speeds are correlated. According to the claim, there may be something fundamental about human biology – perhaps within the inner ear – that makes windiness tend to drive people crazy.

Whether it’s true or not, it’s indisputable that where there’s lots of wind, there tend to be few people. And, vice versa: where there are a lot of people, there tends to be little wind.

A casual look at a U.S. wind map confirms this: most of the best wind resources are in the middle of the country, from West Texas in the South to the Dakotas in the North. If you’ve ever driven in any of these parts, you know that this is an endless expanse of desolate, sparsely-populated land.

Unsurprisingly, it’s also the case that, where there are few people, there tend to be few electric transmission lines. Logically, it follows then that there is little electric transmission capacity in the places where wind resources are greatest.

So, when parts of the Great Plains get touted as the “Saudi Arabia of wind”, it may be true, but imagine the need to build a big set of pipelines to get that useful wind energy to customers in Minneapolis, Chicago and points further East and South.

Ask any wind developer about their business prospects, and it doesn’t take long for the conversation to turn to transmission – or, more precisely, the lack of enough of it.

Look at the study “20% Wind Energy by 2030” released in 2008 by the U.S. Department of Energy to envision the implications of supplying 20% of the nation’s electricity needs by 2030 from wind. Oh, there’s plenty of wind to actually supply the electricity, no problem. It’s just that tons of new transmission capacity would be needed.

And there’s the rub. It’s only marginally easier to site and build a new transmission line than a new nuclear powerplant. Transmission lines take many years and sometimes even decades to get done, due to a variety of NIMBY forces and overlapping regulatory regimes at the local, state and federal levels. And, they cost a fortune, easily a million dollars a mile, often considerably more.

So, that “pipeline” from Dakota to Chicago is on the order of a billion dollars of merely enabling infrastructure – and since there are many pinchpoints in the national power grid, that wind power probably couldn’t go much further than the terminating point anyway.

(From a technical standpoint, I’m massively oversimplifying here by comparing the power grid to a commodity pipeline, but the gist of the conclusion is essentially sound.)

Last year, most of the transmission grid operators from the Eastern half of the U.S. convened for the first time (that’s scary, isn’t it?) to develop what has come to be called the Joint Coordinated System Plan (JCSP) 2008. The JCSP report suggests that 10,000 new miles of transmission lines, at an investment of about $50 billion, will be needed east of the Rocky Mountains over the next 15 years just to meet expected load growth and current renewable portfolio standards on the books. Little of this required expansion is much beyond the drawing board.

The JCSP’s 20% wind scenario is even more daunting: 15,000 miles and $80 billion of capital. The map associated with this scenario is especially intriguing, with three major new hypothetical 800 kV DC corridors drawn right across Northeast Ohio to New York City. (No doubt, the August 2003 Northeastern blackout still gives these transmission planners nightmares.)
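For a rough sanity check on those totals (my arithmetic, using only the figures above and the per-mile cost quoted earlier):

    # Implied transmission cost per mile from the JCSP figures above.
    base_case = 50e9 / 10_000        # ~$5.0M per mile (reference scenario)
    wind_20pct = 80e9 / 15_000       # ~$5.3M per mile (20% wind scenario)
    print(base_case, wind_20pct)     # both well above "a million dollars a mile"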

Sorry, I just don’t see this happening in my lifetime.

In passing, the authors point out that neither energy efficiency nor offshore wind resources were investigated to alleviate these transmission requirements. My guess is that inclusion of these possibilities would change the results – a lot.

Significant penetration of energy efficiency could probably seriously reduce the quantity of new wind generation required to make up 20% of the region’s supply. Instead of nearly 230 gigawatts (!) of projected new wind capacity in the Eastern U.S. by 2024, my guess is that concerted exploitation of cost-effective energy efficiency opportunities could cut that investment requirement in half.

As for the 100+ gigawatts of new wind turbines in the Eastern U.S., it might be cheaper overall to put higher-cost installations offshore in the Great Lakes and in the Atlantic to avoid facing the perhaps impossible prospect of building lots of expensive new transmission lines to import onshore wind from the Great Plains.

The inability to expand transmission is a major impediment to the onshore wind business, and while it might be mitigated (slightly) with some regulatory reform, I don’t see it going away. Offshore wind may have its own development challenges, but for those in the wind industry, going offshore should become an increasingly interesting way to skirt the gridlock problem.

Richard T. Stuebi is the Fellow for Energy and Environmental Advancement at The Cleveland Foundation, and is also the Founder and President of NextWave Energy, Inc. Later in 2009, he will also become a Managing Director of Early Stage Partners.

The REAL Story on Moore’s Law for Solar

All new industries seem to think they deserve a Moore’s Law. The photovoltaic solar industry really, really thinks it deserves one, since it kind of sort of looks like a semiconductor business: Photovoltaic Moore’s Law Will Make Solar Competitive by 2015, IEEE.org, Understanding Moore’s Law, DistributedEnergy.com, and Silicon Valley Starts to Turn Its Face to the Sun, NY Times.

However, the nuances are mischievous. The cost implications of Moore’s Law are at heart built around a constant rate of technology performance improvement (2x transistors every 2 years), implying certain cost improvements. PV’s falling cost curves have had more variables at play. In fact, the real equivalent to Moore’s Law in solar would be to say that cell efficiency or a similar measure doubles every x years. Most people have tried to apply a Moore’s Law-like concept in solar directly to the cost curve, not the technology improvement curve. In fact, the solar cost “Moore’s Law” that seemed the simplest was the idea that every doubling of industry size equaled 10% in cost reductions. But that is not a Moore’s Law; that’s mainly just a description of the supply curve shape and shift – a totally different animal.
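To see the difference in sketch form (the starting values are purely illustrative; the 10%-per-doubling figure is just the rule of thumb quoted above):

    import math

    # Moore's Law: a technology curve - performance doubles on a fixed time cadence.
    def moores_law(performance_0, years, doubling_period_years=2):
        return performance_0 * 2 ** (years / doubling_period_years)

    # Experience curve: a volume curve - cost falls ~10% per doubling of industry size.
    def experience_curve(cost_0, cumulative_volume, initial_volume, reduction_per_doubling=0.10):
        doublings = math.log2(cumulative_volume / initial_volume)
        return cost_0 * (1 - reduction_per_doubling) ** doublings

    print(moores_law(1.0, years=6))            # 8x the performance after 6 years
    print(experience_curve(4.0, 8.0, 1.0))     # an 8x bigger industry: cost falls ~27%, not 8x

The point is that one curve is driven by the calendar and the other by cumulative volume, and conflating the two overstates how fast the underlying technology is actually improving.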

I’ve been researching this topic for some time, trying to develop a simple conceptual model to understand falling solar cost curves and their impacts, and I update my cost analysis spreadsheets based on numerous inputs from energy companies, solar developers, solar integrators, as well as module manufacturers. I think I now have a simple, economically sound model with good explanatory power, that allows us to shed some light on why and how the cost curves fall.

We’ll call it the Dikeman Solar Cost Model – DiSoCo Model, and it’s somewhat simple and axiomatic: the value on the supply side = the value on the demand side, broken down into fixed, sticky, and variable components, by market segment.

Over the last couple of years, I’d argue that roughly half of the cost reduction in solar has come from massive increases in installation size (primarily spreading NRE and installation cost across larger projects, as well as improved economies of scale in manufacturing), not really from solar technology costs themselves. And roughly the other half has come from actual technology cost reductions.

This is an important distinction, as it means that arguably with, say, 2003 solar technology, if the subsidies and demand had been there to build a whole bunch of 10 MW PV farms, costs similar to today’s could have been achieved, or at least within striking distance (as opposed to a Moore’s Law industry, where the fundamental technology performance curves would have been 8x better, with drastic cost improvements resulting). Technology costs haven’t necessarily fallen as much as we think, so much as the scale has changed, making costs look like they’ve fallen a significant amount.

And we have to be careful about making generalizations of the technology cost reductions, too. A large chunk of the technology cost reductions at scale (perhaps 50%?) have come from one company, First Solar, out of the hundreds that manufacture PV products. If you take them out of the equation, the falling technology cost curves don’t look so great.

But I’ll posit a cost reduction law for solar that may hold. Roughly speaking, the per unit solar industry costs at a system level fall every year in line with the reduction in per unit subsidies for the key solar subsidy programs in that year, adjusted for interest rates and margin changes. Because if they don’t, they don’t sell product.

Why? We argue that the market is basically willing to pay a set rate per kWh for solar that is reasonably constant over time. The underlying conceptual DiSoCo Model is this: the market’s set rate for solar + the cost of capital + the per unit subsidy = solar system cost + solar system embedded margin. My primary use of the model has been to break out each component, market by market, segment by segment, and analyze how fixed, variable, or sticky they are, to better understand their interactions as conditions change. If this is true, then for a given set rate and the same interest rates as last year, changes in the subsidy either come out of cost or margin. If margin were mature and fixed, then cost changes would equal subsidy changes.
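Here is a minimal sketch of that identity, taken exactly as stated above and solved for margin. The dollar figures are hypothetical placeholders of mine; the point is just to show where a subsidy step-down has to land when the set rate, cost of capital, and system cost are sticky:

    # DiSoCo identity: set rate + cost of capital + per-unit subsidy = system cost + margin
    def embedded_margin(set_rate, cost_of_capital, subsidy, system_cost):
        # All terms per unit (e.g. $/W-equivalent), purely illustrative.
        return (set_rate + cost_of_capital + subsidy) - system_cost

    this_year = embedded_margin(set_rate=3.0, cost_of_capital=0.5, subsidy=2.5, system_cost=5.0)
    next_year = embedded_margin(set_rate=3.0, cost_of_capital=0.5, subsidy=2.0, system_cost=5.0)

    print(this_year)    # 1.0 of embedded margin
    print(next_year)    # 0.5: a 0.5 subsidy step-down comes straight out of margin
                        # unless the industry takes the same 0.5 out of system cost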

We could extend the model by suggesting that changes in the market set rate are a function of retail and wholesale energy prices, non-direct subsidy programs like RPSs and RECs, and non-market-based buyers willing to accept low equity ROEs. We could further extend it by suggesting that some subsidies, like the ITC, may manifest in the cost of capital, not the per unit subsidy.

In a real life example, when the subsidy programs have built-in per unit reductions over time or volume (as the Japanese program did, as California does, and as many of the FITs do), the industry has to find a way to take enough cost out to match the reduction, otherwise the margin gets hammered. This suggests that the market won’t actually see the cost reductions until the subsidy ends, except where the industry cost reductions exceed the subsidy reductions in a given period (in fact, this has been true, and available manufacturing capacity seems to have a big impact on this component also: for several years, the manufacturers didn’t pass on ANY technology cost reductions, but fattened margins and prices instead).

And extending on that, we realize that the swing variable has been manufacturers’ margin at the ingot/wafer, cell, and module levels, not cost, which has tended to be more fixed or sticky than we thought. In a period of tight supply, as we had in the silicon refining shortage, margin goes up, all else equal; in a period of oversupply, where we are moving to, margin goes down, since the other major components (including, unlike the corollary to Moore’s Law, technology cost) are relatively fixed or sticky over short time frames. The market still only pays what it will pay per kWh, and the subsidies and interest rates are what they are, so known coming reductions/volumes in per unit subsidies force the industry to find a way to take it out of costs, see margin suffer, or find new markets with new subsidies. Hence, the model allows us to posit the law that the real long term linkage is subsidy reductions to cost reductions, adjusted for swings in margins.

This would help explain the rise of the grid linked industrial market in California and Germany, effectively as a partnership between public policy, manufacturers with limited near term technology cost reduction potential needing economies of scale, and the rise of the PPA/developer model as the facilitator between the two, and explain the continual skinny economics for end users/PPA owners, despite falling costs.

We could further extend that last point by suggesting it can be applied niche by niche, country by country. And we can better understand the market by realizing that manufacturers – starting with the Japanese firms 5 years ago when the Japan rebates rolled off, and extending currently to First Solar’s and Suntech’s (et al) moves into power plant development – have effectively applied this model on a country by country, niche by niche basis, seeking new markets as the subsidies fall and move, in a bid to maintain margins while cost curves were steady.

So the DiSoCo Model is simple enough: it states that the value on the supply side = the value on the demand side, and when breaking the components out and evaluating, market by market, which are fixed in the short term and which are variable, it has seemed to us to shed some light on why the solar markets have moved the way they’ve moved. And it posits that a market set price exists segment by segment, and therefore that if margins are normal in that segment, reductions in the per unit subsidy levels roughly equal reductions in cost, and only when reductions in cost drastically exceed those of subsidy levels can price be affected.

And it gives us a very different picture of falling cost curves and price implications than pretending Moore’s Law works for solar.

Neal Dikeman is CEO of Carbonflow, Inc., a Partner at Jane Capital Partners LLC, the Chairman of Cleantech.org, and the founding contributor to CleantechBlog.com.

Cleantech Blog Power 5 – Top Investors in Cleantech

I’ve been warning about a massive mispricing of risk in cleantech investing for years.

Cleantech Venture Capitalists Beware – What You Don’t Know About Energy Can Kill You

Beware the Allure of Ethanol Investing

Is there a cleantech bubble? Experts don’t think so

That certainly doesn’t mean that cleantech investing is bad. On the contrary, I’m very very bullish on cleantech. The question is which cleantech investors are following my rules on what’s good about investing in cleantech, and which ones are just following the old style IT rules of venture capital and taking that mispriced risk for their LPs.

In the cattle business, a bad rancher judges the cow by the quality of the cow; a good rancher judges the cow by the quality of the calf. That’s how this Power 5 Ranking and Big 5 Question Mark Ranking of cleantech investors was constructed. Quality of the calf.

The Cleantech Blog Power 5

  1. GFI Energy – The top private equity shop in cleantech in my opinion. Caminus, Noreseco, Xantrex, et al. Been doing it quietly for over a decade creating great companies. A shop that doesn’t miss often, and doesn’t bother to show up at the cleantech conference circuit. Maybe they don’t need to.
  2. MissionPoint Capital Management – SunEdison, Ecosecurities, APX et al. Great discipline, great picks. They actually seem to know something about the areas they invest in.
  3. Clean Pacific Ventures – Early stage, see things others are going to see about 4 months before they do. Backed one of my companies. Show the love.
  4. Acorn Energy – The place where Comverge was born. Publicly traded, now investing in cleantech. I love this portfolio. John Moore has a nose for deals. His card says “CEO and Evangelist”. Most people will ignore him because he’s publicly traded. But if it works, so what?
  5. Goldman Sachs – Their name is on or in half of the marquee deals in the sector from First Solar to SunEdison, Horizon Wind, Suntech. Hard to leave them out.

Honorable mention goes to the AIM market. The whole market. It’s better for founders, better for investors, took HUGE market share from the venture capital community in cleantech. All around eating VC lunch for breakfast. And yes, there is liquidity. Stop saying there’s not in the same breath you ask me to sell you preferred stock with cosale rights. It’s obnoxious.

And the Big 5 Question Marks

  1. KPCB – Bloom Energy? EEStor? 5 different stealth thin film plays? Et al. How many stealth science projects in cleantech can dance on the head of a pin? Let’s work on a very mixed metaphor/cliché of sorts – you shall not crucify this crown of venture upon a cross of cleantech. Too many of the technologies in Kleiner deals are only sexy because Kleiner’s name is attached. Come on guys, you’re better than this.
  2. Google.org – The world is rooting for you to succeed. And Silicon Valley needs a poster child for cleantech. How about articulating a strategy that the market understands? Maybe “sustainably energizing the web” or some such? When people ask me what does Google.org invest in and why, there should be a clear answer.
  3. Khosla Ventures – How many odd ways are there to invest in ethanol? Do we really think being in refining is a good business? And no, it’s not cheaper than gasoline. Can we lobby our way out of it? There are some gems in here, but the weighting may catch him. Kudos though for doing it with large chunks of his own money instead of my grandmother’s pension fund’s money.
  4. VantagePoint Venture Partners – The anti Kleiner? Lots of strategics involved, and taking very, very large, very, very risky bets. Perhaps they better hope Vinod’s lobbying comes through. But it only takes one, right? If they can find the discipline of the Power 5, this could be good.
  5. Nth Power – Where’d you go? You were the acknowledged market leader when cleantech started in the first part of this decade. At one time virtually every strategic that mattered was an LP. The cleantech market needs you to be bigger than you are today.

So yes, invest in cleantech. But pay attention to the risk not just the management fees when it’s OPM.

Neal Dikeman is a partner at Jane Capital Partners LLC, and Chairman of Carbonflow and Cleantech.org. And he has the utmost respect for the guys behind these firms, regardless of whether or not he thinks their investment strategies are pricing risk well.

Cleantech Blog "Power 10" Ranking Vol II 2009

Last year I did my first “Power 10” ranking for 2008 of cleantech companies, and the response was so good we’re doing it again.

I spend most of my day meeting and talking to companies in the cleantech sector. And those of you who know me know I have opinions on who is doing it right, and who is doing it wrong.

As before this is the Cleantech Blog Power 10 Ranking of cleantech companies doing it right.

Eligibility for inclusion in the ranking requires meeting a 6 point test. Suggestions for inclusions in future volumes are welcome. The 6 point test:
1. The company is energy or environmental technology related
2. I like their products
3. The market needs them
4. The company is smart about building their business
5. I’d like to own the company if I could (for the right price, of course!)
6. It is not already one of mine (my apologies to my friends Zenergy Power)

I have included cleantech companies big and small.

  1. Sharp – Makes the list again as top dog battling to hold its crown in solar PV. Keep on trucking.
  2. GE – Their M&A strategy delivered venture like returns, and they still hold power positions in wind, T&D, clean gensets, and water capital equipment. Hard to dethrone.
  3. Iberdrola – Barely didn’t make the cut last year. Largest wind operator in the world now. Deserves it.
  4. First Solar (NASDAQ:FSLR) – Still the low cost producer in PV and growing. Smart move swapping expensive stock for the Optisolar project pipeline. Keep those factories full!
  5. Goldman Sachs (NYSE:GS) – The only investor to merit consideration, but they are a part of too many power plays in cleantech to leave off this time.
  6. DNV – Their auditors underpin roughly half of the carbon markets. In carbon, audit and verification is everything. Their market share slipped some, but they hold their crown as the only one of the big carbon auditors yet aggressively investing in the US.
  7. Applied Materials – The future of thin film if they can deliver on their strategic moves. But I need to see some of your customer’s production taking serious market share, or making next year’s list could be tough.
  8. Cleantech Group – The business is now definitely more than just a conference operator. Despite massive competition in conferences (long a cash cow for them), the Cleantech Group hasn’t lost its footing as the preeminent brand. And now seems to be learning how to play well with others. Great job guys on both creating an asset class AND building a cool company.
  9. Bayard Group/Landis Gyr – Smart grid is the big cleantech play along with carbon and solar. Bayard, now branded around Landis Gyr, is a global Metering/Smart Grid roll up powerhouse. Bought Cellnet, Hunt, Enermet, and Landis Gyr et al.
  10. Valero – Texas refiner’s acquisition of VeraSun and move into renewable fuels gets it the nod. Now where to from here?

Honorable mention to Zenergy Power plc (AIM: ZEN.L), one I helped cofound. I couldn’t resist this year since the team is making hay off of fault current limiter technology we bet on in 2004, and deserves the nod. Also to Smart Fuel Cell (XETRA:F3C.DE) – Still the most mature fuel cell company in the world by a mile. But revenues flattened in 2008 and it made no moves allowing it to stave off the newcomers to Power 10. 2009 is the make or break year. And finally to Sindicatum – Mover of the year in carbon in 2008. Raised a warchest into the teeth of a tough carbon market. Now we’ll see what they can do with it.

Also on our watchlist for next year: Abengoa, Acciona, SGS, Duke Energy, SoCal Edison, Origin Energy, Ecosecurities, Q-Cells, SunPower, Oerlikon, ConocoPhillips, BP, Shell.

Of note, no CIGS or solar thermal this year. The list is indicative of a shift towards carbon and projects. Still no cellulosic, and I can’t bring myself to add EVs to the Power 10 until somebody shows something real. Perhaps the 2013 list?

Neal Dikeman is a founding partner at Jane Capital Partners LLC, a boutique merchant bank advising strategic investors and startups in cleantech. He is founding contributor of Cleantech Blog, Chairman of Cleantech.org, and the Chairman of Carbonflow, Inc.

Superconducting Blackout Protection Device for Smart Grid

Today, Zenergy Power plc (AIM:ZEN), a company I am a cofounder of, announced that ConEd, one of the thought leaders in the utility sector on transmission & distribution technology (conventional wisdom says they have to be: given its tremendous load in a small area, the Manhattan grid is devilishly tricky to operate), has agreed to a deal to put in a new kind of fault current limiter, using high temperature superconducting technology.

This is hit number two in FCLs for Zenergy, which last month announced the first ever HTS FCL implementation into the grid with SoCal Edison, another of the global utility thought leaders.

Neal Dikeman is a partner at Jane Capital Partners, the editor of CleantechBlog.com, and Chairman of Carbonflow, Inc. and Cleantech.org.