Posted By Zen Kishimoto,
Tuesday, May 14, 2013
Smart grid is where power, IT, and
communications meet. In this blog, IT and communications technologies are
grouped as ICT. These days, most industry areas have become so complex that we
cannot cope with problems without applying ICT.
When smart grid was first introduced, Cisco
declared that the power grid would be much bigger than the Internet. From the
data point of view alone, the amount of data produced and processed on the
power grid is on a scale that none of us has experienced before. And with
more-sophisticated monitoring technologies, the volume of data will increase even further. The data collected may include equipment health, power flow, and
quantity of power consumption. Simply collecting data does not do much good. We
need to process what we collect—make heads or tails of it—to produce useful
information for better operation and maintenance. This is the Big Data
problem that is getting a lot of attention these days in ICT and other fields.
Usually, Big Data problems are due to the
proliferation of SNSs, such as Facebook, Twitter, and LinkedIn. But with the
advent of low-power and low-priced, yet very sophisticated, end devices and sensors, different kinds of Big Data problems are
emerging, such as the one I just mentioned.
There are several companies that apply
their software systems and tools to solve Big Data problems in a particular
vertical market, such as the power industry. When I was covering data centers
and their energy efficiency, I visited OSIsoft at its San Leandro, CA, headquarters
in 2009. They collect data sent by end devices like sensors and their equivalents and store, analyze, and
visualize the collected data to take appropriate actions for improving
operations. Since that visit, my focus has expanded to include the power
industry, which is only one of the markets OSIsoft addresses (see the other markets here).
Recently, I had an opportunity to attend
their users conference in San Francisco.
I listened to several representatives of
utilities and others in the power industry talk about their use of OSIsoft's
PI system. I also talked to Dave
Roberts, Fellow and Market Principal – Smart Cities, who is an expert in
the power industry.
The following is my summary of our
discussion, with my comments.
Some power grid basics
I am targeting this blog at IT people, not power people. So I think very simple, basic information is
useful. The power grid is a big connected network of power lines. The power
grid consists of two types of grids: transmission and distribution. Generated
power is transmitted at a very high voltage via transmission lines to
neighborhoods of consumers. Then the high voltage is transformed to much lower
voltage, and power is delivered to consumers like you and me via the
distribution grid. Because power must be consumed as it is produced, demand and
supply need to be balanced all the time. Power on transmission lines is managed
by each utility or by organizations called ISOs/RTOs
(independent of utility companies) to make sure the balance of demand and
supply is maintained—to keep the lights on. Also, as with computer networks, it
is important to know the health and status of each device and all the equipment
hanging from the grid. As in computer networks, such information is collected
from multiple places in the grid. The number of collection points grows as more
technologies are developed.
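Since I am writing for IT people, here is a minimal Python sketch of the balancing idea. Every number and name in it is invented purely for illustration: an ISO-style check that compares aggregate generation against aggregate load at a single snapshot.

```python
# Illustrative sketch only (all values hypothetical): an ISO-style
# balance check. Real balancing relies on frequency regulation and
# market dispatch; this just compares total generation against load.

def check_balance(generation_mw, load_mw, tolerance_mw=5.0):
    """Return a status string for one snapshot of the grid."""
    imbalance = sum(generation_mw) - sum(load_mw)
    if abs(imbalance) <= tolerance_mw:
        return f"balanced (deviation {imbalance:+.1f} MW)"
    if imbalance > 0:
        return f"over-generation: curtail {imbalance:.1f} MW"
    return f"under-generation: dispatch {-imbalance:.1f} MW more"

# Readings from several (fictitious) collection points on the grid.
print(check_balance(generation_mw=[500.0, 320.5], load_mw=[812.3]))
```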
What OSIsoft does
Although I knew from my conversations with other OSIsoft people what business they were in, I just wanted to confirm who they are and what they do. They provide a software infrastructure system
to connect remote devices, gather/collect/aggregate data from them, and store
and retrieve the collected data for further analysis, such as data analytics
and visualization. They do not provide end devices like sensors or analytics
engines. In other words, PI is one of the important components of the Internet
of Things, M2M, or intelligent systems. Different people define the
Internet of Things, M2M, and intelligent systems slightly differently, and
the terms are often used interchangeably.
Here's an oversimplified view of PI
My conceptual view of PI
PI is not an operating system, but there is some analogy between PI and Windows. Windows provides a base operating environment for applications to run in. Microsoft in general does not provide application packages but provides this base plus some tools, utilities, and libraries via APIs. Third parties exploit this platform to write applications. PI is similar and does not provide applications, including data analytics packages. So PI can be said to be a general platform, with the applications area left to third parties.
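To make the platform analogy concrete for IT readers, here is a toy Python sketch of a data-collection platform of this kind. To be clear, this is not the PI API; every name here is hypothetical. The platform only stores and retrieves timestamped points, leaving analytics and visualization to applications built on top.

```python
# Toy data historian (hypothetical; not OSIsoft's PI API). It stores
# timestamped points per tag; analytics and visualization are left to
# third-party applications, mirroring the Windows/PI analogy above.
from collections import defaultdict

class Historian:
    def __init__(self):
        self._series = defaultdict(list)   # tag -> [(timestamp, value)]

    def write(self, tag, timestamp, value):
        self._series[tag].append((timestamp, value))

    def read(self, tag, start, end):
        """Retrieve all points for a tag within [start, end]."""
        return [(t, v) for (t, v) in self._series[tag] if start <= t <= end]

h = Historian()
h.write("feeder1.voltage", 1, 0.98)
h.write("feeder1.voltage", 2, 0.74)    # a sag event
print(h.read("feeder1.voltage", 0, 10))
```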
This will continue in Part 2.
Posted By Zen Kishimoto,
Friday, February 01, 2013
This blog continues my reporting on the Smart
Energy Enterprise Development Zone’s second workshop on power quality (PQ). In
this blog, I will report on PQ and data centers, and on power network simulation and visualization. Finally, I will discuss a proposal to share PQ information
to improve PQ for consumers in SEEDZ.
Power quality and data centers
Dennis Symanski of EPRI is an expert in data
center energy efficiency. I have talked to him before.
His role has expanded to include smart grid. A data center is a special
building where ICT and facilities equipment meet. ICT equipment is very
sensitive to power quality. Dennis presented very useful information, such as
what types of power quality events affect a data center, as in the following.
He also showed a set of mitigation methods for power quality events, such as feeds to power supplies.
Power network simulation and visualization
Peter Evans of New Power Technologies gave a
presentation about their Energynet technology for simulating and visualizing power networks.
Peter Evans of New Power Technologies
Their system has been used in conjunction
with PG&E, Silicon Valley Power, SCE, and SMUD. Energynet provides the following:
Peter showed several examples of power
networks and said that power quality is affected by many factors. One is feeder length. The longer the feeder, the more the voltage may sag toward the endpoint. He showed interesting graphs, like the one below.
The y-axis denotes the voltage and the
x-axis indicates the distance from the substation. At the substation, the voltage is at the standard level. As you move away from the substation and toward the end of the feeder, the voltage starts to sag. But at some point, the voltage is jacked up again by a voltage regulator. From this, we can make some interesting
observations. If you happen to live just before the voltage regulator, you may
have constant low-voltage problems. If a commercial or industrial building
happens to be at that point, those in that building may suffer from chronic low-voltage
or voltage sag problems in operating their machinery. Depending on where you
are situated, the same PQ event can have totally different impacts. Peter said
that by collecting different power quality data on the same feeder, we might
gain a more accurate view of what’s happening to that feeder.
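For IT readers, here is a tiny Python sketch of that voltage profile. All the numbers are invented; real feeders are modeled with impedance data and load-flow tools, which is the kind of thing Energynet does.

```python
# Illustrative only: per-unit voltage along a feeder, with a regulator
# boosting the voltage partway down the line. Numbers are made up.

def feeder_voltage(distance_km, drop_per_km=0.01,
                   regulator_at_km=6.0, boost=0.05):
    """Per-unit voltage at a given distance from the substation."""
    v = 1.0 - drop_per_km * distance_km
    if distance_km >= regulator_at_km:
        v += boost                  # the regulator jacks the voltage up
    return v

for km in range(0, 11, 2):
    print(f"{km:2d} km: {feeder_voltage(km):.3f} pu")
# A consumer just before the 6 km regulator sees chronically low voltage.
```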
He concluded his presentation with the following slide.
Finally, Ralph Renne of NetApp proposed that
the participants share power quality data.
Ralph Renne of NetApp
A follow-up meeting is scheduled to
implement the sharing of the data.
data center power quality
Posted By Zen Kishimoto,
Thursday, January 31, 2013
Following the Smart Energy Enterprise
Development Zone’s first
workshop on power quality two weeks ago, a second was held recently. Like the
first, the second was full of tutorial-like presentations, with a proposal to
the attendees to share their power quality information with others.
EPRI’s presentations covered the subject
matter comprehensively. In this blog, I discuss those; I’ll cover the other
presentations in part 2.
Mark Stephens of EPRI gave presentations on equipment sensitivity as well as on power quality (PQ) standards, embedded solutions, and multilevel approaches.
Mark Stephens of EPRI
Equipment consists of many smaller
components, and each component may behave completely differently during a power
quality disturbance. Mark listed the equipment components in the following.
Types of components that typically constitute equipment
Frankly, Mark covered a subject that was
over the heads of many attendees, and I do not want to go into too much detail. I’ll
keep it at the layman’s level. We want to know how each component tolerates
power quality events like voltage sag. If a component's tolerance level is too low, a slight voltage sag may cause that component to fail or trip. Of course, it is important to trip the component to protect the rest of the equipment. But if it is oversensitive, all the equipment may halt unnecessarily during a slight voltage sag, causing a long shutdown.
Mark used an Information Technology Industry Council (ITIC) curve, formerly the Computer and Business Equipment Manufacturers Association (CBEMA) curve, to describe equipment sensitivity. I
just want to show one such diagram so that I can make it a reference point for
the remainder of the blog.
In this graph, the y-axis denotes the tolerance level in terms of remaining voltage as a percentage of nominal (standard) voltage. For example, 80% means that if the voltage sags by more than 20% of nominal, the component fails. The x-axis indicates the duration of the power quality event. The likelihood of failure or tripping goes up the longer the power event continues.
In the graph above, a relay called an ice cube relay failed almost instantly when the voltage went down to less than 75% of the nominal level. By simply replacing an ice cube relay, many power quality events can be tolerated without damaging equipment. Equipment that is too sensitive to PQ events tends to shut down unnecessarily and may require a multimillion-dollar recovery effort.
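Here is a rough Python sketch of how a tolerance curve like this gets used. The breakpoints below are invented for illustration; they are not the published ITIC/CBEMA values.

```python
# Sketch of a tolerance-curve check (breakpoints invented, not the real
# ITIC/CBEMA figures). An event trips a component if the remaining
# voltage falls below the tolerance allowed for the event's duration.

# Ordered (max_duration_seconds, minimum_tolerated_voltage_pu) pairs.
CURVE = [(0.02, 0.0), (0.5, 0.7), (10.0, 0.8), (float("inf"), 0.9)]

def trips(remaining_voltage_pu, duration_s):
    for max_duration, min_voltage in CURVE:
        if duration_s <= max_duration:
            return remaining_voltage_pu < min_voltage
    return True

print(trips(0.75, 0.01))   # very short sag: ride-through (False)
print(trips(0.75, 2.0))    # same depth held for 2 s: trips (True)
```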
Mark covered characteristics of PC power
supplies, lighting relays, contactors, starters, semiconductor tools, DC power
supplies, PLCs, motor drives, and chillers in some detail. But all that is a
little beyond the scope of this blog.
Power quality standards
Mark continued the presentation on power
quality standards. Voltage sag is the most common power quality event, so it is
a good idea to make sure that your equipment is certified against standards for voltage sag.
He cited the following voltage sag standards.
Voltage sag standards
Mark elaborated on each standard by
comparing them. The details are too much to recap here, but it is clearly a
good thing to have a set of standards. At the same time, different
organizations define their own versions, which have both similarities and
differences. They do, however, have something in common: none address
three-phase voltage sag. I suppose that is why we need a consultant like EPRI.
Power quality can be addressed at several
different levels, as shown in the following figure.
Solutions at utility, whole plant,
panel feeder, machine, control, and embedded levels
Because it is usually prohibitively
expensive to implement solutions at larger scales, it makes sense to
mitigate the power quality event at the embedded level.
Mark showed several methods for the embedded
solutions (again, the content was too rich to be summarized here):
- Method 1: design with DC power.
- Method 2: use voltage sag–tolerant components.
- Method 3: apply custom programming techniques—use delay filters, state machine programming, and phase/voltage sensing relays (a sketch of the delay-filter idea follows this list).
- Method 4: examine configuration settings.
- Method 5: select appropriate trip curves for circuit breakers.
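To illustrate the delay-filter idea in Method 3, here is a hypothetical Python sketch: instead of tripping the instant a fault signal appears, the controller requires the fault to persist past a hold time, so brief sags are ridden through.

```python
# Hypothetical sketch of a delay filter for sag ride-through: a fault
# must persist past a hold time before the controller actually trips.

class DelayFilter:
    def __init__(self, hold_time_s):
        self.hold_time_s = hold_time_s
        self._fault_since = None       # when the current fault began

    def should_trip(self, fault_active, now_s):
        if not fault_active:
            self._fault_since = None   # fault cleared: reset the timer
            return False
        if self._fault_since is None:
            self._fault_since = now_s
        return (now_s - self._fault_since) >= self.hold_time_s

f = DelayFilter(hold_time_s=0.2)
for t, fault in [(0.0, True), (0.1, True), (0.15, False), (0.3, True), (0.6, True)]:
    print(f"t={t}s trip={f.should_trip(fault, t)}")
# The brief sag is ridden through; the sustained fault trips at 0.6 s.
```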
Mark’s final presentation was on multilevel
approaches. Again, the power event problem can be addressed at different levels
with different costs.
His point was that voltage sag can be
mitigated at various points, including service entrance, panel feeder, panel,
machine, and control levels. Each has its cost and advantage and disadvantage.
In concluding all his presentations, Mark
said the following.
Ice cube relay
Posted By Zen Kishimoto,
Tuesday, January 22, 2013
I come from the ICT field, and many of
my friends and the people around me are in the same field. So I think
like an ICT person and behave like an ICT person. When I started to
cover smart grid, I had opportunities to talk to people in the power
industry who are very different from those I know from my ICT circle.
One of my contacts in the power
industry with whom I get along very well is Lew
Rubin. He was formerly with EPRI and has been a
very knowledgeable and excellent consultant in the power industry.
Recently, he invited me to a biomass power plant he is helping to
start up. In California and other places, biomass is considered a
renewable energy source. The
renewable portfolio standard (RPS) in California
requires that at least 33% of all power be generated from renewable
energy sources, such as biomass, by 2020. I had some idea about the
other sources but did not have a good feel for biomass.
I read some of the EIA’s information on biomass:
EIA’s estimation of biomass
resources shows that there are 590 million wet tons (equivalent to
413 million dry tons) of biomass available in the United States on an
annual basis. Historically, biomass consumption for energy use has
remained at low levels, although it is the largest nonhydroelectric renewable source of electricity in the United States (considering both industrial cogeneration from biomass and
electricity sector generation). The main impediment has been the cost
of obtaining the feedstock. Of the estimated total resource of 590
million wet tons, only 20 million wet tons (equivalent to 14 million
dry tons, or enough to supply about 3 gigawatts of capacity) is
available today at prices up to $1.25 per million BTU.
Biomass use for power generation is
not projected to increase substantially by 2020 in the AEO2002
reference case because of the cost of biomass relative to the costs
of other fuels and the higher capital costs relative to those for
coal- or natural-gas-fired capacity. Slightly more growth is
projected in the high renewables case, but the difference from the
reference case projection is relatively small. In the 20% RPS case,
significantly more use of biomass for electricity generation is
projected than in the reference case, because electric utilities
would be required to generate a portion of their power from renewable
resources, including biomass.
This is good information but still very
abstract. I thought it was a great opportunity for me to visit a real
power plant fueled by biomass.
The plant is in the city
of Anderson, about a four-hour drive from the
San Jose area. I spent eight hours in the car with Lew and learned a
lot from him about the plant and other issues with the power
industry. Those that were not about the plant will be my topics in a future blog.
Biomass Power Plant
Fuel and combustion
The plant is called Anderson Plant and
is next to two lumber mills. Biomass power generation uses wood chips
as fuel. Beforehand, I did not know in what form the fuel is fed to a
boiler. It is in the form of wood chips, which can be made with a chipper. That may be likened to a large pencil sharpener that takes in pieces
of wood of various sizes and chips them into smaller bits.
Through a deal with the adjacent lumber
mill, Anderson Plant receives wood chips as well as sawdust
regularly. In addition, other suppliers bring “fuel” to them.
Once fuel is delivered, it is piled up over a multi-acre fuel yard as
in the following photo. Kevin Hurte, plant manager, and Lew are
standing in front of the pile.
From left: Kevin Hurte and Lew Rubin
They need to make sure that the fuel is
in good condition—without foreign material, like dirt, and neither
too dry nor too wet. In the case of dirt or other foreign material,
all they can do is to select a vendor that consistently delivers good
fuel. This is because it is impossible to clean dirty fuel. If fuel
is too wet, it still combusts, but it takes extra BTUs to burn it, so
it is not economical. It is also important to mix wood chips of
various sizes well so that the mixture is uniform. When chips are
well mixed, the surface is flat and they burn well. For mixing, a
front-end loader like the one in the following picture is used.
The same front-end loader also moves
the chips from those piles to a hut attached to the building where
the boiler is located. Then three joggers go to work to distribute
the fuel evenly.
Looking at three joggers from the other
side of the hut.
The fuel is moved again via a conveyor
belt to the top of the boiler.
The conveyor belt lifts fuel to the
boiler to burn it.
Then fuel is fed into three boiler
entrances on the boiler top. As the fuel is burned, it heats water to
produce steam. Some of the steam is made available (via cogeneration)
to the adjacent lumber mill to dry their finished lumber. The steam
is also used to turn a turbine, which rotates a generator to produce
power at 12 kV. Water circulates through the boiler inside boiler tubes. If the water in the tubes is not kept sufficiently pure, the tubes can be damaged. Therefore, sophisticated equipment is attached to the boiler system for monitoring and control.
Power generated and consumed
This site has been a thermal power
plant since the 1980s, and a 12 kV PG&E distribution feeder
passes right by the entrance to the plant. Because the generated
power is at a compatible voltage, they do not need a sophisticated
switchyard to step up voltage to the distribution system level. The
same distribution line is used for sending generated power and for
obtaining power for the plant. When they are generating power, they
use some of it for their own consumption and send the rest to PG&E.
But when they are not generating power, they need to obtain power
from PG&E via the same line. At the entry point, there is a
CAISO-certified net meter to track what was sent and what was
consumed. This is a very simple structure. The power generation
capacity is 5 MW but an increase to 6.5 MW is planned. Although this
is not very big, it is enough to power a good-sized data center.
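For IT readers, net metering at a point like this boils down to simple interval accounting. The Python sketch below uses made-up readings; it is not CAISO's actual metering logic.

```python
# Toy net-meter accounting (illustrative only; not CAISO's rules).
# Positive intervals export power to PG&E, negative intervals import.

intervals_mwh = [4.2, 4.5, -0.3, -0.4, 3.9]   # fictitious hourly values

exported = sum(e for e in intervals_mwh if e > 0)
imported = -sum(e for e in intervals_mwh if e < 0)
print(f"exported {exported:.1f} MWh, imported {imported:.1f} MWh, "
      f"net {exported - imported:.1f} MWh")
```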
Power generated with biomass is
classified as renewable, but unlike solar, wind, or hydro, it
combusts fuel and produces some emissions, including CO2,
into the air. This plant is a repowering of a former thermal power
plant, and thus the connection to PG&E and other infrastructure
was available. However, they made a number of improvements, including
rehabilitating the boiler and the generator. On top of that, they
need to install monitors for air and ash quality and invest in other
environmental controls. The cost for that accounts for about 80% of the total renovation cost.
A Continuous Emissions Monitoring System (CEMS) automatically monitors and records emissions. For example, when the nitrogen dioxide concentration is too high, injecting urea converts it to elemental nitrogen. Anderson Plant must submit its records monthly to Shasta
County. The system is installed on top of the boilers, and they let
me climb up there. Even though I enjoyed the view of Mt. Shasta (see
below), it was a little scary to stand in the open with little in the
way of protective fences and railings.
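Conceptually, a CEMS rule of this kind is a threshold check plus a log for the monthly report. The Python sketch below is purely illustrative: the limit is invented, and a real CEMS is certified instrumentation, not a script.

```python
# Illustrative CEMS-style rule (hypothetical limit): when NO2 exceeds
# the threshold, trigger urea injection; every reading is logged for
# the monthly report to the county.

NO2_LIMIT_PPM = 50.0     # invented value, for illustration only
readings_log = []

def process_reading(no2_ppm, timestamp):
    readings_log.append((timestamp, no2_ppm))
    if no2_ppm > NO2_LIMIT_PPM:
        return "inject urea"   # drives NO2 toward elemental nitrogen
    return "ok"

print(process_reading(42.0, "2013-01-01T10:00"))
print(process_reading(63.5, "2013-01-01T10:05"))
```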
Ash is an inevitable result of wood
burning. If the pH
of the ash is too alkaline, the ash must be sent to a landfill. If
the pH is within an acceptable range, the ash can be used by farmers
as a soil amendment for their crops.
Ash is recovered and moved to a truck
Finally, for the slide show with more
equipment by Lew Rubin, click here.
Physical vs. virtual
ICT is a logical or virtual world. Even when I read about power generation, it remained abstract to me. But this tour let me see a real physical entity and the real people who manage
it. I also visited their control room. Much of the equipment is
controlled in a semiautomatic fashion. More-sophisticated computer control could be introduced, but it might be overkill for a small plant like this.
Posted By Zen Kishimoto,
Wednesday, January 16, 2013
Joint Venture Silicon Valley Network (JVSVN) kicked off its Smart Energy Enterprise Development Zone (SEEDZ) initiative with the first of two workshops on power quality (PQ). PQ is very important for the consumers in the zone because it impacts the manufacturing and operation of sophisticated high-tech products and equipment there.
Don Bray, JVSVN Executive Director of
SEEDZ, opened the workshop and explained its objectives.
There were six speakers:
- Bill Howe: Because PQ may not be well known, he gave three very informative talks on different aspects of it, including what the Electric Power Research Institute (EPRI) is working on in this area.
- Jerry Hutchinson and Frank Arroyo: They gave the power supplier's view of PQ.
- Ralph Renne: He presented what NetApp, a consumer, has been doing in terms of PQ from a user's perspective.
- Andy Taylor: As a consultant from Applied Power Technologies, he emphasized the importance of sharing PQ information.
After brief remarks from Marek Samotyj, Bill dived into three topics, including PQ basics and EPRI's research on PQ. Three elements are associated with PQ, and an even more detailed classification was given in the following table.
(*The layman's term is electrocution.)
PQ classification (Source: EPRI)
Bill also talked about relevant
standards for PQ: IEEE 100, 1100, and 1159. (ZK: One
of PG&E’s pages has a comprehensive list of
standards for PQ). Voltage sag and swell are by far the most common
(48%) PQ events, with harmonics a distant second (22%), as indicated
in the following.
Their research in PQ is summarized on this page; briefly, it covers several areas.
It is certainly important to include
the economic side when PQ is considered. Bill discussed several
methods and practices to assess the economic impact of PQ events, but
I will not go over them here. You can find out more about what EPRI
is doing in the area of PQ here.
I just want to show the following to consider how we can improve PQ.
Cost resulting from PQ events must
balance with the cost to prevent them.
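In ICT terms, that balance is a simple expected-value comparison. The Python sketch below uses invented numbers just to show the arithmetic.

```python
# Back-of-the-envelope sketch of the cost balance (numbers invented):
# mitigation pays off when its annualized cost is below the expected
# annual loss from PQ events.

events_per_year = 8
loss_per_event = 50_000            # e.g., one halted production run
mitigation_cost_per_year = 120_000

expected_annual_loss = events_per_year * loss_per_event
print("mitigate" if mitigation_cost_per_year < expected_annual_loss
      else "tolerate")
```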
Jerry Hutchinson and Frank Arroyo
talked about the transmission and distribution substation for the
zone. They also defined two terms, sustained outage (more than five
minutes) and momentary outage (less than five minutes). PG&E has
invested $100M in the transmission facility in the South Bay (where
the zone is) over the past 10 years and plans to invest another $200M
over the next 10 years.
They classify PQ events into several categories; voltage sag is the most common. These interesting statistics are associated with it (a small sketch applying these definitions follows):
- Voltage sag happens 7 to 8 times as often as outages or momentary interruptions.
- 80% of sags last less than 10 cycles.
- The magnitude is greater than 60% of nominal voltage.
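Here is a small Python sketch that applies these definitions to classify an event. The near-zero voltage threshold for calling something an outage is my own assumption, and the cycle math assumes a 60 Hz system.

```python
# Sketch using the definitions from the talk: a sustained outage lasts
# more than five minutes, a momentary outage five minutes or less; sags
# are measured in cycles (assuming 60 Hz). The 0.1 pu outage threshold
# is my own assumption for illustration.

def classify(voltage_pu, duration_s, hz=60):
    if voltage_pu < 0.1:            # near-total loss of voltage
        return "sustained outage" if duration_s > 300 else "momentary outage"
    if voltage_pu < 0.9:
        return f"voltage sag ({duration_s * hz:.0f} cycles)"
    return "normal"

print(classify(0.0, 600))           # sustained outage
print(classify(0.7, 0.1))           # voltage sag (6 cycles)
```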
PG&E provides consulting for power
quality, as described here.
I also found their related pages informative.
PG&E gave the supplier's view of
PQ, and EPRI gave the consultant’s view. NetApp's Ralph Renne gave
the consumer’s point of view.
NetApp's Sunnyvale campus:
- Consists of 13 buildings with a total floor space of close to 1.6M square feet.
- Has three data centers and labs.
- Requires more than 10 MW at peak.
They are keen on energy efficiency. Nine of their buildings are certified with EPA's Energy Star, and two are certified with LEED. They monitor their power use throughout the campus.
They use very sophisticated meters (ION
8600 and ION
6200) from Schneider
Electric. Ralph's proposal was to make volunteer
companies' measured power information available to each other so that
everyone can compare notes for their PQ. If two companies get power
from the same substation, they can further tell if a PQ event resulted from internal problems or was caused by a disturbance on the grid.
Applied Power Technologies
Andy Taylor seconded Ralph's proposal
for sharing power quality information among volunteer companies. Both
PG&E and Silicon
Valley Power (part of the City of Santa Clara) issue
PQ event alerts. Consumers' information can be incorporated to make these even more useful. He did not think that sharing PQ data from volunteer companies would be hard to do, but the organizational problems would need to be resolved.
There is a lot to learn, but I am looking forward to the second workshop on January 23.
Joint Venture Silicon Valley Network
Posted By Zen Kishimoto,
Tuesday, January 15, 2013
The utility industry in the US, as well as elsewhere in the world, is going through a major change, which might be called a revolution. Until now,
power has been supplied by utilities to consumers in one direction. A
utility reads a meter and charges the consumer for power used. With
the advent of distributed generation and smart meters, power and
information now flow in two directions, and that creates challenges
and opportunities for utilities and consumers alike. Because the
power industry and its infrastructures are complex, change cannot
happen overnight. There are eight projects experimenting with smart grid and microgrid concepts in the US right now, as
shown in the following table.
Microgrid projects in the US
None of those are in Silicon Valley, a center of new technologies and entrepreneurship. But the Joint Venture Silicon Valley Network (JVSVN), which provides analysis and action on issues affecting our region's economy and quality of life, announced the Smart Energy Enterprise Development Zone (SEEDZ) last October. It is a Silicon Valley version of smart grid/microgrid, and its details are described in their white paper. The following map shows the smart grid zone they chose.
SEEDZ area outlined in bold orange
This area contains mostly commercial
consumers that require large amounts (up to 200 MW) of stable and
reliable power, including the City of Mountain View, the City of
Sunnyvale, NASA Ames, Yahoo!, Google, Juniper, and NetApp. PG&E
participates as a supplier utility.
JVSVN uses the term smart energy instead of smart grid or microgrid because the latter terms tend to emphasize the supplier side's innovation. The comprehensive term smart energy conveys more nuance. There are many issues to resolve to realize smart energy, so JVSVN selected three areas to focus on (p. 16 in the white paper):
- Power-quality information sharing: Sharing of power-quality measurements from customers to identify distribution problems and guide investment.
- Inventory of smart energy practices: Developing and sharing smart energy practices to accelerate the adoption of smart energy solutions.
- Building energy management system specification: Developing a model specification with smart energy–enabling capabilities.
Part 1 of the workshop on power quality took place recently. Part 2 is scheduled for January 23.
Joint Venture Silicon Valley Network
Posted By Zen Kishimoto,
Saturday, January 12, 2013
John Roos, appointed US Ambassador to Japan by President Obama about three and a half years ago, has been a very effective ambassador. He was recently in town as part
of his western US tour to celebrate the strong ties
between the US and Japan. Direct flights between the two countries
link Tokyo and five US cities—Boston, Seattle, San Diego, Denver,
and San Jose. San Jose was the last to be connected with All
Nippon Airways as of January 11, 2013.
Actually, there was a direct flight between San Jose and Tokyo/Narita
by American Airlines, which stopped the service in 2006.
Ambassador Roos at the podium with Mayor Chuck Reed
Roos is no stranger to the Bay Area. He grew up here and graduated from Stanford Law School. Incidentally, Mayor Reed revealed that they were classmates at the law school.
Ambassador Roos was CEO of the Silicon Valley–based law firm Wilson Sonsini Goodrich & Rosati before he was appointed.
Mayor Reed has been in many clean tech meetings and emphasized the growth of business and entrepreneurship in San Jose. For more details, check San Jose Green Vision.
Carl Guardino, President and CEO of the Silicon Valley Leadership Group (SVLG), gave a speech, as did the Japanese Consul General in San Francisco, US embassy staff, and others.
I have been involved in several of SVLG's activities. SVLG deals with many issues to make Silicon Valley a better place to live and work in. Certainly, the new direct flight from San Jose to Tokyo fosters an even closer tie with Japan, the third largest economy in the world.
Carl Guardino of SVLG
The following is a summary of Ambassador Roos's speech, with my comments (indicated by ZK).
The ambassador began by saying how closely the US and Japan are aligned in the areas of security and economy. After all, with the US the number 1 economy and Japan number 3, close collaboration between the two countries is good for the entire world. The close collaboration is in effect at the government-to-government level, as in the smart grid experiments in New Mexico. On the way is laboratory-to-laboratory collaboration, as with the National Renewable Energy Laboratory.
Roos then talked about Japan's nuclear disaster. I have reported on this disaster in several previous blogs. Roos said that before the disaster, Japan had decided to increase its dependence on nuclear power from 30% to 50%. But after the disaster, the Democratic Party of Japan (DPJ), the ruling party then but the loser of a general election last December and no longer in power, decided to phase out reliance on nuclear power by 2030 and increase the generation of power by renewable energies to as much as 30% of the total.
Renewables now generate 10% of Japan’s power, and hydro produces
80% of that; other sources, like solar and wind, account for less
than 2%. With this policy change, the DPJ projected the renewables
field may grow to be a $600B market by 2020.
The Liberal Democratic Party (LDP), which returned to power with the DPJ's defeat, may reconsider this policy. However, the feed-in tariff (FIT) is on for 20 years regardless of which administration is in power, and the ambassador thinks the renewables market will grow in such areas as solar and smart grid.
Jeff Miller, Energy Attaché of the US Embassy in Tokyo, said that the new administration probably would not release its policy on energy until summer. He did not say why, but the likely reason is that the LDP now has a majority in the Lower House of the Diet, which is similar to the US Congress, but does not have a majority in the Upper House. They probably would like to avoid any controversial issues until the upcoming Upper House election this summer.
The ambassador then said that it was important to plan and conduct business with Japan for the long haul. He also said that he has seen a strong new trend in entrepreneurship in Japan since the disaster of March 11, 2011. At the time of the disaster, the US deployed 24,000 soldiers to give a hand to disaster-stricken areas and people. The operation, known as Operation Tomodachi, was a success, and people in the disaster area really appreciated the help. Operation Tomodachi has become the Tomodachi Initiative, which attempts to more closely link young people in both countries in the areas of education, culture, and entrepreneurship.
One of the people who spoke after the ambassador was Hiroshi Inomata, Japan's Consul General in San Francisco.
Hiroshi Inomata, Consul General of Japan in San Francisco
He echoed the ambassador's message of the close collaboration between the two countries beyond clean tech issues. He said that Japan is uniquely positioned in the APAC region and can be a launching pad into the rest of the Asian markets because it offers:
- an innovation hub of new research and R&D
- business platforms consisting of favorable business environments, a safe society, and good transportation
- a rich domestic market
In the rest of this blog, I report only some of the things I heard from other speakers. I am sure that I missed some other worthy comments.
The US embassy attaché listed some promising areas of clean tech that Japan may want to pursue. One was ocean power generation. (ZK: Because Japan is an island nation surrounded by oceans, there is good potential for this type of generation. However, it is still many years before it can be put into production, and it will cost a lot of money to implement. I am skeptical about whether this is suitable for a private company to tackle without the backing of large companies and/or the US government.)
Another was connecting Japan's two frequency areas. (ZK: As the attaché pointed out, there are two major power grids serving the eastern (Tokyo and Yokohama) and western (Osaka and Nagoya) parts of Japan. The AC power in the eastern part is 50 Hz (as in most of Europe), whereas the western part uses 60 Hz (as does the US). Because of this separation, excess power in one grid cannot be utilized by the other. See my old blog for the Japanese power grid infrastructure.
One such solution can be the application of the technology used at Tres Amigas to unite the three major power grids in the US. The three grids all run AC power at 60 Hz but are not synchronized and cannot be connected directly. So at Tres Amigas, each AC feed is first converted to DC, then reconverted to AC and connected to the other grids in synchronization.)
A vendor in the smart meter segment asked for advice about what they can do to grow their software sales in Japan. He was saying that
utilities like TEPCO
tend to purchase software from Japanese vendors over foreign vendors.
Wearing my second hat, I assist US companies in entering the Japanese market, and I encounter this problem constantly. Think of it
this way. If you were a US utility company and needed to purchase
software, would you prefer to buy it from a US vendor or a foreign
one? The answer is very straightforward. In order to sell in Japan, you need to overcome hurdles such as name recognition, marketing and technical documents in Japanese, technical support in Japanese, and contracts and other agreements in Japanese, in addition to the Japanese language, business etiquette, and other things. Even in the world of IT, it is often hard to penetrate the market. As in the US, utilities are
very conservative in Japan and do not want to run the risk of
adopting a technology from a foreign no-name vendor. There is a
solution for that, but it is beyond the scope of this blog.
San Jose City
Posted By Zen Kishimoto,
Friday, October 12, 2012
As I was browsing through the news in
Japan, I found an interesting keyword, dry
cask storage. There are 54 nuclear reactors in
Japan; four of them were destroyed and are going to be demolished.
Two are in operation, leaving 48 active but not in operation. Each
reactor has a pool to cool nuclear fuel rods. Without cooling, a
reactor would surely overheat and explode.
The earthquake on March 11, 2011, did
not destroy the Fukushima-Daiichi reactors. They withstood such a
major quake. The tsunami that followed did not damage the reactors.
The damage was done to the power equipment that drove the cooling system for the reactors and the spent fuel pools, which need a constant flow of cooling water. As the power equipment was installed underground, it was flooded and power was lost, leading to the explosions of two reactors.
So it is very important to keep cooling
nuclear fuel rods. A dry cask is an alternative to a water pool. The NRC web site presents a very informative explanation of dry casks. The following picture is from their site. A dry cask contains spent nuclear fuel
and keeps it until it cools and does not emit any radiation.
Dry cask (Source: NRC)
Because the casks are portable, they
can be placed almost anywhere, subject to licensing and other
regulations. The NRC's page explains that because there are no
permanent spent fuel deposits in the US, spent fuel rods are stored
at each nuclear power plant site, regardless of its operational
status. They are running out of space for spent fuel, so it is
necessary to move it out of pools and somewhere else. Dry casks can
be freely moved. Here's a map of dry cask storage locations.
Sites of dry casks in the US (Source: NRC)
I was interested in the sites closest
to where I live. In addition to Diablo Canyon and San Onofre,
now-defunct Rancho Seco and Humboldt are listed. Probably Rancho Seco
is the closest to me. According to the NRC page, a dry cask has
effectively kept spent fuel contained for more than 20 years. My
understanding is that it takes tens of thousands of years before all the
harmful radiation runs out. So the data that show its safety for the
last 20 years do not give me a very secure feeling.
Let's go back to Japan. Japan
reorganized an agency that was supposed to regulate nukes, as I
reported before. The new agency is modeled on the US Nuclear Regulatory Commission and is called the Nuclear Regulation Authority. Recently, Shunichi Tanaka, the chairman, ordered power utility companies to move spent
fuel rods to dry casks. Right now, construction, which started in
August 2010, is under way at a place a little north of where the
disastrous earthquake hit back in 2011. The figure below illustrates
the completed facility, to be operational in October 2013. A reprocessing
factory is located nearby. It is positioned as a temporary storage
place, but it might end up being permanent.
Completed temporary spent fuel storage facility
The US and Japan face the same problem
in operating and maintaining nukes. The only difference is that the
Three Mile Island accident was about 35 years ago and the one in
Japan was less than two years ago. I certainly hope no similar
accidents will happen in the US.
Spent nuclear fuel
Posted By Zen Kishimoto,
Thursday, October 11, 2012
Updated: Thursday, October 11, 2012
I read Elisabeth's article on the Nuclear Energy Insider website with interest. It asked whether natural gas and liberalised energy markets in the USA are challenging nuclear power's prospects.
This is my summary of her points:
Nuclear energy cannot compete in price with gas. The only element
that might make nuclear shine is its lack of GHG emissions. It is
still too early to dismiss any energy source at this time, because it
is hard to predict so far in the future.
Although the article is well researched and interesting by itself, it is not earth-shattering; other media and researchers have reported similar stories. But it was interesting enough to inspire me to write a blog
to compare the US and Japan in terms of their future energy mix. The
US is often compared with European and other countries like Japan,
and it is said that the US is behind the curve in many areas, like
education and sustainability. Because I understand what's going on in
both the US and Japan at the native level, an ironic grin comes to me
when I read such comparisons. It is so funny to see that people in
both countries blame their own country by saying how advanced the
other country is. If you read both sides of the story, you would
wonder which of the two is better than the other. You know far more
about your own country's problems than another’s.
(Well, Japan is not mentioned much when the future energy mix is discussed, partly because not enough information is published in ENGLISH. Then again, even if you read Japanese, I often get confused about what is really going on.)
When we discuss
the future energy mix in the US, we talk as though we were facing a
unique problem with energy sources and were the only country
suffering so. Nuclear power is a wonder of an energy source, and there is no question about it. Until the Fukushima-Daiichi reactor accident, we did not pay much attention to potential safety problems but enjoyed the power the reactors produced. Although there are many angles to nuclear power in the US, I think these are the main drawbacks:
- construction and operating costs
- the lack of nuclear waste disposal sites
Yes, safety is
also mentioned often, especially in surrounding communities and by
activists. But I do not see much discussion of it in the media now.
Don't get me wrong. I do not intend to marginalize the Three Mile
Island accident and the suffering it caused people. Construction costs are increasing because of greater regulatory pressure, more safety checks and oversight, and the need to address the explanations to and opinions of people in surrounding communities. As a new nuclear
power plant needs to go through several phases, it may take as long
as ten years to complete construction. On top of that, there is no
guarantee the construction will ever reach the final stage, because
at each phase, more fixes and modifications may be ordered, with no
guarantee of passing each check.
In addition to
this, cheap gas, thanks to shale gas, is becoming a more and more
attractive alternative to other energy sources. Although gas is gas
and does not eliminate GHG emissions completely, as nuclear power
does, it is cheap and cleaner than oil or coal. Unless GHG emissions
control becomes very strict, this trend will continue.
The second element
is the lack of permanent disposal sites. Yucca
Mountain was to be the federal nuclear waste
deposit site, but no longer is. Diablo
Canyon and San
Onofre, two nuclear power plants in California,
are being operated with a special provision. California does not
allow the operation of nuclear reactors without permanent nuclear
waste deposit sites. The two are being operated as exceptions because
without them, a severe power shortage would become a reality,
especially in southern California. There was speculation about a
California-wide referendum to negate that exception in the upcoming
election. When I received an election packet, I looked for it but
could not find it. The referendum was not officially entered because
it missed the filing deadline.
San Onofre is currently not in operation and will
not be restarted until 2013 at the earliest, according to
NBCDFW.com. It was feared that southern
California could face blackouts if the referendum passed. The power
supply seems to be fine without San Onofre for now. What if we stop
Diablo Canyon, too?
I’ve written a
lot about what's going on in Japan and do not want to repeat it here.
Those who are interested in what I said before can take a look at my old blogs:
- Is Japan Really Getting Out of Nukes?
- What's Next with Japan's Nuclear Power?
- Will Japan Restart Any of Its Nuclear Reactors?
- ... on Japan's Nuclear Reactors
- ... to Fight Peak Power Demand in Japan
- Japan Restarts Two Nuclear Reactors (May 31, 2012)
Japan, which imports about 96% of its energy, found nuclear power to be suitable. It does not emit GHGs, and its fuel can be recycled. Before March 11, 2011, Japan was one of the biggest proponents of controlling GHG emissions and declared that it would cut them by 25%. But since the disaster, GHG emissions are seldom discussed. These are the current major points about nuclear power facing Japan:
- availability of power without it
The arguments against nuclear power in Japan are subsiding a little compared with 2011, but they are still pretty loud and powerful in public opinion. Those who oppose nuclear power claim that
power based on renewable energies, such as solar and wind, could
easily replace existing nuclear power overnight. But as in the US,
that may not happen for quite some time. When I talk to people in Japan
who are in business and technical industries like ICT, they say it is
not possible to get rid of nuclear power altogether without securing
an alternative energy source. It is interesting that their voice,
coming from a technical and operational understanding of energy, is
far less powerful than that of the anti-nuke crowds.
The current, very unpopular administration has flip-flopped on its stance. It was initially going to restart all of the stalled nukes, but after strong public opposition it tried to shift to a stance of shutting down all nukes by 2030. It then tried to make that official but changed its position again to neutral after the business community's opposition and rumored (though not confirmed) pressure from the US for security reasons. So it is not clear what the Japanese government's position is. The big difference between Japan and the US is that the US will be fine without nukes because it has ample and cheap natural gas, while Japan needs to import more energy without nukes. We cannot just look at this as if it were a fire on the other side of the ocean.
Posted By Zen Kishimoto,
Monday, October 01, 2012
There has been a lot of discussion
about whether power generation by nuclear energy will stay in Japan’s
energy mix in the next 20 years. Immediately after the earthquake and
the tsunami disaster, antinuclear sentiment seemed unstoppable.
However, the pronuclear power camp, including some politicians,
utilities companies, and local governments that host nuclear power
plants, pushed back against this trend a little bit. With that, the Japanese government restarted two of the fifty reactors that had been stopped for their annual checkups and not restarted as usual. These two reactors were restarted in July without a formal process, in spite of a lot of opposition. This sparked
weekly demonstrations against nuclear energy everywhere, but the one
that attracted the most attention was the one in front of the prime
minister's office (similar to the White House).
As the current administration loses
support, it tries to regain popularity. It has reversed the old
policy of keeping nuclear power in the energy mix for 2030. If that
were all, it wouldn’t be a problem. However, the government just
gave the OK to restart construction of a plant that was put on hold
after the disaster. It will probably be another 10 years before this
plant will be available for power generation, but if nuclear power is
excluded from the energy mix, its life is only 10 years or so.
There are a lot of factors involved in the decision about excluding nuclear energy, including pressure from business groups and the US, and from those who stand to gain a lot from continuing nuclear energy. I think banning nuclear energy completely from the
mix is a mistake. What the Japanese government should do is to make
all the data and discussions open and make the decision process fair.
The government used to have two agencies under the same minister. One
was to promote the nuclear industry and the other was to control and
guarantee the safety of nuclear reactors. So it has decided to make the new agency, known as the Nuclear Regulation Authority, independent, like the Nuclear Regulatory Commission in the US. Japan has a
long way to go before it finally can decide on the energy mix that is
right for it.
Nuclear Regulatory Commission