Generator Lab and Faraday’s Law

In last week’s class, we discussed power in many forms. We had a lecture about nuclear power and then delved further into the topic of power itself. We like to think of power and energy as being “generated,” though that is arguably the wrong term: all energy already exists, and our job is to convert certain kinds of energy into other, usable forms. In this lab, we experimented with this concept using two forms of energy: electromagnetic and kinetic.

The flashlight: note the electromagnetic interior

Using the NXT Lab Program, we once again attached our computer via USB cable to a device designed to measure our quantitative results, in this case an electromagnetic flashlight. Using our own kinetic energy, we shook a magnet back and forth through a coil of wire, inducing a current in the circuit. In this way, we could power the flashlight, and we saw Faraday’s Law at work in real life, with quantitative results.

Discovered independently by Michael Faraday and Joseph Henry in 1831, Faraday’s Law describes the relation between a magnet’s motion through a closed circuit and the voltage induced in that circuit. (Unfortunately for Mr. Henry, he didn’t publish his results before Faraday; otherwise we might be studying Henry’s Law!) The law is this:

 

The induced electromotive force (EMF) in any closed circuit is equal to the time rate of change of the magnetic flux through the circuit.
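In symbols, the standard textbook form of the law (for a coil of N turns, with Φ_B the magnetic flux through the coil) is:

```latex
\mathcal{E} = -N \frac{d\Phi_B}{dt}
```

The minus sign is Lenz’s law: the induced current opposes the change in flux that created it, which is why the magnet feels a slight drag as it passes through the coil.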

 

To see this illustrated, we shook the magnet through the coil a different number of times over a set time interval of 30 seconds. For the first trial we shook the magnet zero times, then 10, then 20, then 30, and finally 40 times. The results of the lab were displayed on an Excel spreadsheet we created. Interestingly, our first trial, with zero shakes, still registered some voltage, though we had expected none. It was much lower than any of our other readings, so perhaps it was the result of some stray energy stored within the flashlight. Our data came in as one voltage reading per second, so we summed the squares of the voltage readings to estimate how much energy we had actually generated. The sums of squares are as follows:

# of Shakes    Sum Squared
0              0.95
10             32.6
20             81.7
30             44.5
40             143.9
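For the record, here is a minimal sketch of that sum-of-squares calculation in Python; the per-second voltage samples below are invented for illustration, not our actual lab data:

```python
# Power dissipated in a resistive load goes as V^2 (P = V^2 / R), so
# summing the squared per-second voltage readings gives a quantity
# proportional to the total energy generated.
def sum_squared(voltages):
    """Sum of squared voltage readings (arbitrary energy units)."""
    return sum(v * v for v in voltages)

# Hypothetical readings from a 5-second stretch of shaking:
samples = [0.0, 1.2, 2.5, 1.8, 0.4]
print(round(sum_squared(samples), 2))  # 11.09
```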

 

Lastly, based on this information, we made a graph of the results and included a line of best fit to show the general trend in energy generated.



Interestingly, for our fourth trial, where we shook the magnet 30 times, the data did not follow the trend of increasing voltage with increasing shakes. This may have been human error, or perhaps a technical malfunction on the part of the NXT software, but we felt it was important to note this irregularity, as it did not fit the trend Faraday’s Law predicts!

The Auto Industry Goes Green

Having been bailed out by the government, US car manufacturers ought to be more beholden than ever to their customers’ (and taxpayers’) wants and needs. However, this has not really been the case in recent times. While there has been a steady and growing cry for energy-efficient vehicles, and even totally green, fossil-fuel-free cars, the industry has been hesitant to put these cars on the market. When electric cars were first being put on the market in the early 2000s, it seemed as though the motor industry was taking its role in shaping the future of our planet seriously. Then, almost as suddenly as they appeared, electric cars were shelved by companies that bowed to government lobbyists and big oil’s near-invincible grip on the auto industry. I would recommend the documentary Who Killed the Electric Car? to anyone interested in the sordid details, but trust me, you will be angry. http://www.whokilledtheelectriccar.com/

Available on Netflix, probably for free

In any case, the car companies proceeded to roll out SUVs and other gas-guzzling vehicles over the decade, pushing gas-hungry cars onto consumers who were fed up with exorbitant and ever-increasing gasoline prices. Hopefully, people are realizing that gasoline-powered vehicles ought soon to become a thing of the past, due in no small part to the strain they put on consumers’ wallets. In response to this new demand, there has been a sporadic but concerted effort from Detroit and automotive companies around the globe to produce more fuel-efficient cars. Congress has passed legislation raising the standard miles-per-gallon requirements for cars, but this has not yet transformed the industry. While most companies are busy with, and content with, offering cars that have good mileage, others have put forth some more revolutionary ideas:

Most familiar to people is probably the hybrid car. When models like the Honda Civic Hybrid and the Toyota Prius began to sell in massive quantities, the rest of the auto industry was quick to hop on the hybrid bandwagon. In a regular car, the engine burns a lot of gas to move the pistons and create the energy that propels you forward; hybrid cars have much smaller engines, which cannot do as much work. A small hybrid engine runs on gasoline and generates enough power to keep the car moving, but it needs extra help when speeding up, going uphill, and so on. That help comes from an electric motor powered by a battery. By combining two sources of energy, hybrid cars have gotten national attention. The hybrid car substantially reduces gasoline consumption, and your carbon footprint, by relying on a simple principle: people do not need their cars to be running at peak performance at all times. A typical engine is big enough, and gas-hungry enough, to supply energy when you need to floor the gas pedal; that is when your car is at peak performance. However, most drivers demand peak performance far less than 1% of the time, and the other 99% of the time that extra engine capacity is simply wasting energy. While the hybrid car mitigates this inefficiency, it still doesn’t eliminate the need to put fossil fuels in our cars.

Don't get too smug from your new hybrid purchase - it still runs on fossil fuels

According to Automotive Digest, a number of companies are currently at work on fuel cell vehicles, or FCVs, which are nowhere to be found on the current market. These vehicles are equipped with fuel cells, battery-like devices that generate electricity from a fuel such as hydrogen (often derived from natural gas). While FCVs could cut carbon emissions from cars by as much as 50% and offer 25% greater fuel efficiency than even the hybrids on the market, America and the world are infrastructurally unprepared for a slew of cars that run on fuel cells; the technology has simply not been researched enough and would need to become far more ubiquitous to truly make an impact. According to the magazine, fuel cell vehicles will be pushed into the market by 2014 and are expected to reach sales of 670,000 per year by 2020.

Fuel cell engines vs. standard engines
Projected sales of fuel cell vehicles

Even with these options, carbon emissions will still exist, and both options still contribute to pollution and climate change.

Then there is the electric car. Fuel cell vehicles and hybrid cars are also technically “electric,” but this is not what the auto industry means when it talks about electric cars. These cars run wholly on rechargeable battery power, often from pricey lithium-ion battery packs. The price of such batteries has kept these cars expensive and sales low, but introductions like the new Nissan Leaf and other smaller “mini” cars tailored for city life are becoming more and more popular. In addition to being totally gas-free, the electric car has plenty of other advantages; for one, it runs nearly silently compared to a traditional motor. The Obama administration tried to use the leverage of the bailout to eliminate funding for fuel cell cars and focus on purely electric cars, but this legislation was quashed by the House in 2010. Instead, the Department of Energy is slowly cutting off funding for fuel cells (an idea touted by the Bush administration as the best viable alternative) and increasing funding for electric car technology. Hopefully, after all these trends of destruction and ignorance, we are finally arriving at a place where the companies that took taxpayer money are listening to the consumers who saved them: green cars are the way of the future, and an integral part of the next chapter of the American auto industry if it wants to stay competitive in a global market.

The Nissan Leaf could be the future of the auto industry

Sources:

http://www.howstuffworks.com/hybrid-car.htm

http://www.automotivedigest.com/content/displayArticle.aspx?a=66581

http://www.howstuffworks.com/electric-car.htm

NXT Robotics Experiment #2: Isaac Newton’s Laws at Work

(No pun intended)

 

In our second experiment using the NXT Robotics program, we explored the fundamentals of Isaac Newton’s Laws of Motion. As we saw in class, these laws have been confirmed time and again by scientists throughout the ages. Newton’s first and second laws are the ones we primarily focused on for our research in class, but for the sake of this blog and continuity, here is a list of all three:

Newton’s First Law of Motion:

I. Every object in a state of uniform motion tends to remain in that state of motion unless an external force is applied to it.

Newton’s Second Law of Motion:

II. The relationship between an object’s mass m, its acceleration a, and the applied force F is F = ma. Acceleration and force are vectors (conventionally set in bold type); in this law, the direction of the force vector is the same as the direction of the acceleration vector.

Newton’s Third Law of Motion:

III. For every action there is an equal and opposite reaction.

 

Using only the battery and motor from the original Mindstorm robot, we connected them to a platform with a raised pulley. We attached the motor to a string run over the pulley, with a weight hanging from the string’s opposite end. In this way, we constructed a machine that would put Newton’s laws to work; mostly, we used it to collect quantitative data in support of Newton’s second law. Although our group ran out of time to finish the experiment completely, and our own measuring errors left us with some questionable data, I believe the data we did get supports Newton’s laws, even if it cannot be considered totally accurate.
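As a rough sanity check on what the machine should do, Newton’s second law predicts the hanging weight’s acceleration from the net force on it: the motor’s pull through the string minus the weight’s own gravity. The numbers below are hypothetical (we never measured the string tension directly); this is a sketch of the relationship, not our lab data:

```python
G = 9.81  # gravitational acceleration, m/s^2

def predicted_acceleration(pull_force_n, mass_kg):
    """Newton's second law for a weight hauled straight up:
    F_net = F_pull - m*g, so a = F_net / m."""
    return (pull_force_n - mass_kg * G) / mass_kg

# Hypothetical: the motor pulls with 1.5 N on a 0.1 kg weight.
print(round(predicted_acceleration(1.5, 0.1), 2))  # 5.19 (m/s^2, upward)
```

Increasing the mass at the same pull force drops the predicted acceleration sharply, which matches the trend we saw: heavier weights need more motor power to accelerate the same way.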

In the first part of the experiment, we kept the motor’s power level constant at 75 and changed the mass attached to the pulley. In the second, we changed the power level but used a constant mass. In this way, we were able to see the effects, as shown in the following chart pulled from NXT and placed into an Excel spreadsheet:

Having collected our data, we used the wonders of Excel to create charts to show the results visually. (I can’t for the life of me figure out how to take my graphs from the Excel sheet and insert them into this post with correct formatting, so props to A. Bray for providing the following!)

From this, we determined that our data was probably not 100% accurate, or it would follow the trend more closely. Even so, I think our data supports Newton’s second law. The greater the applied force, the greater the acceleration of the object; at a constant power level, a greater mass accelerates more slowly; and of course, the higher the power level used, the higher the actual power (work/time). If we had had a little more time last class, we likely would have achieved better results and come up with more conclusive evidence.



You WILL obey Newton's Laws

http://csep10.phys.utk.edu/astr161/lect/history/newton3laws.html

Demand Response

We take energy for granted. When we switch on the lights after getting home from science class, we expect them to come on without delay. Even now, I am counting on my power staying on long enough for me to finish this blog post. But how do utilities make sure the supply is there exactly when we demand it? That question leads to the process known as demand response.

Graph showing energy supply during hours - note the proposed shift from peak hours to off-peak hours

When you flip a switch in your house, you create demand for power. When you turn it off, that demand is gone. Electric companies try to identify the peak hours of energy consumption and increase their output to match. In effect, they overcompensate for the times when they expect usage to be highest so that customers never have to wait for their electricity. Our energy consumption habits thus drive the policies of energy companies. Under demand response, consumers are asked to reduce their energy usage at peak times; in return, electric companies can reduce their baseload (that is, the minimum threshold of energy needed to supply consumers’ demand).

The theory goes on to state that the money the electric company could save by this simple effort on the part of consumers would be substantial. According to the California Public Utilities Commission, these savings could then be passed on to the consumers themselves, making energy more affordable for all – good news, as US energy consumption is projected to increase by up to 40% by 2030.
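The arithmetic behind those savings is straightforward under time-of-use pricing: every kilowatt-hour a consumer shifts from a peak hour to an off-peak hour is billed at the cheaper rate. A toy sketch (the rates and usage figures here are invented for illustration, not CPUC numbers):

```python
# Hypothetical time-of-use rates, in $/kWh.
PEAK_RATE = 0.30
OFF_PEAK_RATE = 0.10

def shift_savings(kwh_shifted):
    """Dollars saved by moving consumption out of peak hours."""
    return kwh_shifted * (PEAK_RATE - OFF_PEAK_RATE)

# Shifting 5 kWh per day (say, running the dishwasher and laundry at
# night) over a 30-day month:
print(round(shift_savings(5 * 30), 2))  # 30.0 (dollars)
```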

Rolling blackouts caused widespread power outages in California, costing millions of dollars

Currently, electric companies only use demand response for customers who have meters that track how much energy they use at certain intervals during the day, particularly during peak hours. Unfortunately, it is only available to these meter holders, mostly large industrial and commercial complexes that can afford to have such meters installed. Many companies are in the process of installing meters in every home and small business that uses power, but this process is arduous and poses problems of its own. Regardless, dynamic pricing strategies and time-of-use rates would significantly alter Americans’ relationship to energy. In addition, emerging smart grid technology can better target where and when energy is needed, and it turns energy delivery from a one-way system into a two-way system, making it more efficient. These upgrades could prevent future power problems such as energy shortfalls and blackouts like those seen in California in 2000 and 2001. Demand response programs will help increase energy efficiency and reduce our total consumption. It starts from the ground up, though: consumers should agitate for more energy-efficient practices, especially when the end result could mean more money in their pockets.

 

Sources:

http://science.howstuffworks.com/environmental/green-science/demand-response5.htm

http://www.cpuc.ca.gov/PUC/energy/Demand+Response/

http://en.wikipedia.org/wiki/Demand_response

NXT Robotics Activity

During our first two classes, we were split into teams and given the assignment of programming a Lego NXT robot. When first told about this assignment, I must admit I felt a little nervous. After all, I had never done anything that complex with Legos in my life, and was always totally bewildered when friends of mine would talk about computers and computer programming. I’m the first to admit that technologically speaking, I’m years behind – this blog is about as advanced as I’ve gone so far.

An artistic rendering of my relationship with technology

Looking over the assignment thoroughly, my group members (Anna V. and Angela B.) and I realized that the first step toward completing it would be speedy construction of the robot. Not to be a braggart, but our group must have had more than a little experience building with Legos as kids (I sure did), for our robot was assembled in no time. In fact, we built him so fast that we were attaching light sensors and sound monitors before realizing we didn’t need them for this project.

Once constructed, it was time to deal with the bulk of the assignment: the NXT programming. Having never done anything like this before, I was amazed at how simple it was to make our robot (whom we affectionately dubbed “Rover”) move around via the LabView program. With the USB cable connected to the robot, we tried out a variety of different NXT paths for Rover. First, we made him move in a straight line, then changed the power level that his wheels moved at, causing him to move at different speeds. For fun, we made him sing a little robot tune as he sped along, and then we got creative. We added the option to have Rover stop moving when we pushed his orange button, and programmed him to run without being connected to the computer via USB. After disconnecting him, we got him to run at a variety of speeds.

NXT Computerized Robot "Brain"

We realized that each one of Rover’s big wheels was controlled by a different motor and therefore had a different port in the LabView program: Port A controlled one wheel, and Port B the other. By setting one port to run in the opposite direction from the other, we made Rover spin in place, but the assignment wasn’t to have him spin in place; it was to have him travel in a circle with a diameter larger than two feet. To do this, we determined that we would need both wheels turning in the same direction but at different power levels. After a few trials, we concluded that it was not the absolute speed of the wheels that determined the circle’s diameter so much as the difference between the two power levels. With this in mind, we successfully got Rover to complete a circle with a diameter of over two feet, all while still singing his song (a beautiful melody that sounded like a cell phone ringtone from 2000).
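Our observation about power-level differences matches the geometry of a differential-drive robot. If the wheels are a distance w apart and move at speeds v_left and v_right, the robot’s center traces a circle of radius R = (w/2)(v_right + v_left)/(v_right − v_left). A sketch, assuming wheel speed is roughly proportional to power level (the wheel separation below is a made-up figure, not Rover’s actual dimensions):

```python
def turn_radius(wheel_separation, v_left, v_right):
    """Radius of the circle traced by a differential-drive robot's
    center. Speeds can be in any consistent units (e.g., NXT power
    levels, assuming speed is roughly proportional to power)."""
    if v_left == v_right:
        return float("inf")  # equal speeds: the robot drives straight
    return (wheel_separation / 2) * (v_right + v_left) / (v_right - v_left)

# Hypothetical: wheels 0.12 m apart, power levels 60 and 75.
print(round(turn_radius(0.12, 60, 75), 2))  # 0.54 (m), about 1.1 m diameter
```

Note that scaling both speeds by the same factor leaves the radius unchanged; it is the relative difference between the power levels, not their absolute values, that sets the diameter, consistent with what we found in our trials.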

For the last part of the experiment, we needed to determine how accurate our measurements were versus the measurement provided by the LabView program. We used a ruler to determine how far Rover traveled at various speeds over a one second period, then used LabView to see what the program determined was the distance he traveled. For this, we needed to calculate Fractional Error, as shown by this equation:

Fractional Error = [ d(ruler) − d(program) ] / { [ d(ruler) + d(program) ] / 2 }

where d(ruler) is the distance measured by us, d(program) is the distance measured by LabView, and both are in meters. Here is a table of our results:

Power Level    Ruler Distance (m)    LabView Distance (m)    Fractional Error (dimensionless)
25             0.06                  0.0561                  0.065
55             0.16                  0.1615                  -0.009
75             0.23                  0.25                    -0.083
80             0.3                   0.26                    0.142
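As a quick check of the fractional-error formula (note that the result is a dimensionless ratio, not a distance in meters), here is a small Python version run against our power-level-55 row:

```python
def fractional_error(d_ruler, d_program):
    """Symmetric fractional error: the difference between the two
    measurements divided by their mean (dimensionless)."""
    return (d_ruler - d_program) / ((d_ruler + d_program) / 2)

# Power level 55: ruler 0.16 m, LabView 0.1615 m.
print(round(fractional_error(0.16, 0.1615), 3))  # -0.009
```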

 

After analyzing this data, we saw that the magnitude of our fractional error generally increased with power level. This did not surprise us, as it became harder to measure Rover’s distance the faster he went. At low speeds he stayed more or less in a perfectly straight line, while at higher speeds he began to veer right and left. We concluded that the higher his speed, the farther he would travel in a given amount of time, but the less accurate our measurements became.

We finally deconstructed Rover, and though he may be gone, the lessons we learned from him will never be forgotten.

RIP Rover - Gone but not forgotten, a robot who could do 360º while whistling Dixie.

FUKUSHIMA DAIICHI and Japan’s Nuclear History

Japan’s nuclear history is long and sordid. This island nation in the western Pacific has endured a terrible radioactive past, and nuclear power has become etched into its cultural memory (Godzilla’s origin story, for example, involves his awakening from a nuclear blast to discover he had atomic power). With last year’s tsunami and the ensuing nuclear plant disaster, the country has entered a new phase in its relationship with nuclear power.

Godzilla has atomic breath

On the morning of August 6th, 1945, the world awoke to a horrible new reality: the Japanese city of Hiroshima had been hit with a tremendous new weapon, the atomic bomb. Three days later, the port city of Nagasaki was also bombed. Shortly thereafter, Imperial Japan surrendered unconditionally to the United States and the Allied Forces, ending major hostilities in World War II.

Years passed, and survivors of both atomic bomb blasts were left with bizarre permanent scars as well as inexplicable diseases previously unknown to science, later determined to be the result of tremendous levels of radiation poisoning. This terrible tragedy was even worse for survivors who had to face government incompetence as well as discrimination from those who feared them, for no one truly understood the radioactive nature of nuclear power. Indeed, even years after the war was over, the US refused to explain the effects of radiation to Japanese officials because the atomic bombs had been labeled classified information. This bumbling incompetence at the hands of officials was one of the reasons that Terumi Tanaka and others formed the Japan Confederation of A- and H-Bomb Survivors Organizations. According to a quote in the Mainichi Daily News, Tanaka and many others of his generation (including thousands of atomic bomb survivors) did not want to “just be victimized by nuclear energy. [We] wanted to use it positively.”

Aerial photograph of damage from Hiroshima atomic explosion

Flash forward to March 11, 2011: a magnitude-9.0 earthquake struck just off the east coast of Japan. Worse was to come, as the earthquake triggered the greatest tsunami Japan has seen in ages. It ravaged the east coast, causing massive structural damage, tremendous loss of life, and untold suffering. As the country reeled, another disaster loomed. In the Fukushima prefecture, south of the main tsunami-hit Sendai area, the nuclear power plant run by the Tokyo Electric Power Company (TEPCO) began to experience trouble. Three of the six reactors at Fukushima Daiichi were “tripped” by the earthquake, automatically stopping the fission process inside; the other three were fortunately already shut down for inspection. Because a reactor core keeps producing heat from radioactive decay even after fission has stopped, core temperatures continued to rise, and authorities scrambled to prevent a total meltdown. Emergency diesel generators powered pumps that circulated cooling water around the reactors. However, the tsunami flooded the generators, causing them to fail and the plant to become even more unstable. By nightfall, the government had declared a state of emergency at Fukushima, conceding that it was “bracing for the worst.” Many were alarmed and recalled the 1986 Chernobyl disaster in Soviet Ukraine.

Explosion rocks Daiichi complex on March 14, 2011
Location of Japan's nuclear power structures in relation to earthquake epicenter

The next day, authorities discovered that radiation around the plant was nearly eight times the normal level and blamed this on radioactive cesium released from melted fuel rods. Later, an explosion inside the plant destroyed the building around one of the reactors and severely damaged the plant. As many as 200,000 people were ordered to evacuate the immediate surroundings. Two more explosions occurred on Monday, March 14, even as the Japanese government assured its people and the world that the safety of the plant had not been compromised. These explosions forced the government to admit the possibility of a total meltdown at the plant, a situation that would almost certainly rival Chernobyl. Radiation was detected on citizens evacuated from the area and aboard the USS Ronald Reagan, which happened to be at sea nearby. By the end of the day, only 50 workers remained at Fukushima Daiichi; the rest were evacuated due to extremely dangerous levels of radiation.

Seawater being used to cool reactors at Fukushima Daiichi

Fires sprang up in two of the reactors over the course of the next two days, but the plan to cool the reactors with seawater seemed to halt the ongoing meltdown. Over the next few weeks, radioactivity beyond normal, safe levels was found in Japan’s beef, vegetables, and milk. The 50 remaining plant employees were exposed to tremendously dangerous levels of toxic and radioactive materials, and their health has likely been severely compromised by their bravery. The plant was finally stabilized in September 2011, after months of intensive work.

This disaster has sparked a new conversation in Japan about the safety and efficacy of nuclear power as a viable alternative to traditional energy sources and green energy. Terumi Tanaka’s A- and H-Bomb Survivors’ Organization sponsored a conference for all Japanese citizens about nuclear power and its drawbacks. Japan’s government was roundly criticized by its people for its slow and somewhat muddled efforts to control information and limit access to the plant, but it did learn from past mistakes by immediately granting everyone who may have been exposed to radiation government-funded access to regular healthcare checkups throughout their lives, something the bomb survivors were never afforded. The government has also been criticized for downplaying the dangers: a Norwegian study released in October 2011 found radiation levels almost twice what was reported, and the Japanese government initially graded the nuclear disaster a 4, upgrading it to a 7 (the same level Chernobyl received) only after months had passed. With new reports of radiation levels rising in the oceans surrounding Japan, the full effect of the disaster remains unknown, even with nearly a year passed since.

Japanese protester at anti-nuclear power demonstration last year

 

References:

http://mdn.mainichi.jp/perspectives/news/20120114p2a00m0na001000c.html

http://www.telegraph.co.uk/news/worldnews/asia/japan/8863968/Fukushima-nuclear-disaster-timeline.html

http://www.bbc.co.uk/news/science-environment-12722719