Image from our outside-of-class run of the experiment

For too many people, the concept of global warming is a nebulous thing. It cannot be directly heard, seen, tasted or smelled, and can only be perceived by comparing conditions across widely separated years. Thus, if an experiment or diorama could place the effects and evidence of global warming within the realm of human sensation, it might do much to further society’s comprehension of the principles of climate change. It was with this purpose in mind that we set out to design our final lab experiment.

Our first task was to determine precisely which elements of global climate change we sought to put on display in our lab. Once we concluded that capturing and measuring fluorocarbons or estimating vehicular emissions would be impractical in this setting, we began to think in terms of the larger process, rather than microscopic details. This led to our breakthrough: the decision to compare the heating of an Earth with an enhanced (polluted) atmosphere against one with little or no atmosphere. After some careful deliberation and research over how best to exemplify this, we arrived at the model we sought to test; it had two cups of water acting as “planets”, a Ziploc bag representing a polluted atmosphere, and a stage light or work light providing the heat normally supplied by the sun.

The procedure for our lab was simple, though it required considerable time for the full results to become available. The first step was to fill both cups with an equal amount of water (while this detail may seem minor, it is actually crucial, as vastly different amounts of liquid would warm at different rates). One cup was then carefully placed in a Ziploc bag and sealed, thus becoming “polluted earth”; as our lab’s control, the second cup was left outside a bag to represent a healthier climate. Both cups were then placed under the direct glare of a shop light or studio light, so as to simulate the effect of the sun radiating upon the Earth. Every thirty minutes for two hours, the temperature of each cup was recorded.

We ran the experiment twice, with the results below. Test One was completed outside of the classroom, whereas Test Two was done in the classroom.

 

Test One:

Sealed Cup

Time (Minutes)          30    60    90    120
Temperature (Degrees F) 67    84    113   117

Open Cup

Time (Minutes)          30    60    90    120
Temperature (Degrees F) 65    82    99    98

 

Test Two:

Sealed Cup

Time (Minutes)          30    60    90    120
Temperature (Degrees F) 145   151   N/A   N/A

Open Cup

Time (Minutes)          30    60    90    120
Temperature (Degrees F) 122   138   N/A   N/A

Graph of recorded temperatures over time

A view from the in-class run of our experiment

The most notable feature of the tables above is that Test Two contains less data than Test One, despite reaching higher temperatures than either sample in Test One. This was due to time constraints within the classroom; in order to produce valid data in a shorter period, the cups were placed far closer to the light source. So while only two readings were available, the results were consistent with those of our unconstrained run outside the classroom: the sealed cup warmed faster than the open one.

As the data indicates, the temperatures of the “planets” with “enhanced or polluted atmospheres” increased at a far greater pace than those without such added layers. Why did this happen, scientifically speaking? The underlying cause in our experiment is the same as for the real Earth. The atmosphere surrounding the planet does more than simply block harmful ultraviolet rays from the sun; it also acts as an insulator, trapping heat that would otherwise radiate from the Earth’s surface back into interplanetary space. As the density of the atmosphere increases, so too does the amount of heat retained, and in turn the temperature of the climate.
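The trend is easy to quantify. As a quick illustration (using the Test One readings from the tables above), this short Python sketch computes each cup’s overall warming rate:

```python
# Overall warming rate of each cup, computed from the Test One readings.
times = [30, 60, 90, 120]        # minutes
sealed = [67, 84, 113, 117]      # degrees F, cup sealed in the Ziploc bag
open_cup = [65, 82, 99, 98]      # degrees F, uncovered control cup

def rate_per_hour(temps, minutes):
    """Average warming rate across the run, in degrees F per hour."""
    span_hours = (minutes[-1] - minutes[0]) / 60
    return (temps[-1] - temps[0]) / span_hours

print(rate_per_hour(sealed, times))    # ~33.3 F/hour for the "polluted" planet
print(rate_per_hour(open_cup, times))  # 22.0 F/hour for the control
```

The sealed cup warmed roughly fifty percent faster than the open one over the same span, which is the insulating effect described above in miniature.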

While our experiment may have been a simple diorama, it is critical that people begin to more fully comprehend the principles it represents. Climate change, as little as it may be felt at any given moment, is a real and testable phenomenon. Should it go too much further, ecosystems across the planet would be negatively affected, as would coastlines once the polar ice caps have fully melted. It would take centuries to repair the damage already wrought by humanity’s excessive use of carbon and carbon equivalents. But we must start somewhere, mustn’t we?

Capturing a Reading

 

Posted in Uncategorized | Leave a comment

Tom Vale blog

“The presentation put on yesterday by Mr. Tom Vale was both shocking and positively electrifying!” Despite such poor, but arguably necessary, puns, Mr. Vale’s lecture was incredibly fun and informative. Moving quickly but satisfyingly through several different displays, Mr. Vale spoke to the class about several basic machines that continue to impact the world around us today. Perhaps most impressively, Mr. Vale knew not only the scientific implementation of each device, but also its historical background. Overall, Mr. Vale impressed with his wit, and educated with his expansive breadth of knowledge.
The first item put on display was a Peltier device, dating to 1834 and demonstrating the principle of thermoelectric cooling. The device is very inefficient, but continues to find applications to this day. The demonstration consisted of two cups of liquid, one hot and one cold; heat energy moves from one side to the other, with the corresponding electrical energy captured by a small generator of sorts. Again, a Peltier device converts only about 5-10% of the available energy, so it is a woefully inefficient tool.
Of far greater efficiency is the Stirling engine, which was actually invented eighteen years before the Peltier device, in 1816. Developed as an alternative to the steam engine, it has found use in certain submarines for which quiet engines are sought. Based upon the principle of a temperature differential in air, it has also been known as a “hot air engine”. Unlike the Peltier device, the Stirling engine can reportedly achieve up to 80% of the theoretical maximum efficiency in converting energy.
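For context on those efficiency figures, any heat engine, the Stirling engine included, is bounded by the Carnot limit, which depends only on the hot and cold temperatures involved. A quick Python illustration (the temperatures here are assumed for the example, roughly boiling water versus ice water, not measurements from the lecture):

```python
# Carnot limit: no heat engine can convert more than 1 - Tc/Th of the
# heat it absorbs into work, with temperatures measured in kelvin.
def carnot_efficiency(t_hot_k, t_cold_k):
    """Maximum possible efficiency for a heat engine between two temperatures."""
    return 1 - t_cold_k / t_hot_k

# Assumed demo-scale differential: boiling water (373.15 K) vs. ice water (273.15 K)
print(carnot_efficiency(373.15, 273.15))  # ~0.268, i.e. about 27% at best
```

This is why even a “good” engine operating across a modest temperature differential captures only a fraction of the heat flowing through it; the 80% figure is best read relative to this limit.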
Up next in Mr. Vale’s presentation were a couple of smaller, yet no less distinctive gadgets. The first was a piezoelectric barbecue lighter, notable for its use of a small amount of quartz to create a spark when struck. Mr. Vale followed up this nifty tool with the Mendocino motor, a relatively new invention when compared with all else on display yesterday, as it was invented only roughly twenty years ago. Named after the area in which it was developed, the Mendocino motor uses electromagnetism to float and spin. Impressively, it can achieve spin rates up to 1500 RPM under select conditions. Unfortunately, no practical use exists for the Mendocino motor outside of the classroom; its only purpose is to serve as an educational aid.
While each of the above devices was extremely educational and twice as fascinating, the true star of Mr. Vale’s lecture was Nikola Tesla and his famous electric coil. During his lifetime, Tesla was granted over seven hundred United States patents and developed the system of alternating current electricity distribution (which powers energy grids to this day), but it was the incredible power of his coil that was on display at Suffolk yesterday. This model was constructed by Mr. Vale himself, using nothing more than a generator, an empty plastic bucket and coated copper wire, but it packed quite the visual punch. Placed before our eyes was Tesla’s wireless transmission of electricity, as the coil emitted a low-grade field around it. In fact, as Mr. Vale explained, the current powering the variety of lights and lamps was of such high frequency that a “skin effect” occurred; the electricity failed to shock him because it traveled along the surface of his skin, rather than through his body.
As should by now be clear, the presentation put on by Tom Vale was absolutely marvelous. This author had never dreamed of some of even the most basic devices placed before the class, and found himself amazed that so much could be done with so few resources. From the Peltier device to the Tesla coil, never were more than a few parts involved in creating great stores of energy. It was truly an educational experience which will not soon be forgotten.

Mr. Vale testing his Tesla Coil

Mr. Vale lecturing


Searching for Green Homes

In the fight for greater energy efficiency, much of the attention has been focused on making automobiles more economical. The importance of the developments born of that concentration cannot be overstated, as fuel-sipping cars such as the Prius and Volt begin to flex their muscles on the market, but another major source of carbon emissions has been unduly neglected: the American home. Until just recently, this contributor to climate change was left blissfully unaltered in the move to a more carbon-conscious society. However, thanks to advancements in technology, increased emphasis and additional government backing, Americans are finally beginning to “clean up” their homes.
A tremendous amount of the energy consumed by homes and other buildings goes toward the control of their interior climate: the heating and cooling of rooms. According to Energy Star, a joint program of the United States Environmental Protection Agency and Department of Energy, up to fifty percent of the energy a home consumes is used in this manner. Energy Star also recommends several ways in which the efficiency of the home can be improved, ranging from scheduling regular HVAC tune-ups and sealing heating and cooling ducts to installing advanced thermostats that allow the homeowner to adjust settings remotely. Taking any combination of these steps, or even just one, will help the homeowner not only save money, but also reduce their carbon footprint.
Whereas Energy Star helps the average American find practicable ways to minimize his emissions, the EPA provides more accessible data on the relationship between homes and the environment. In an attempt to get Americans to think macroscopically, the agency highlights the important connection between the home and the larger power grid, stating in its piece about the dangers of unfettered electricity consumption, “different power plants use different types of fuel, and a power plant that runs on coal emits more greenhouse gases per unit of electricity than a power plant that uses natural gas.” Thus it is important to turn off unused lights not just to reduce personal pollution, but also the pollution involved in the production of electricity. Another source of household-based emissions most people likely don’t consider is household waste; we should all think twice about the garbage we generate, as the average American contributes 1,060 pounds of CO2-equivalent gases from trash creation and destruction alone.
In tough economic times such as those from which the United States is just now emerging, paying for the technology to make a home more efficient is not a realistic proposition for many families. However, the Obama administration has made this a top priority, and worked with Congress to create several tax incentives for people to go green. The American Recovery and Reinvestment Act encouraged Americans to make “improvements such as adding insulation, energy efficient exterior windows and energy-efficient heating and air conditioning systems.” Similar credits were created for the purchase of electric plug-in vehicles and the conversion kits required to prepare homes for such cars. The Act also contained language providing like benefits for the installation of private solar water heaters, geothermal heat pumps and wind turbines.
The biggest problem of the last decade was the sole emphasis by the ecological community on improving the efficiency of vehicles. While this does much to reduce one’s carbon footprint, much more could be done if each person were likewise tasked with making their home more eco-friendly. With the government now behind such projects full bore, thanks in large part to the efforts of the Obama administration, there is new cause for hope. If due attention is paid to the work being done by Energy Star, the EPA and the American Recovery and Reinvestment Act, perhaps homes of the future will all be “green”.

Works Cited:

http://www.energystar.gov/index.cfm?c=heat_cool.pr_hvac

http://www.epa.gov/climatechange/emissions/ind_home.html

http://bit.ly/GOb3Uk


Climate Change Skepticism

When considering theories within modern science, it would be wise to recall Isaac Newton’s Third Law of Motion: for every action there exists an equal and opposite reaction. Such was seen with the popularization of Darwin’s theory of evolution during the early twentieth century, and it is being seen again in the contemporary debate concerning global warming and climate change; throughout the world, there are scientists who believe firmly in their occurrence and danger, as well as others who feel that there is little cause for alarm. In the former camp are those who see the melting of the polar ice caps and the increase in average global temperature as signs of man’s artificial impact on the environment. The latter group sees such change as cyclical in nature and thus of no concern to man. To date, much of the national discourse has been controlled by those extolling man’s harm to the environment, but for so lengthy and vigorous a debate to have occurred, the dissenting voices must have provided an equally compelling narrative.
Detractors of climate change theory come from all walks of life and fill career posts across the social spectrum, but those heard most loudly are conservative teachers, politicians and business leaders. In fact, it is often this last group, those heavily invested in the industries under attack as mass polluters, which contributes the most language to the belief that humans have not caused the change in global climate. Often speaking for companies which benefit from the extraction and use of fossil fuels, they contend there are natural explanations for the changes being seen in the environment.
But what, exactly, are these opposing narratives to explain climate change? The reasons vary, but share the common feature of looking “at the small pieces of the puzzle, while neglecting the full picture,” according to Skepticalscience.com, a website dedicated to rebutting just such claims. The most frequent assertion by this crowd is that climate has changed before, with periods of prolonged heating and cooling. Alternatively, it is also suggested that an increase in the number of sunspots over the last three decades is to blame. Attacks on the methodology of scientists are also popular, as claims have been made that the models connecting climate change with CO2 increases are unreliable, as are temperature records.
Division amongst the non-scientific public exists about the very occurrence of global warming and climate change. However, even amongst those who agree that something is happening within the environment, there is no consensus about the cause, nor thus the remedy. To those employed in fields heavily funded by the presumed leading perpetrator, fossil fuels, a variety of natural causes exist to explain away any negative impact humans have had on the earth. As they continue to stall legislation and greenhouse gas-reducing projects in private industry, only time will tell what effect will be had on the world around us.

Sources Used:

http://www.skepticalscience.com/
–A one-stop shop for everything needed to explain both sides of the climate change debate. Provides not only rebuttals to skeptical claims, but also the actual arguments of climate change skeptics.

http://www.quadrant.org.au/blogs/doomed-planet/2009/07/resisting-climate-hysteria
–Article explaining that climate change is a cyclical event, and thus outside of the realm of human cause or control.

http://www.latimes.com/news/nationworld/nation/la-na-climate-change-school-20120116,0,2808837.story
–Story from the Los Angeles Times about the introduction of climate change skepticism into curricula across the nation.


Indian Point Controversy

One of the greatest tragedies of the 2011 calendar year was undoubtedly the Fukushima-Daiichi nuclear disaster in Japan. Following an earthquake and the resultant tsunami, the facilities were overwhelmed, melted down and threatened the lives of thousands, if not millions, on the densely populated island. Thousands of miles away, the United States Nuclear Regulatory Commission took heed of this warning when attempting to determine the propriety of relicensing the Indian Point power plant just northwest of New York City. While the process has been ongoing since midway through the last decade, it has been significantly altered by a renewed skepticism toward nuclear energy brought on by the events at Fukushima-Daiichi.
Comprising the greatest portion of the debate are the environmental benefits inherent in the continued operation of the Indian Point plant. Nuclear energy harvests power from the splitting of atoms, so it contributes zero CO2-equivalent emissions to the atmosphere. As Norris McDonald, President of the African American Environmentalist Association, noted in a New York Times article on the matter, “If you shut it down, whatever you’re going to replace it with is going to increase emissions in communities like Harlem.” The leading candidate to replace the 2,000 megawatts supplied to New York City by Indian Point would be a natural gas plant, which, while considered by environmentalists to be a cleaner form of energy, still falls short of the benchmark set by nuclear power.
However, to proponents of Indian Point’s closure, this is all misinformation that distorts the true environmental impact of the plant’s operation. Plant operators will openly admit their inability to locate and repair a leak in one of the plant’s cooling tanks, though they argue that this is irrelevant, as the contaminated water never leaves the compound. Concerns have also been raised about the manner in which Indian Point takes in water, a process that allegedly kills thousands of aquatic organisms annually, including at least one endangered species; further aggravating environmentalists, plant operators have offered to install screens at the points of water collection, but have refused to build more ecologically friendly cooling towers.
The safety of local wildlife is not the sole danger being discussed, either. Also on the table is the ability of the plant to alert nearby residents to peril, a major factor upon which the NRC may deny licensure renewal if not adequately demonstrated. To meet standards, Indian Point must have sirens that can be heard up to a distance of ten miles from the site. Returning to the aforementioned fears of a potential leak, some lobbyists are also campaigning against the plant on grounds that it threatens the drinking-water supply of nearly eleven million people. Finally, while plant operators have said the compound and individual buildings are sturdy enough to withstand the impact of a jetliner, the plant’s proximity to New York City has raised eyebrows about its status as a possible terrorist target.
To date, the cards seem stacked against the continued licensure and operation of Indian Point Nuclear Power Plant. Former New York Attorney General and current Governor Andrew Cuomo has long stood against the plant being relicensed by the NRC, and a number of environmental groups likewise oppose the federal government permitting the plant. But before rushing to such a hasty conclusion, perhaps it would be wise for those powerful groups to consider the alternatives, or lack thereof.

Sources:

http://www.nytimes.com/2012/01/13/nyregion/vision-for-cheap-power-even-if-indian-point-nuclear-plant-is-closed.html
–Excellent New York Times article breaking down the debate.

http://www.riverkeeper.org/campaigns/stop-polluters/indian-point/radioactive-waste/
–A primary source, a page from the website of one of the main activist groups opposing the relicensing of Indian Point.

http://articles.cnn.com/2011-03-24/living/nuclear.plant.visit_1_nuclear-fuel-nuclear-power-plant-uranium-pellets?_s=PM:LIVING
–An article from CNN in which the author goes deep inside the Indian Point power plant to determine exactly what the entire debate is all about.

http://environmentalheadlines.com/ct/2012/02/02/indian-point-nuclear-power-plant-threatens-drinking-water-for-more-than-11-million-people/
–Another site actively campaigning against Indian Point, this one focuses on the plant’s impact on drinking water in the post-Fukushima age.

Greenpeace commercial advertising the dangers of nuclear power


MIT blog

Considering the class has yet to hear from Mr. Vale, but has already visited the labs at the Massachusetts Institute of Technology, it is going to be assumed that the assignments have gotten a bit shuffled of late. As such, this blog will be dedicated to our tremendous visit to the latter locale and all the lessons taught during our brief time there. Before going further, it must be mentioned how grand and stately the MIT campus feels; just walking from the Kendall Red Line station to our designated classroom, one becomes eminently aware of the history made and being made all around.
Once inside and protected from the dreary weather of the day, our hospitable hosts and lecturers instantly welcomed us and began their presentation. The topics being discussed were plasma, fusion processes and the potential they hold as a future energy supply. The first hurdle cleared by the class and our lecturer was establishing exactly what plasma is as a substance; most readily described as the “fourth state of matter,” it is the result of superheating select gas samples. Just as solids give way to liquids and liquids to gases as energy is added to their particles, so too do gases progress to plasma with sufficient heating.
This was certainly an educational lesson in itself, but just learning about plasma did little to connect our visit with the readings and curriculum studied thus far this semester. Bringing everything together was the prospective use of plasma fusion as an alternative clean energy source. With fossil fuel supplies rapidly decreasing and the poor conversion rates of solar and wind power, MIT and other labs have high hopes that someday homes across America will operate on the same energy supply as most of the stars in the universe. The technology has been in development throughout the industrialized world for upwards of five decades, so it has been slow in coming, but a breakthrough is possible at any moment.
Without attempting to get into details that eclipse the understanding of this author, fusion power would operate similarly to other sources, at least in principle. Just as with coal, the primary objective is to generate enough heat to transform water into steam, then use that steam to power turbines and create electricity. However, as simple as this process may sound in principle, in practice several formidable obstacles have been encountered, including how to manage the extreme heat of plasma as well as its electromagnetic tendencies.
While the entire afternoon was captivating, perhaps the most entertaining portion of the trip was the tour around the backrooms of MIT’s C-Mod facility. There we saw an operating control room replete with at-work scientists and a graveyard for discontinued projects. Included in this latter ilk were a couple of centrifuges estimated to cost approximately $300,000 apiece. Unfortunately, the most sobering message of the day was that budgetary constraints may force the closure of the entire C-Mod plasma fusion operation. As the scientists believe themselves to be close to finally solving the riddle of plasma energy, it would be a shame to see the Boston market lose a potential claim to fame.


Obama’s Solyndra folly

Political graft is nearly as old as American political institutions themselves. While thought by historians to have peaked around 1900 with the blatant operation of political machines by such party bosses as New Yorker William M. Tweed, charges of cronyism have survived the ensuing decades; indeed, it is difficult to recall a single administration which did not face accusations of acting not in the name of the commonwealth, but rather of a select group of private beneficiaries. Such is the controversial core of President Barack Obama’s attachment to solar technology firm Solyndra. According to his Republican challengers and others, Obama had an improper relationship with Solyndra that cost taxpayers in excess of one-half billion dollars.
Presidents and politicians of lesser office have found various means of rewarding their benefactors, many of them fully legal operations codified by local, state or federal law. This is precisely how Obama allegedly pumped cash into Solyndra; the solar company had previously applied under the administration of George W. Bush (Obama’s immediate predecessor) for a federal loan from the Department of Energy, an act made possible by the 2005 Energy Policy Act. Up to this point, all that had occurred was by the letter of the law. The situation became murkier when, despite initial DOE rejection and concerns about Solyndra’s liquidity, the Obama administration conditionally approved the loan in early 2009. Just two short years later, a restructuring of the loan had failed, and Solyndra filed for Chapter 11 bankruptcy.
Given the size of the loan involved (approximately $535 million) and the nebulous nature of the process which led the Obama administration to approve the previously denied application, questions abound concerning President Obama’s connection to the company and exactly what happened to so quickly deplete Solyndra’s capital. Though some memorandums have been redacted and released, they have done little to illuminate the situation. Independent investigations by the Treasury and the Federal Bureau of Investigation are ongoing, but an examination by the Washington Post has already concluded that this was not an apolitical loan, and that multiple warnings about Solyndra’s solvency were disregarded in order to grant it.
Should this massive misstep by the Obama administration spell the end of federal grants for solar technology, it is difficult to imagine the field growing at the rate previously projected. Few loans are of the size of that received by Solyndra, but even smaller firms that receive only a few million dollars for research and development are crucial to technological advancement. As the country emerges from the Recession of 2008, private investors have become more skeptical and are thus less willing to lend large sums; in order for progress to continue unabated, it is necessary for the government to sponsor some projects. Further, at least one source examining the Solyndra situation asserts that the company’s financial woes are emblematic of a larger pattern in the field of solar technology.
There does seem to be some impropriety in the actions of the Department of Energy and President Barack Obama concerning their $535 million loan to the solar technology company Solyndra. One major newspaper has already accused the Administration of misconduct, but the proverbial “jury” is out until the federal investigators return their results.

Obama Aides and Solyndra

Sources:

http://www.washingtonpost.com/blogs/ezra-klein/post/five-myths-about-the-solyndra-collapse/2011/09/14/gIQAfkyvRK_blog.html
–Five Myths about the Solyndra collapse from the Washington Post

http://www.solyndra.com/
–Much of my information about Solyndra itself as a corporate entity was gained directly from their website, so it’d be prudent to list them as a source.

http://www.washingtonpost.com/solyndra-politics-infused-obama-energy-programs/2011/12/14/gIQA4HllHP_story.html
–Another article from the Washington Post, a very reputable paper and one of the foremost investigators of the Solyndra scandal. Its account of the complex situation was invaluable in creating this blog post.


Solar Lab Post

Experiment in action

Attempting to generate electricity by capturing light (camera flash was off)

For this eco-friendly author, one technology that has always seemed feasible for greater use has been that of solar-powered devices. But when the actual practicality of the technology is examined, does reality meet my expectations? This is the question that initially piqued my curiosity when it was announced that the latest in-class experiment would involve solar cells. The scope of the investigation may have been downgraded a bit from that which was conjured up by my illustrious imagination, but the information collected would serve to increase my knowledge about the feasibility of solar technology nonetheless.
Kept within the confines of our “laboratory”/classroom, my partner and I were given the task of determining whether the color of light had any discernible impact upon its absorption by a solar cell; in other words, as all colors are the result of variance in light wavelengths, how does a solar cell respond to differing wavelengths? Aiding our inquiry were several tools: a small flashlight, a small solar cell, the chassis-computer of the robots from previous experiments, translucent colored tiles of various hues, and a piece of scientific software called LabVIEW.
The experiment itself was conducted in several stages, in order to provide the greatest possible wealth of data. As a control, our first trial was run with the cell left undisturbed, absorbing only as much light as was provided by the fluorescent bulbs lining the ceiling. Subsequent tests of the solar cell without a tile were run by holding the flashlight to it at increasing distances of 5 cm, 15 cm, 22.5 cm and 30 cm. We also ran a test with the cell covered by our hands, to see how it operated in pitch-black conditions. Below is a graph of our results from this no-filter experiment; please note how the amount of light absorbed forms an inverse relationship with the distance at which the flashlight was held (the amount of light absorbed was recorded as the number of volts produced by the cell).
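As an aside, the inverse relationship we observed is consistent with the idealized inverse-square law for a point light source. A small Python sketch of that model (illustrative only; a flashlight with a reflector is not a true point source, so our readings need not match these ratios exactly):

```python
# Idealized model: light intensity from a point source falls off as 1/r^2.
# Intensities are expressed relative to the closest distance we tested (5 cm).
def relative_intensity(distance_cm, reference_cm=5):
    """Fraction of the reference-distance intensity reaching the cell."""
    return (reference_cm / distance_cm) ** 2

for d in [5, 15, 22.5, 30]:
    print(d, round(relative_intensity(d), 3))
# At 30 cm the model predicts only ~3% of the 5 cm intensity.
```

Even allowing for the flashlight’s beam focusing, the steep drop-off explains why voltage fell so quickly as the light was pulled away.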

However, this hardly satiated our curiosity. Quickly after seeing these results, my partner and I began investigating how the colored tiles affected the voltage production of the solar cell. With the flashlight held at a fixed distance of 15 cm, we sought to see the differences in light absorption as the color of the tile alternated. As the following bar graph indicates, while discrepancies did arise in our findings, none were especially significant. Performing best (defined here as generating the greatest voltage) was the orange filter at 0.389 volts; the ignominy of least voltage generation belonged to the blue filter at 0.309 volts, though it should be noted that the red filter was not far ahead of it at 0.315 volts.
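To make the comparison concrete, a short Python snippet using our recorded voltages shows the spread between the best- and worst-performing filters:

```python
# Voltage produced by the solar cell under each color filter
# (flashlight fixed at 15 cm; values from our readings, in volts).
readings = {"orange": 0.389, "red": 0.315, "blue": 0.309}

best = max(readings, key=readings.get)
worst = min(readings, key=readings.get)
spread_pct = (readings[best] - readings[worst]) / readings[worst] * 100

print(best, worst)        # orange blue
print(round(spread_pct, 1))  # ~25.9% gap between the extremes
```

A gap of roughly a quarter of the worst reading is noticeable, but small compared with the effect of simply moving the light closer or farther away.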

Bearing in mind these many results, several conclusions may be drawn about solar cell technology. Primarily, our experiment with unfiltered light at varying distances shows that proximity to the light source is a critical component in the generation of electricity. The data also points to the necessity of utilizing unfiltered sunlight, as the electrical output of our tests run at 30 cm without a filter surpassed those conducted at 15 cm with any color filter. In sum, the stronger the light striking the panel, the more electricity generated.


Hydraulic Fracturing blog

As the population of the United States (and, in a larger scope, the world) continues to grow, so does its insatiable need for energy. While energy comes in a variety of forms, each best suited to a different purpose, the one seeing perhaps the greatest increase in demand of late has been natural gas. This surge may be partially attributed to the aforementioned increase in population, but other factors are likewise driving the uptick in demand. Amongst these is the environmental lobby; natural gas is a more efficient source of energy than coal, as “40% of the energy in gas is turned into electricity, but only 33% for coal” (Richter 84). While this may not seem like a tremendous leap at first, extrapolating those figures to serve 300,000,000 people certainly adds weight to even that seven-percentage-point gain in efficiency.
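That comparison is worth a quick calculation. Using the figures quoted from Richter, a short Python snippet shows that the gain is seven percentage points in absolute terms, but roughly a 21% relative improvement in electricity produced per unit of fuel energy:

```python
# Conversion efficiencies quoted from Richter (p. 84).
gas, coal = 0.40, 0.33

absolute_gain = gas - coal                 # difference in percentage points
relative_gain = (gas - coal) / coal * 100  # percent more electricity per unit of fuel energy

print(round(absolute_gain, 2))  # 0.07 -> seven percentage points
print(round(relative_gain, 1))  # ~21.2% relative improvement
```

Framed this way, a gas plant delivers about a fifth more electricity from the same heat input than a coal plant, which is why the difference matters at national scale.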

Unfortunately, as with its dirtier energy cousin oil, many of the easily accessible sources of natural gas have already been tapped and drained by previous generations. With “necessity being the mother of invention,” though, and demand continually growing, gas companies have developed an innovative way of retrieving their supply from previously inaccessible sources. Known as hydraulic fracturing, or more colloquially “hydrofracking,” this process uses explosives, sand and highly pressurized chemical liquids to penetrate hard rock beds. Once these beds have been fractured, wells can be established to pump out the gas and direct it to a plant for processing.

Economic incentives abound for pushing ahead with hydrofracking. For gas companies, a large initial investment in each well will provide a steady supply of gas; for consumers, the increased supply will drive down prices (assuming demand fails to outpace well development). The current primary target of hydrofracking is the Marcellus Shale, which runs along the eastern United States. Promises of lease money to farmers who sacrifice portions of their land to gas companies for this purpose are plentiful, as are the prospects of massive job creation in the region. For each of these reasons, gas companies have already begun to plan for the creation of over 1,000 wells in the state of Pennsylvania alone, with hopes of continuing all along the 600-mile stretch of the Marcellus Shale. Doing so would entail the involvement of land across much of the northeastern United States, passing through Ohio, West Virginia, Pennsylvania and New York.

However, despite such seemingly advantageous conditions for hydrofracking, there is hardly unanimous consent to the implementation of such wells. Just as economic incentives abound, so too do ecological and environmental concerns. Primary amongst these is the impact hydrofracking has on nearby water supplies; there have been enough reports of contaminants in local water tables to cause the Natural Resources Defense Council, a group dedicated to protection of the environment, to call for “Putting the most sensitive lands, including critical watersheds, completely off limits to fracking.” Such danger has also attracted the attention of more than a few politicians, as affected water supplies threaten not only the environment, but constituents as well. Imprecision in the drilling, which has led to blowouts, leaks and industrial accidents, has likewise led some to oppose further hydrofracking.

If hydraulic fracturing is given full implementation across the northeastern and central United States, the price of natural gas is almost certain to drop, as are unemployment figures in those regions. On the other hand, widespread approval may also create widespread danger. Given these potential incentives and disincentives, it remains to be seen in which direction the field is headed. Regardless, this is certainly something that will be in headlines across America for decades to come.

 

Sources:

http://www.nrdc.org/energy/gasdrilling/?gclid=CMTW2P2ToK4CFYeK4AodnkNL5w — The Natural Resources Defense Council, an environmental advocacy group, with an entire page dedicated to the responsible implementation of hydrofracking technology.

 

http://www.theoec.org/Fracking.htm?gclid=CMT49f-XoK4CFUlN4Aod1VkY4A — Another advocacy group, this one from Ohio and outright opposed to the spread of hydrofracking.

\”Fracking Hell\” a documentary by the BBC on hydraulic fracturing in western Pennsylvania

 


Generator Lab

Experiment Equipment

If there’s only one thing that can be learned about science from this class, it must be that this is the most “hands-on” of all the major subjects taught at Suffolk. Our most recent lab did not involve the robot pals we made several weeks ago, but that hardly prevented us from using innovative educational technology to enhance the learning process. This week, in order to find out more and promote a greater general comprehension of Faraday’s Law of Induction, we performed another experiment. The tools utilized this week included an altered flashlight that contained an electromagnetic coil, the chassis/computer-body of the robots made famous in previous labs, and the software LabVIEW.
The lab was to operate as follows: as we shook the “generator”/flashlight, the magnet inside would pass through the coil, thereby inducing an electric current that was measured by the robot chassis and recorded in LabVIEW; our challenge was to determine the differences in electricity generation as we increased and decreased the shaking of the generator. Would increasing the number of shakes likewise increase the electricity output, or would an inverse relationship develop?

Our results:

Shakes   Sum of squares
46       28264
80       26148
56       19748
86       16612
96       9216

As these results indicate, we were unable to discern a direct correlation. In the early trials, it does seem that increased shaking produced greater electricity, but our two highest shake totals invalidated such a conclusion. Similarly, there is nothing in the data set that would clearly indicate an inverse relationship. It is likely that such inconclusive data is the result of nothing more and nothing less than human error. We were charged with mentally counting the number of shakes in each trial, and as the speed increased, this proved a daunting assignment. It would also be proper to note that we experienced some technical difficulties along the way, as we had to repeat at least one test when the wires attaching the generator to the rest of the system became disconnected.
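In hindsight, a simple statistic could have quantified any trend in the two columns; a sketch using our recorded numbers (a plain Pearson correlation, which was not part of the lab itself):

```python
from math import sqrt

# Our recorded trials: number of shakes and the sum-of-squares
# readout from LabVIEW for each trial.
shakes = [46, 80, 56, 86, 96]
sum_of_squares = [28264, 26148, 19748, 16612, 9216]

def pearson_r(xs, ys):
    """Pearson correlation: +1 = perfect direct relationship,
    -1 = perfect inverse, values near 0 = no linear relationship."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson_r(shakes, sum_of_squares)
print(round(r, 2))
```

With only five trials, even a moderate value of r would fall short of statistical significance, which is consistent with our inconclusive reading of the data.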
Hopefully our failures here have in no way disrespected the legacy of Michael Faraday. We were unable to validate his law, but given the attention accorded him and the science backing his assertion, such validation is hardly necessary.
