That talk by ExxonMobil CEO Rex Tillerson at the Council on Foreign Relations that Gristmill linked to earlier today is a stunning demonstration of how to sow confusion and delay. It’s worth deeper analysis. So let’s dig in!
It’s very long, so we’ll summarize some sections and zero in on a couple of key passages. You can read the whole thing here.
Paragraphs 1-6, in short: Energy prices sure go up and down a lot! But we keep finding more fossil fuels when we need to.
Next 3 paragraphs: Boy, there was a lot more natural gas in the shale here in North America than we expected.
Next 6 paragraphs: Let’s all say “energy security” rather than “energy independence,” OK? Exxon is a multinational, and I want everyone to be friends and not worry about where their oil comes from as long as it keeps coming.
Here’s where Tillerson starts to get interesting. Let’s quote his original and then translate:
Ours is an industry that is built on technology, it’s built on science, it’s built on engineering, and because we have a society that by and large is illiterate in these areas, science, math and engineering, what we do is a mystery to them and they find it scary. And because of that, it creates easy opportunities for opponents of development, activist organizations, to manufacture fear.
Translation: You thought those people out there sounding an alarm about climate change were scientists? Forget it. We here at Exxon, we’re the scientists. And all those people with fancy degrees and titles who have been desperately trying to teach the U.S. public about global warming? They’re illiterates! We’re the clean guys in white coats; they’re the dirty “manufacturers” of fear.
And so as these technologies emerge, we know the immediate response from certain parts of interested parties out there is going to be to manufacture fear because that’s how you slow this down. And nowhere is it more effective than in the United States. And so that’s — the pace at which these things occur oftentimes is our ability to deal with the manufactured fear, our ability as an industry, working with well-intended regulators and policymakers to address the fears.
Translation: I am a dispassionate man of reason. Forget that I run one of the richest corporations in the world. I am not an “interested party.” The interested parties are all those illiterate, fear-mongering activists who are getting filthy rich off their fabulously wealthy nonprofit activities.
It requires a lot of education, requires taking an illiterate public — illiterate in the sciences, engineering and mathematics — and trying to help them understand why we can manage these risks. And that’s a very intensive, almost one-on-one process — town by town, city council by city council, state by state. So it takes a while. And we’re not particularly aided in our efforts by the broad-based media, because it’s a lot sexier to write the fear stories than it is to write the here’s-how-you-manage-it story.
Translation: Do not think that we buy advertisements and pay lobbyists in order to influence public policy in our favor. At Exxon, we’re having one-on-one conversations with our community. Sadly, journalists sometimes help out those fearmongering ignoramuses by repeating their lies. So we have to spend lots of money setting the record straight.
Now, that’s just a fact, it’s not a complaint. But it’s part of why do things take so long. Well, that’s one of the reasons it takes us a long time to get the policy solutions, because it all becomes then a political process instead of a scientific process.
Translation: If only we could leave energy policy safely in the hands of scientists. Wait, maybe that’s not the best idea.
There are important questions about the things that people worry about, and we have an obligation to address them, and we devote a tremendous amount of effort in addressing those. But I think if you look at the technologies that are front and center today around the shale resources — hydraulic fracturing, horizontal drilling, the integration of those technologies, how we drill these wells, how we protect fresh water zone, how we protect emissions — we have all of that engineered. And as long as we as an industry follow good engineering practices and standards, these risks are entirely manageable. And the consequences of a misstep by any member of our industry — and I’m speaking again about the shale revolution — the consequences of a misstep in a well, while large to the immediate people that live around that well, in the great scheme of things are pretty small, and even to the immediate people around the well, they could be mitigated.
Translation: Accidents don’t happen if you do things right, and at Exxon, we always do things right. And even if there is an accident with fracking, which sometimes is done by people who don’t work for Exxon who might not do everything right, it will only wreck the lives of a limited number of people in a small number of communities. So who cares?
These are not life-threatening, they’re not long-lasting, and they’re not new. They are the same risks that our industry has been managing for more than 100 years in the conventional development of oil and natural gas. There’s nothing new in what we’re doing, and we’ve been hydraulically refracturing (sic) wells in large numbers since the 1960s; first developed in 1940. So this is an old technology just being applied, integrated with some new technologies. So the risks are very manageable.
Translation: If you look at the history of our industry, why would anyone worry? It’s not as if there have ever been any accidents, right?
The fears are real. We don’t discount that people’s fears are their fears. We have to address that. We want to address it with sound science, we want to address it with real data, and somehow we have to overcome the manufactured fear which gets most of the headlines.
Translation: The fears aren’t real! But unfortunately the U.S. still has elections, and the government can still make trouble for us. So we have to pretend to take public fears seriously. After all, if we lose a few towns here and there, you and I here at this elite conference understand that that’s an acceptable risk — but the illiterate masses out there might get really upset.
There is much, much more in this speech, but that’s enough for now. OK, almost enough. Here’s one last bit from the Q&A at the end.
QUESTIONER: Hi, I’m David Fenton. Mr. Tillerson, I want to talk about science and risk, and I agree with you that’s the way we must proceed. So, as you know, it’s a basic fact of physics that CO2 traps heat, and too much CO2 will mean it will get too hot, and we will face enormous risks as a result of this not only to our way of life, but to the world economy. It will be devastating: The seas will rise, the coastlines will be unstable for generations, the price of food will go crazy. This is what we face, and we all know it.
Now — so my question for you is since we all know this knowledge, we’re a little in denial of it. You know, if we burn all these reserves you’ve talked about, you can kiss future generations good-bye. And maybe we’ll find a solution to take it out of the air. But, as you know, we don’t have one. So what are you going to do about this? We need your help to do something about this.
TILLERSON: Well, let me — let me say that we have studied that issue and continue to study it as well. We are and have been long-time participants in the IPCC panels. We author many of the IPCC subcommittee papers, and we peer-review most of them. So we are very current on the science, our understanding of the science, and importantly — and this is where I’m going to take exception to something you said — the competency of the models to predict the future. We’ve been working with a very good team at MIT now for more than 20 years on this area of modeling the climate, which, since obviously it’s an area of great interest to you, you know and have to know the competencies of the models are not particularly good.
Now you can plug in assumptions on many elements of the climate system that we cannot model — and you know what they all are. We cannot model aerosols; we cannot model clouds, which are big, big factors in how the CO2 concentrations in the atmosphere affect temperatures at surface level. The models we need — and we are putting a lot of money supporting people and continuing to work on these models, try and become more competent with the models. But our ability to predict, with any accuracy, what the future’s going to be is really pretty limited.
So our approach is we do look at the range of the outcomes and try and understand the consequences of that, and clearly there’s going to be an impact. So I’m not disputing that increasing CO2 emissions in the atmosphere is going to have an impact. It’ll have a warming impact. The — how large it is is what is very hard for anyone to predict. And depending on how large it is, then projects how dire the consequences are.
As we have looked at the most recent studies coming — and the IPCC reports, which we — I’ve seen the drafts; I can’t say too much because they’re not out yet. But when you predict things like sea level rise, you get numbers all over the map. If you take a — what I would call a reasonable scientific approach to that, we believe those consequences are manageable. They do require us to begin to exert — or spend more policy effort on adaptation. What do you want to do if we think the future has sea level rising four inches, six inches? Where are the impacted areas, and what do you want to do to adapt to that?
And as human beings as a — as a — as a species, that’s why we’re all still here. We have spent our entire existence adapting, OK? So we will adapt to this. Changes to weather patterns that move crop production areas around — we’ll adapt to that. It’s an engineering problem, and it has engineering solutions. And so I don’t — the fear factor that people want to throw out there to say we just have to stop this, I do not accept.
I do believe we have to — we have to be efficient and we have to manage it, but we also need to look at the other side of the engineering solution, which is how are we going to adapt to it. And there are solutions. It’s not a problem that we can’t solve.
Translation: Yes, global warming is real. Carbon emissions really do boost temperatures. But nobody knows by how much — that’s impossible to predict. So what the hell? Let’s just take that risk of apocalypse. The consequences will be manageable — for us here at ExxonMobil. As for the human race? It will just have to adapt! And you can count on us engineers to help you out with that. After all, by that time we’re going to need a new line of business.
Read the original here: ‘Stand back, I’m going to try science’: Inside the brain of – Grist
When we think of technology, we mostly relate it to the internet. The computer era has certainly transformed many areas of society. But for every action there is an equal and opposite reaction. In spite of all the positive influences technology has had on society, we cannot ignore the other side of the coin. The world has fallen captive to this lifestyle and quickly shunned the old ways. Of course, considering the ease and efficiency it brings to work, communication, and economic growth, the expertise it affords is highly beneficial to all. Nonetheless, we cannot ignore the lasting negative effects that come with computer literacy and internet exposure.
Consider, for instance, the most obvious result of technological adoption: the loss of jobs. In the haste to realize more output and greater returns, many companies have done away with a large share of their employees. Couple this with the sudden economic downturn and it is clear how much of an impact this has had on hundreds of thousands of people worldwide. Of course, there are others who have benefited greatly from technology, which clearly implies a disparity. Irrefutably, technology has brought about an imbalance in society whereby some profit while others undergo financial and emotional distress.
Crime is inevitable in almost every aspect of society, but with the introduction of technology comes a new level of criminal activity. The internet allows numerous transactions and communications to take place daily. These occur globally, which enables businesses to carry out operations faster and almost anywhere in the world. The one impediment, however, is online criminals. They have discovered ways to steal people’s identities and embezzle money by logging keystrokes to capture personal financial information. Additionally, they spread malware, hack into websites, and damage vital data. Worst of all are the sexual predators who prey on the innocent. Though many companies are working tirelessly on software to secure online activity, many people have unfortunately fallen victim to internet crime.
It is worth noting that the impact technology has on society is both beneficial and detrimental, though not in the same proportion. If humanity is to fully enjoy all aspects of this new era, some sort of balance has to be established. It cannot be something that profits a select few and takes away from others. We cannot afford to eradicate our natural way of life and let technology infiltrate every element. We need to be the ones utilizing technology and not the other way round.
Follow this link: The Effects of Technology on Society
WASHINGTON, June 19, 2012 /PRNewswire/ — The National Assessment of Educational Progress (NAEP) is leading the way by measuring how well students apply their understanding of science in real-life contexts. The Nation’s Report Card Science in Action: Hands-On and Interactive Computer Tasks from the 2009 Science Assessment marks the first time that both tasks were included as part of the NAEP science assessment.
Today’s results reveal that America’s fourth, eighth, and 12th graders can conduct science investigations using limited data sets, but many students lack the ability to explain results. The report shows that students were challenged by parts of investigations requiring more variables to manipulate, strategic decision-making in collecting data, and the explanation of why a certain result was the correct conclusion.
The new interactive computer tasks and updated hands-on tasks that involve more open-ended scenarios were administered as part of the 2009 science assessment by the National Center for Education Statistics to a nationally representative sample of more than 2,000 students in each of grades 4, 8 and 12. The findings provide important insights for educators and policymakers who are looking for academic approaches that support careers in science, technology, engineering, and mathematics (STEM) fields, and encourage scientific inquiry.
“Science is fundamental to education because it is through scientific inquiry that students understand how to solve problems and ultimately how to learn,” said David Driscoll, chairman of the National Assessment Governing Board, which sets policy for NAEP. “So it’s tragic that our students are only grasping the basics and not doing the higher-level analysis and providing written explanations needed to succeed in higher education and compete in a global economy.”
The purpose of using hands-on and interactive computer tasks in testing is to determine whether students can solve problems as a scientist would. Hands-on tasks require students to perform actual science experiments, while interactive computer tasks require them to solve scientific problems in a computer-based environment, often one simulating a natural or laboratory setting.
“This innovative format allows for a richer analysis than a paper-and-pencil test,” Driscoll said. “Interactive computer tasks allow us to more deeply examine students’ abilities to solve problems because the tasks generate much more data.”
Only 53 percent of 12th graders reported that they were enrolled in a science course, and only 28 percent reported writing a report on a science project at least once a week. Ninety-two percent of fourth graders and 98 percent of eighth graders had teachers who reported doing hands-on science activities with students at least monthly. Thirty-nine percent of fourth graders and 57 percent of eighth graders had teachers who reported having at least a moderate emphasis on developing scientific writing skills.
The assessment measures science skills in a number of ways. Some questions use a model known as “predict-observe-explain” to examine students’ ability to combine their science knowledge with real-world investigative skills.
To correctly predict, students had to provide an accurate description of what might happen in a situation. For instance, when asked what kind of sunlight conditions were needed for a sun-loving plant and a shade-tolerant plant, 59 percent of fourth graders showed understanding that different plants have different sunlight needs.
Through the observe phase, students watched what happened as they conducted their experiments. Eighty percent of fourth graders made straightforward observations and tested how fertilizer and sunlight affected plant growth, but only 35 percent could perform a higher-level task that required them to make decisions about the best fertilizer levels for a sun-loving plant.
Students were then asked to explain what they had observed by interpreting data or drawing conclusions. Across all grade levels, a majority of students could observe, but far fewer could predict or explain. In fourth grade, fewer than 50 percent of students could explain why they selected a given fertilizer amount to support plant growth and use evidence to support their answer. At grade 8, 88 percent of students could correctly identify which liquid flowed at the same rate as water at a given temperature, while only 54 percent could support this answer with a written explanation of the evidence.
At twelfth grade, 64 percent of students could recommend the site for a new town based on information provided about water quality, while 75 percent of students could perform a straightforward investigation to test the water samples and accurately tabulate data. But only 11 percent were able to provide a valid recommendation and support their conclusions with details from the data.
More highlights from Science in Action include:
Overall achievement gaps
- There are gaps in average scores for all tasks between students from low-income families (those eligible for free and reduced-price lunch) and those from higher-income families.
- There are gaps by race/ethnicity. At all grade levels, white and Asian/Pacific Islander students outscored their black and Hispanic peers.
- At grades 4 and 12, Hispanic students scored higher than their black peers on interactive computer tasks and hands-on tasks.
- Female students outscored males on the hands-on tasks, but males scored higher on the traditional paper-and-pencil assessment. There was no gender gap for interactive computer tasks.
- Seventy-one percent of students could correctly select how volume changes when ice melts into water, but only 15 percent could support this conclusion with evidence from the investigation.
- Overall, students earned about 42 percent of the total points available from the questions they attempted on the interactive computer tasks.
- Overall, students earned about 47 percent of the total points available from the questions they attempted on the hands-on tasks.
- Eighty-four percent of eighth graders could correctly test how much water flowed to different soil samples during a simulated laboratory test.
Now, please read the exciting conclusion of this article: The Nation’s Report Card Releases Results from Innovative Science Assessment
Posted: Jun 21st, 2012. Greener Nano 2012: Nanoinformatics tools and resources workshop. (Nanowerk News) The goals of the Nanoinformatics Tools and Resources workshop at Greener Nano 2012 are to establish a better understanding …
Read the original here: Greener Nano 2012: Nanoinformatics tools and resources workshop
Nano Patents and Innovations, Thursday, June 21, 2012
ETH-Zurich researchers have developed an economic, fast and reproducible method for printing tiny structures in a way similar to printing art by an ink-jet printer. Now they are planning a spin-off.
(SEM images: Patrick Galliker / ETH Zurich)
A line appears on the monitor and gets longer within seconds. It bends off at a right angle, changes direction several times and crosses itself on a couple of occasions until a tangle of lines emerges. Then the line grows more slowly, appears darker, stops and darkens further in a dot of a consistent size. Then it continues: a line, another dot, line, dot, line, dot.
What may sound a bit like Morse code is actually a demonstration of a new technique that ETH-Zurich researchers have developed at the Laboratory of Thermodynamics in Emerging Technologies. The method enables them to print the tiniest of structures on a micro- and nanoscale.
Using this printing method, ultrafine particles are transferred from a capillary onto a surface in a targeted fashion by way of an electrical field. Depending on how long material accumulates on the same spot, the structure grows taller, producing a nano-tower. If doctoral student Patrick Galliker, who was instrumental in developing the printer, allows the towers to get ever taller, they can clearly be seen toppling over on account of their proximity to the capillary. For the demonstration, Galliker uses controls similar to those found in computer games. If the researchers automate the nano-printer using special software, it can produce the little towers autonomously, uniformly and without any connecting lines whatsoever. They can also make towers that are slightly bent, or lean two towers against each other to form a sort of tiny arch, explains Galliker, using photos he took of the structures.
The printing takes place with nano-particles of a wide variety of materials that are placed in solvents. During printing, the nano-particles accumulate next to each other according to the laws of physics. The solvent evaporates and the nano-structures, which can be smaller than 100 nanometres, are ready.

Manipulating light with nano-structures
The ETH-Zurich researchers envisage a wide range of possible applications for their new method. It is just the ticket for applications in optics, they explain. After all, light interacts differently with nano-structures than with larger objects. Surfaces that have been modified with nano-structures “manipulate the light”, as Galliker puts it. These surfaces can absorb, concentrate and conduct light instead of reflecting it. Acting as mini-antennae, the minuscule structures thus soak up the light, which falls into a kind of trap before ideally being conducted to where it is needed.
This could be used to increase the efficiency of thin-film solar cells by capturing the light and channelling it directly towards the active layer, for instance. Until now, such solar cells did not use all the light as they reflected part of it and let another part escape unused. Camouflage suits with such surfaces are conceivable, explains Dimos Poulikakos, professor of thermodynamics and head of the research group.
Moreover, using such nanostructures, new kinds of faster, more selective and highly sensitive detectors and sensors might be feasible. The nanostructures could also be used in special light microscopes in which nanoparticles increase fluorescence, Poulikakos adds, enabling the tiniest of objects, such as individual molecules, to be observed. And, of course, the nano-printer could be employed wherever material needs to be applied on a nanoscale in a targeted fashion, such as in the production of modern microprocessors: imagine, a CPU printed on the spot!

Economic and reproducible method
Read the exciting conclusion and other articles on nano tech right here: Nano Patents and Innovations: New Printing Method For …
Institutions of higher learning still do not agree on what is meant by many technical degrees, such as software engineering and computer engineering. These fields, along with information technology, computer science, and computer engineering, are simply too new. Therefore, what one school or employer regards as a requirement for computer engineering may be viewed by another as software engineering. In the beginning, computers were hard-wired to perform a certain function; the user did little more than press a button. Allowing greater user control led to the development of programming languages and compilers to translate “normal” language into machine language. Computing began to come into its own with the development of the personal computer. Simple languages such as BASIC gave more control to the average user. This laid the foundations for the software engineer, who understands not only the program but the physical capabilities of the hardware. One way of examining the differences between software engineering and computer engineering is to consider how most devices were controlled by the first PCs.
Often, changing printer settings such as font size, number of copies, or paper size required entering the proper printer sequence in DOS. The user could enter the details in the application or in the printer dialog box. Some institutions require computer science majors to take Microsoft Office as their first programming course. By the same token, software engineers may graduate without a basic understanding of wireless technology. Not too many years ago, computer engineering positions were often filled by those with other degrees, such as electricians. Software engineers were usually those with programming skills, many of whom held no degree at all but were self-taught.
Computer science is generally the study of the principles of computation and how they apply to computers. An on-campus or online computer science degree can help you qualify for a career in the computing or technology industry. Online programs are generally more convenient than on-campus programs, since one can study whenever one wants, and online degrees are usually less expensive than regular programs. The degree requires four years of study, with coursework in computer science, mathematics, and related technology. Before earning this degree a student can opt for an associate degree or a certificate in the field. After completing a bachelor’s degree, students can go on to earn a Master of Science in computer science; some institutions allow students to earn this degree online as well.
In a master’s program, students gain advanced knowledge and focus on a particular area such as computer graphics, human-computer interaction, or artificial intelligence, with the specialization determined by career interest. Students can also opt for a PhD in computer science. Professionals can join public or private organizations, as demand for their services exists in both sectors. Examples of careers one can pursue after earning a degree in this field include systems analyst, network engineer, and database administrator. Moreover, job opportunities for computer professionals are expected to grow in the future due to the use of computers in ever more areas of life, and computer professionals are offered attractive salaries.
Go here to read the rest: A Comparison of Software Engineering and Computer Science
What does it take to create a stadium today? How much has stadium technology changed in the last ten years? What is being planned for the stadium of the future? Has the stadium experience outlived its usefulness in today’s modern world?
When we think of spectacle we think of many things. When Quantum POP uses spectacle in the context of this piece we mean the act of placing your physical body in a place where the venue itself is as much a part of the experience as the event taking place.
The modern outdoor stadium, and to a certain extent the indoor stadium, is born of a design most often traced to the Roman Colosseum. It offers an experience found in no other place, one in which you can be absorbed in the immensity of the event and take away a physical imprint on all of your senses. It is something everyone wants to do at least once in their life. Ideally, from the point of view of those in the business of providing this unique form of entertainment, no matter how many times you do it, it will be an event. That sensory satisfaction, available nowhere else, is what will get you to go and freeze your tukus at the stadium, pay $9 for a beer, and fight the traffic to do it, instead of staying at home in your comfy pajamas. For as little as $15 a seat, on up to the stars, you can have that spectacle experience in almost any major or minor urban area in the world.
What goes into the stadium design of today is much different from the designs of even 10 years ago and nothing like the designs of 20 years ago. New materials and considerations are involved, and comfort is becoming an important part of the experience. Roofs on your favorite ballpark are much more the norm, and the technology that makes them practical is also designed to increase the spectacle.
The designs of companies like Vector Foiltec and others make it possible to put state-of-the-art roofs on what were once strictly outdoor stadiums. Not only can such a sight be seen nowhere else, these roofs are also becoming practical in ways not possible before. It is now possible to construct spectacle roofs for sporting venues using a transparent polymer foil that admits rather than blocks the sun’s rays, to such an extent that natural grass can grow inside. Players love natural grass, as most of the sports played in stadiums today were invented on natural turf. Natural turf decreases injuries and does not affect the play of the ball or the performance of the players the way the old artificial turf used to do. This is a relatively new feature of roofed stadiums and is made possible by new materials and construction techniques.
But wait, do you think you can just go into your new metro stadium, plant some grass, and hope that everything will work out fine? Of course not. Today you need a grass turf system. This is especially important if you do not have a roof over your stadium.
Desso Sports Systems now offers a natural sporting and outdoor-concert turf concept that pays for itself by allowing stadium owner-operators to withstand torrential rains before a game, or even host an outdoor rock concert with tens of thousands of fans the day before a sporting event, and still have optimum field conditions for play the next day. Real grass and soil are involved, along with layers for drainage and growth that are truly state-of-the-art green.
In part 2 of our Stadia Expo series we will explore more of the companies and products available now and envisioned for the future of the spectacle event business.