Kermit the Kodok
Bali Home to Newly Discovered Species of Miniature Frog
Amir Hamidy, a researcher from the Indonesian Institute of Sciences (LIPI), has issued a paper declaring the discovery of a new species of frog in Bali.
The discovery, announced on Sunday, June 16, 2013, and reported by Seputarbali.com, identified the new species of frog as having a minuscule size of only 16-17 mm, approximately the size of a human fingernail. Amir and his fellow researchers have given the name Microhyla orientalis to the new species of amphibian.
A mitochondrial DNA analysis of the “mini” frogs shows that M. orientalis is a close relative of M. mantheyi, M. borneensis and M. malang. All three of these genetic “kin-folk” are members of the sub-group of the species M. borneensis.
Frogs from the sub-group of M. borneensis share the characteristic of reproducing in areas of still water. The species M. borneensis itself has the unique characteristic of breeding inside the tropical pitcher plant.
The sub-group of M. borneensis is endemic to Thailand, Sumatra and Kalimantan. M. orientalis is the most eastward-dwelling member of the group, and was fittingly given the name “orientalis” due to its known habitat on the island of Bali.
The June 14th edition of Zootaxa carries the analysis of Masafumi Matsui from Kyoto University, who identified M. orientalis as uniquely having a striped back, black stripes on its sides extending from the eyes down half its body, and a round snout.
So far this new species of frog has been found in the areas of Wongaya Gede and Batukaru in Bali, living in rice fields at elevations of between 435 and 815 meters above sea level. M. orientalis is most often encountered at Wongaya Gede in July, while it is most prevalent in Batukaru in the early weeks of August.
Endomondo and The Coca-Cola Company Join Forces to Encourage More People to Get Active and Have Fun Along the Way
Press Release Oct. 24, 2012 at 1:57 pm
Global Strategic Alliance Marks the Third Collaboration for The Coca-Cola Company
NEW YORK & ATLANTA–(BUSINESS WIRE)–Leaders from The Coca-Cola Company this morning took to the running trails of New York to celebrate the launch of a strategic alliance with Endomondo (www.endomondo.com), a social fitness community with more than 12 million users worldwide. The newly formed collaboration combines the global reach of Coca-Cola (through its POWERADE® brand) with Endomondo’s first-of-its-kind mobile app and social network to bring communities around the world together with the tools they need to reach their fitness goals.
Endomondo brings together cutting-edge technology, including location-based GPS, with social networking capabilities to motivate users toward more fun and greater fitness. Under the new agreement, Coca-Cola will help Endomondo grow its existing user base and enter into new markets and territories, while helping develop the app’s functionality and drive innovation.
“We were immediately attracted to the dynamic team at Endomondo and their vision and commitment to creating a world where everybody exercises,” said Emmanuel Seuge, Head of Worldwide Sports and Entertainment Marketing, The Coca-Cola Company. “We believe in the power of Endomondo’s vision, application and most importantly – its people.”
The Company’s first contribution can be seen through a new POWERADE® hydration feature, which educates users about how much they need to drink during and after each workout to maintain optimum hydration levels and enhance performance. Research has shown that dehydration resulting in as little as a 2 percent decrease in body mass can begin to reduce both physical and mental performance. By being properly hydrated, athletes can perform at higher levels. Endomondo’s new hydration feature supports effective functioning of the body, and promotes general health and well-being.
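As a back-of-the-envelope illustration of the 2 percent figure cited above, the sketch below compares an estimated sweat loss against that threshold. The sweat rate, body mass and function names are placeholder assumptions for illustration, not figures from this release.

```python
# Illustrative only: relate an assumed sweat-loss estimate to the 2% body-mass
# threshold mentioned in the release. All numeric inputs here are assumptions.
def fluid_deficit_threshold_liters(body_mass_kg, threshold=0.02):
    """Fluid loss (liters, roughly 1 kg per liter) at which performance may start to drop."""
    return body_mass_kg * threshold

def estimated_sweat_loss_liters(duration_hours, sweat_rate_l_per_hr=1.0):
    """Very rough sweat-loss estimate from workout duration and an assumed sweat rate."""
    return duration_hours * sweat_rate_l_per_hr

body_mass = 70.0   # kg, example athlete
run_hours = 1.5    # example workout length

threshold = fluid_deficit_threshold_liters(body_mass)   # 1.4 L for a 70 kg athlete
loss = estimated_sweat_loss_liters(run_hours)           # ~1.5 L at the assumed rate
print(f"Deficit threshold: {threshold:.1f} L, estimated loss: {loss:.1f} L")
```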
“Through the innovation and network of Endomondo, we have an incredible opportunity to continue to establish Powerade as a true sports brand,” explained Seuge. “The application will allow sports men and women from around the world to motivate and compete with each other while they strengthen and condition to become better all-round athletes.”
Endomondo joins music licensing start-up Music Dealers and digital music service Spotify in The Coca-Cola Company’s strategic alliance portfolio as the first global sports-related endeavour. Similar to The Company’s pacts with Music Dealers and Spotify, the collaboration is rooted in shared value, an evolving approach to partnerships, leveraging the scale and reach of The Coca-Cola Company to achieve bottom line efficiencies for both parties. Coca-Cola and Endomondo expect to finalize an agreement that will enable Coca-Cola to become an investor.
“We are incredibly excited to announce Endomondo’s first-ever global brand collaboration,” said Endomondo Co-founder Christian Birk. “Our relationship with The Coca-Cola Company will pull together the innovative resources from both partners to create new, unique experiences that help people around the world have fun while leading a more active lifestyle.”
To celebrate the collaboration, Endomondo is launching the Powerade Challenge. The challenge is all about working out – the more minutes you work out, the better your chances of being one of Endomondo’s randomly selected winners. Join the challenge by visiting http://www.endomondo.com/challenges.
About The Coca-Cola Company
The Coca-Cola Company (NYSE: KO) is the world’s largest beverage company, refreshing consumers with more than 500 sparkling and still brands. Led by Coca-Cola, the world’s most valuable brand, our Company’s portfolio features 15 billion-dollar brands including Diet Coke, Fanta, Sprite, Coca-Cola Zero, vitaminwater, Powerade, Minute Maid, Simply, Georgia and Del Valle. Globally, we are the No. 1 provider of sparkling beverages, ready-to-drink coffees, and juices and juice drinks. Through the world’s largest beverage distribution system, consumers in more than 200 countries enjoy our beverages at a rate of 1.8 billion servings a day. With an enduring commitment to building sustainable communities, our Company is focused on initiatives that reduce our environmental footprint, support active, healthy living, create a safe, inclusive work environment for our associates, and enhance the economic development of the communities where we operate. Together with our bottling partners, we rank among the world’s top 10 private employers with more than 700,000 system associates. For more information, visit www.thecoca-colacompany.com, follow us on Twitter at twitter.com/CocaColaCo or visit our blog at www.coca-colablog.com.
About Endomondo
Endomondo was founded in 2007 by fitness enthusiasts. Five years later, the app can be found in 200 countries, and gets up to 25,000 new users a day. Endomondo brings together cutting-edge technology, including location-based GPS, with social networking capabilities to motivate users toward more fun and greater fitness. It turns your smartphone into your ultimate social fitness tool.
Using GPS, the app tracks route, distance, duration, split times, calorie consumption, and more, while providing audio feedback on performance. The Endomondo app offers a full history with previous workouts and statistics, as well as a localized route map for each workout.
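As a rough illustration of the kind of computation behind these tracking features, here is a minimal sketch that derives distance, duration and average pace from timestamped GPS fixes using the haversine formula. The function names and sample coordinates are invented for the example; this is not Endomondo's actual code.

```python
# Illustrative sketch (not Endomondo's actual code): summarize a workout from GPS fixes.
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))  # mean Earth radius ~6371 km

def summarize_workout(fixes):
    """fixes: list of (timestamp_seconds, latitude, longitude) tuples."""
    distance_km = sum(
        haversine_km(fixes[i - 1][1], fixes[i - 1][2], fixes[i][1], fixes[i][2])
        for i in range(1, len(fixes))
    )
    duration_min = (fixes[-1][0] - fixes[0][0]) / 60.0
    pace_min_per_km = duration_min / distance_km if distance_km else float("inf")
    return distance_km, duration_min, pace_min_per_km

# Example: three fixes roughly 400 m apart, two minutes between each.
workout = [(0, 55.6761, 12.5683), (120, 55.6797, 12.5683), (240, 55.6833, 12.5683)]
print(summarize_workout(workout))  # ~0.8 km, 4 min, ~5 min/km
```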
Endomondo Sports Tracker incorporates community and social features that allow users to challenge friends and share results. Available on seven mobile platforms, Endomondo operates on almost all GPS phones.
Joining Endomondo’s Social Fitness Network is free and the app can be downloaded for free. A premium version of the app is available for $4.99.
Pollies earmark Western Sydney as state's data nerve
Almost half a billion dollars invested in data centre construction in Sydney's West could form the building blocks for a suburban technology industry that provides jobs to the local area's technology trainees.

The area has already hosted the recent launches of the $200 million "Aurora" HP data centre in Eastern Creek, Blacktown, and the $250 million Digital Realty data centre in nearby Erskine Park, Penrith, where US company Rackspace's Australian subsidiary is based. Both centres are estimated to create 450 construction jobs, and Digital Realty expects to spend more than $200 million on local labour and projected construction costs. Woolworths already has a data centre in Erskine Park.
Western Sydney is fast becoming the state's technology centre. Additionally, new data centres are also being built in Alexandria, near Sydney's CBD. And the NSW Government has contracted Metronode to build a Tier III data centre in Silverwater, also in western Sydney, and another $27 million centre in the Illawarra, south of Sydney, by the end of the year.

The myriad computing infrastructure of all state departments and agencies will be replaced with these two centres, which will collectively provide up to 9 megawatts of IT load. The exercise will create 250 jobs.

Fairfax Media's IT Pro asked NSW Premier Barry O'Farrell whether this investment could seed the growth of a Western Sydney technology industry.

"Absolutely," O'Farrell said at the recent launch of the Digital Realty centre in Penrith.

"I do know there are enormous enrolments at TAFE, university and other levels in the ICT sector here, and delivering a mixed economy in Western Sydney is part of our ambition to provide more local jobs."

He pointed to the Industry Action Plan produced by the NSW Digital Economy Industry Taskforce last September, which identified the demand among banks, financial institutions, and other professional service organisations for reliable and secure data centres and co-location centres. The blueprint attracted strong criticism from sections of the technology industry at the time.

"In addition to the direct investment and jobs created by a new data centre, availability of these facilities is often a determining factor in attracting investment by major companies to NSW, and is essential in supporting the business requirements of NSW industry," the taskforce wrote in the report.

Once completed, however, data centres employ very few people in the day-to-day management of the facilities. A spokesman for the NSW Deputy Premier and Minister for Trade and Investment, Andrew Stoner, said however that this "belies the longer term benefits for local economies and for the wider economy of data centres' capacity to help companies become more innovative, more competitive, delivering better services and achieving savings to allow for further investment."

Federal Labor MP Ed Husic said his Western Sydney electorate of Chifley was home to more than 300 ICT businesses. He said he wanted to establish trade training centres in high schools, and a co-operative research centre at the University of Western Sydney school of computing and engineering, to demonstrate the potential of technology to young people.

"I mean data centres in and of themselves aren't the be all and end all as to what can be done out here. What I'm really, really excited about is we do have a base of young people out here who are very interested, very tech savvy, and we need to provide them a platform," Husic said.

"We should work with big players, Google, eBay, PayPal and the like, to effectively skill up local business to take advantage of what's on offer, and be able to give a boost to the local economy, open up job opportunities, and through that process diversify the things we're able to do out here."

He believed vast swathes of open, affordable, and industrial-zoned land, particularly in Blacktown, lend themselves to further data centre development.

At the Western Sydney Region of Councils AGM on October 25 last year, Blacktown Mayor Len Robinson cited the Aurora centre as the start of a possible new trend. The region's population is projected to grow by a million over the next 25 years, and he said the Blacktown ICT sector generated $139 million in 2011, with almost a third from hardware, software and web-design consultancies. The opportunity is growing, Robinson said.

Husic's electorate includes Blacktown, the state's biggest municipality, and last year his discussions with the council's economic development arm canvassed ways to attract more data centres.

"I've been really encouraged by the new mayor, Len Robinson," Husic said. "Even though we're on opposite sides of the political fence, Len has already opened up and said: 'this is something I want to work on'."

"I'm hoping to meet with him down the track to discuss this further."
NanoTech's Nuvola NP-1 4K Streaming Media Player Demonstrated with Akamai Media & Delivery Solutions at NAB 2014

Affordable Solution Consistently and Reliably Delivers Four Times the Resolution of Full HD

March 21, 2014 -- NanoTech Entertainment (OTCPINK: NTEK), a leader in the deployment and delivery of 4K Ultra HD, announced today that its award-winning Nuvola NP-1™ consumer 4K streaming media player will be a player component at the Akamai NAB Booth SL4525 in Las Vegas April 7-10. Akamai Technologies, Inc., the leading provider of cloud services for delivering, optimizing and securing online content and business applications, will feature the Nuvola player in its 4K streaming demonstration. Using advanced technologies, Akamai and NanoTech will demonstrate how 4K content can be uploaded to Akamai's global content delivery network (CDN) and streamed to a consumer's Ultra HD TV using NanoTech's Nuvola NP-1 4K streaming media player. The result is a compelling solution for broadcasters and other content owners who want to provide their customers with the 4K End-user Quality of Experience (QoE) they increasingly expect.

Akamai Media & Delivery Solutions are built upon the Akamai Intelligent Platform™ of over 147,000 globally distributed servers that deliver up to 30 percent of all Web traffic. By spreading content that would otherwise be concentrated in a few locations, Akamai offers improved network efficiency, greater security and attack protection, as well as better service quality for end users by providing continuous uptime and enabling faster access and download speeds.

The Nuvola NP-1 is a multifunction consumer device that can be used to decode a variety of video formats and resolutions including 4K Ultra HD, HD, SD and 3D movies. It can also be used to play state-of-the-art 3D video games. Based on the Android™ Operating System, the Nuvola provides fast, easy website access as well as access to thousands of video games and applications from NVIDIA® Tegra® Zone and the Google Play™ Store.

"The Nuvola player's ability to stream 4K content complements the Akamai Intelligent Network's ability to deliver content at this ultra-high quality level," said Will Law, principal architect, media engineering, Akamai. "In providing an end-point for streaming delivery, it offers a more flexible alternative to store-and-forward download solutions, while the Android operating system simplifies player development and the small form factor facilitates deployment. We expect the Nuvola to accelerate residential 4K adoption."

Availability, Pricing

Nuvola NP-1, the world's first consumer streaming media player that supports 4K Ultra HD content, is currently in production and will be available for new customers this quarter for $299. For more information on the Nuvola 4K streaming media players, call 1.408.414.7355, email [email protected] or visit http://www.nuvola4k.com.

About NanoTech Entertainment

Headquartered in San Jose, CA, NanoTech Entertainment is a technology company that focuses on all aspects of the entertainment industry. With six technology business units, focusing on 3D, Gaming, Media & IPTV, Mobile Apps, and Manufacturing, the company has a unique business model. The company has a diverse portfolio of products and technology.
NanoTech Gaming Labs operates as a virtual manufacturer, developing its technology and games, and licensing them to third parties for manufacturing and distribution in order to keep its overhead extremely low and operations efficient in the new global manufacturing economy. NanoTech Media develops proprietary technology which it licenses to publishers for use in their products as well as creating and publishing unique content. NanoTech Media Technology includes the world's first 4K Ultra HD streaming solution. NanoTech Communications develops and sells proprietary apps and technology in the Mobile and Consumer space. Clear Memories is the global leader in 3D ice carving and manufacturing technology. 4K Studios creates digital 4K Ultra HD content using both licensed materials as well as original productions. NanoTech is redefining the role of developers and manufacturers in the global market. More information about NanoTech Entertainment and its products can be found on the web at www.NanoTechEnt.com.

# # #

"Safe Harbor" Statement: Under the Private Securities Litigation Reform Act of 1995: The statements in the press release that relate to the company's expectations with regard to the future impact on the company's results from new products in development are forward-looking statements, within the meaning of the Private Securities Litigation Reform Act of 1995. Since this information may contain statements that involve risks and uncertainties and are subject to change at any time, the company's actual results may differ materially from expected results.

The NanoTech Entertainment logo, UltraFlix and Nuvola NP-1 are trademarks of NanoTech Entertainment. All rights reserved. All other marks are the property of their respective owners.
Microsoft Silverlight shines through
Google’s App Engine, a new version of Microsoft Silverlight and the 3G iPhone catch Mike Altendorf’s attention – but one really made an impression
A hat-trick of technology stories has caught my attention in recent weeks. The first one was Google launching its App Engine, essentially a competitor to Amazon’s Elastic Compute Cloud (EC2) web-based environment and Microsoft’s tranche of Live services. It’s an environment where you’re again using external systems for processing and storage. Interesting, but CIOs will quite rightly ask what the benefit is for them.
Frankly, I’m struggling to see the attractions of these so-called cloud-based services for bigger businesses. At the minimum, a CIO needs to know he’s in control of the business and, given that most hosting companies today struggle to offer service levels as it is, who is to say that these new entrants will do any better at providing services.
OK, so they have huge internal service capacity, but it’s often the case that the services you give to outsiders are not as good as what you can manage internally. The internal plumbing might be strong, but what about when you’re providing a service on a multi-tenant basis and have to ensure not just performance and continuity but also data privacy? Salesforce.com has developed multi-tenancy very successfully, I’ll give you that, but all you’re doing there is loading your own data onto Salesforce’s application. It’s a cracking application, but very different from everybody running their own applications. And on Salesforce, there are very few people doing heavy customisation.
You can be suckered into thinking you can get rid of all this datacentre stuff but, if you’re a CIO, your job is to provide infrastructure for your organisation and manage that process, whether you’re outsourcing the day-to-day operations or not. I have yet to hear lots of proven outsourcing success stories in this area, and taking a vanilla version of a package such as Microsoft Exchange is a very different thing to what suppliers of cloud-based services are proposing. A CIO has to focus very carefully on business users and offer them a platform to innovate, to get projects from concept to cash. I’m not at all sure the cloud is the platform for that – not for now at least.
A second interesting technology, and one I’m keener on, is Microsoft Silverlight 2.0, which I saw demonstrated recently. I’m a big believer in the rich internet and the idea that the browser has become a big constraint. The browser hasn’t even kept up with the connectivity that businesses and consumers have coming into their offices and homes. Now is the time to show how front-end technologies will have an impact on business applications of tomorrow, whether they’re internet-based or not. It’s about ergonomics, usability and presentation – developing amazing runtimes on laptops and desktops, of course, but also with very small footprints for phones. The issue has always been the ability to write the presentation layer in a way that maintains proper graphics. To see excellent Silverlight capabilities such as the Deep Zoom feature, the site to go to is http://memorabilia.hardrock.com. There, you’ll see hundreds of images in a catalogue, with gigabytes and gigabytes and gigabytes of storage loading smoothly as you’re searching around based on terms you type in. You can go right in and see a string on one of Eric Clapton’s guitars, for example, and the speed is amazing. CIOs might ask what Silverlight will do for them today, but it’s more of a pointer to the way people will navigate in the future. Incidentally, a lot of people will be watching to see whether Silverlight crushes Adobe’s AIR and Flex tools. I think this is one area where Microsoft and Adobe will co-exist because Adobe’s Flash and associated tools are such an embedded part of the internet.
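For readers curious how this kind of smooth navigation over enormous image sets is typically achieved, here is a minimal sketch of the tiled image-pyramid idea: pick the resolution level that matches the current zoom, then fetch only the tiles covering the viewport. The tile size, level scheme and function names are assumptions for illustration, not Silverlight's actual Deep Zoom API.

```python
# Illustrative sketch of a tiled image pyramid, the general idea behind
# "deep zoom" style viewers. Tile size and level numbering are assumptions.
import math

TILE = 256  # pixels per tile edge (assumed)

def level_for_zoom(full_width, full_height, zoom):
    """Highest level holds the full-resolution image; lower levels are downscaled."""
    max_level = math.ceil(math.log2(max(full_width, full_height)))
    return max(0, min(max_level, max_level + round(math.log2(zoom))))

def tiles_for_viewport(level, max_level, viewport):
    """viewport = (x, y, w, h) in full-resolution pixel coordinates."""
    scale = 2 ** (level - max_level)          # shrink factor at this level
    x, y, w, h = viewport
    first_col, first_row = int(x * scale) // TILE, int(y * scale) // TILE
    last_col = int((x + w) * scale) // TILE
    last_row = int((y + h) * scale) // TILE
    return [(level, c, r)
            for r in range(first_row, last_row + 1)
            for c in range(first_col, last_col + 1)]

# Example: a 65,536-pixel-square image, viewed 16x zoomed out in a 1920x1080 window.
max_level = level_for_zoom(65536, 65536, 1.0)
print(tiles_for_viewport(max_level - 4, max_level, (10000, 12000, 1920, 1080)))
```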
The third thing that grabbed my attention was the kerfuffle about the imminent 3G iPhone. Having 3G is very important because we need faster wireless networks to do the navigation that I’ve been talking about with reference to Silverlight, together with auto-switching so you can find a faster network such as a Wi-Fi link where available.
3G will make a huge difference to the iPhone as the interface is already great, but the inhibitor to using it as more than a phone and music player is the ability to have decent surfing speed. The 3G iPhone should improve expansion into Asia, and also enable new video options, such as two-way video calls. Rumour has it that Apple may add a front camera and video chats to the iPhone’s feature set. The few minor current deficiencies of the iPhone, such as surfing and moving between applications, don’t take away from the fact that it’s still a great device. It’s a mystery to me why Nokia, Sony Ericsson and the rest can’t come up with something just as good.
Mobile E-Mailing Behavior Gets Riskier
By Reuters | Posted 10-21-2008
A California train engineer who was sending and receiving text messages was blamed last month for causing one of the worst railroad crashes in U.S. history that killed 25 people.
Despite such risks, many Americans send and receive text messages on mobile e-mail devices in dangerous situations, according to a survey released on Tuesday that showed 77 percent have used such a device while driving a moving car.
Forty-one percent said they have used a mobile e-mail device such as a BlackBerry while skiing, on horseback or riding a bicycle, said the survey commissioned by Neverfail, an Austin, Texas-based software company that provides protection for business data, operations and applications.
The engineer of a crowded commuter train was text-messaging from his cell phone seconds before his train skipped a red light and collided with a freight train near Los Angeles in September, killing 25 people, investigators found.
The Neverfail survey said the proportion of the corporate workforce using company-supplied mobile devices will grow to nearly 40 percent by 2010 from just under one-quarter now.
In the economic crisis workers may feel squeezed and under pressure to use their mobile devices even more, said Michael Osterman, president of Osterman Research of Black Diamond, Washington, which conducted the survey for Neverfail.
"People are going to have to do more than they are doing now," he said. "As people get laid off, the responsibilities of the company don't go away, but the people to do the work do."
Also, 11 percent of respondents said they have used such a device during a romantic moment, and 79 percent said they have used one in the bathroom, it said.
Eighteen percent have used one during a wedding, 16 percent during a funeral or memorial service and 37 percent during a graduation, it said.
The online survey was conducted August 4 through August 26, 2008, of 148 U.S. adults. The margin of error varied for each question but averaged plus or minus 5 percentage points.
Along the same topic, earlier this year the American College of Emergency Physicians warned people not to text message while walking, skating, riding a bicycle or driving. It said its members were noticing a rise in injuries and deaths related to sending text messages at inappropriate times.
A survey last year by the AAA travel and motorist group found nearly half of U.S. teen-agers sent text messages while driving.
In New York, a state legislator has proposed a bill to combat so-called "iPod oblivion" and fine pedestrians for crossing city streets while wearing portable media players.
Obama Energy Policies Running Out of Gas

By Roger Aronoff

May 16, 2011

One of the more important issues raised during the budget battle that nearly shut down the Federal government in April was over power given to the Environmental Protection Agency (EPA) by President Barack Obama to regulate greenhouse gases that they claim can contribute to global warming. This has led to renewed discussion on the validity of concerns about global warming, and the related issue of America’s future energy sources. We have addressed the issue of global warming many times over the years at Accuracy in Media (AIM).

In the mid-1970s, the big concern among so-called environmentalists was that we were heading toward a new Ice Age. The essence of that point of view was carried in a Newsweek article in its April 28, 1975 edition headlined “The Cooling World.” Here was the money quote: “The central fact is that after three quarters of a century of extraordinarily mild conditions, the earth’s climate seems to be cooling down. Meteorologists disagree about the cause and extent of the cooling trend, as well as over its specific impact on local weather conditions. But they are almost unanimous in the view that the trend will reduce agricultural productivity for the rest of the century. If the climatic change is as profound as some of the pessimists fear, the resulting famines could be catastrophic.” It wasn’t too long, 1988 to be specific, before that “almost unanimous” view shifted, and the problem had become catastrophic global warming.

Larry Bell is a space architect and professor at the University of Houston, and author of the new book Climate of Corruption: Politics and Power Behind the Global Warming Hoax. Bell has worked with NASA on all aspects of mission planning for lunar programs, Mars programs, and orbital programs, including the International Space Station. He says that “politics is responsible for the global warming hoax, and, in reality, of course the climate warms and cools all the time—Climate changes all the time.”

In an interview earlier this year with AIM, Bell said that “Change is what climate does. It’s measured, typically, in three-decade periods, although it didn’t take three decades from the time of the ’70s, when The New York Times and other organizations were reporting the next Ice Age coming, until Al Gore had his famous hearings in 1988, which declared not only that global warming was a crisis, but that we caused it.”

Bell argues that the ways the temperature is measured are hardly reliable, but that even if the earth is warming, that might not be so bad. “Do [I] believe in global warming? I say, ‘Yeah, sure I do. I think it’s great! I think it makes plants grow, and it’s good for the rainforest—lots of carbon dioxide they can breathe! The Earth isn’t frozen! We can grow plants! Trade flourishes! Pyramids get built!’ Sure, I believe in global warming.”

When asked if he accepts that there is a consensus among scientists that global warming exists and is caused by humans, he said that “everything affects everything, so to say that human activity doesn’t affect climate would be nonsensical. The question is, which activities, and how much? Can you even measure them? Can you separate them from other factors? I don’t think anybody can—I would maintain that nobody can.”

The media were complicit in pushing the global warming hoax, calling skeptics “deniers,” as in “Holocaust deniers.” Newsweek used some form of the term “denier” 20 times in one 2007 cover story on global warming about those who don’t buy into the theory. They argued that people who doubted the Al Gore apocalyptic view of a coming age of massive flooding, unbearable heat, the extinction of polar bears and the melting of ice caps and glaciers, all as a result of mankind’s overuse of carbon-based energy and the carbon dioxide it generates, were somehow the moral equivalents of people who believe that the Nazi genocide of millions of Jews in Europe was exaggerated or did not even occur.

There was much more. The idea of a consensus of scientists was long since shattered. Thousands of advanced-degree scientists publicly refuted both the science and the fear-mongering behind global warming, which has in recent years come to be known instead as climate change. It’s an easier concept to sell, and it doesn’t matter if the earth’s temperature is rising or cooling, it is still climate change, and who can disagree with that? Marc Morano and the Committee for a Constructive Tomorrow (cfact.org) have put together ClimateDepot.com, a repository for everything related to the global warming movement, including documentation of those who were one-time believers and had become skeptics, non-believers, and yes, in many cases, deniers.

Patrick Moore, one of the founders of the environmental group Greenpeace, said earlier this year, as quoted in the Glenn Beck blog, The Blaze, that global warming is a “natural phenomenon,” that there’s no proof of man-made global warming, and that “alarmism” is leading to bad environmental policies. He told Stuart Varney on The Fox Business Network that “We do not have any scientific proof that we are the cause of the global warming that has occurred in the last 200 years...The alarmism is driving us through scare tactics to adopt energy policies that are going to create a huge amount of energy poverty among the poor people. It’s not good for people and it’s not good for the environment...In a warmer world we can produce more food.”

When asked who is promoting man-made climate fears and what their motives are, he said that it is “a powerful convergent of interests. Scientists seeking grant money, media seeking headlines, universities seeking huge grants from major institutions, foundations, environmental groups, politicians wanting to make it look like they are saving future generations. And all of these people have converged on this issue.” He said: “There are many thousands of scientists who reject man-made global warming fears...It’s all based on computer models and predictions. We do not actually have a crystal ball, it is a mythical object.”

The Obama energy policy has been upended by a series of events and missteps: The ClimateGate scandal exposed the dishonesty and manipulation of data by key scientists who are among the leading proponents of Anthropogenic Global Warming (AGW); the BP Deepwater Horizon oil spill halted, or severely slowed down, offshore drilling; the Japanese earthquake and tsunami have caused us to re-visit the expansion of nuclear power; and the House changed hands, after the Pelosi-led 111th Congress passed Cap & Trade, a costly energy tax that died in the Senate.

As The New York Times put it in a March 31st special section on Energy, Obama’s energy plan was a “complex structure [that] depended on an expansion of offshore oil drilling and nuclear power generation, creation of a trillion-dollar market in carbon pollution credits, billions of dollars of new government spending on breakthrough technologies and a tolerance for higher energy prices by consumers and businesses, all in the service of a healthier atmosphere and a more stable climate in future decades.”

The Times noted that “one after another the pillars of the plan came crashing down. The financial crisis undercut public faith in markets. The Deepwater Horizon explosion and spill set back plans for offshore drilling by several years. The Japanese earthquake and tsunami, which led to a major release of radioactivity at the Fukushima Daiichi reactor complex, raised fears about nuclear power.”

Failing to get his Cap & Trade legislation through the Senate, Obama turned to the EPA to implement the policy through the back door. He had shown his hand early in his administration when he chose as his “Green Energy Czar” Van Jones, a self-described communist, who hinted at his plans for America: “the green economy will start off as a small subset. And we are going to push it and push it and push it... until it becomes the engine for transforming the whole society.” Only when it was revealed that he had also signed a petition indicating his support for the so-called “9/11 Truth movement” was he booted out of the administration.

The other issue is drilling for oil and gas. One of Obama’s stated goals, as has been every president’s, is ending our dependency on Middle East oil. But at the same time, he has severely restricted new drilling in this country, using the BP oil spill in April 2010 as the justification. At the same time that the Obama team started up the 2012 re-election campaign in April, they claimed to be offering up new licenses for the rights to drill for oil by certain companies. But what those companies were really getting, for the most part, was the right to apply for licenses, and in some cases to resume drilling at old projects. In March, Obama said that “Oil production from federal waters in the Gulf of Mexico reached an all-time high.” But the Energy Department’s Energy Information Administration (EIA) reported that production in the Gulf is in decline, forecasting a drop of 250,000 barrels a day in Gulf production. There was also confusion and outrage expressed when President Obama, during his trip to Brazil in March, announced that he wanted the U.S. to assist the Brazilian government “with technology and support” to help develop its oil reserves, and that “we want to be one of your best customers.” This came at a time when we are limiting our own drilling and pledging to reduce our dependence on foreign oil.

In the meantime, natural gas could become a big part of the solution. A recent report by the EIA says that “The development of shale gas has become a ‘game changer’ for the U.S. natural gas market.” It says that the U.S. has “technically recoverable” shale gas resources estimated at 862 trillion cubic feet. Already, many trucks and buses in this country operate on natural gas, but the infrastructure to use it in cars is not there. A shift to natural gas could end our dependence on Middle East oil, which would stop our funding of terrorists around the world. Plus, it burns clean, thus having the added advantage of comforting the global warming alarmists. In addition, there is an estimated 800 billion barrels of oil locked up in shale in Wyoming, Utah and Colorado. These shale reserves are triple the proven reserves of Saudi Arabia, showing Obama’s press-conference claim that the U.S. has only 2% of the world’s oil to be blatantly false.

Obama’s ideologically driven energy policy is in tatters, and the media can’t seem to help this time. It is time that he pursues a policy that will truly get America off of Middle Eastern oil, bolster the economy, and right the American ship of state.

Copyright ©2011 Roger Aronoff
Roger Aronoff is the Editor of Accuracy in Media, and a member of the Citizens’ Commission on Benghazi. He can be contacted at [email protected].
All the Comforts of Home On the Top of the World
Rest and Research
By Peter N. Spotts, Staff writer of The Christian Science Monitor
SHEBA ICE STATION, ARCTIC OCEAN — To hear some veterans of past Arctic expeditions tell it, creature comforts on the ice have never been this pleasant.

Gone are cots or bunks shoehorned inside cramped experiment tents, warmed by oil-burning heaters the size of a kitchen trash compactor.

Gone are worries that you'll wake up one morning to find that the ice has opened up, separating your sleeping quarters from the rest of the camp.
Instead, researchers and their support team enjoy warm cabins on a Canadian icebreaker, hot showers on demand, and a growing camaraderie with the French-Canadian crew. Not only that, the ship's cooks can whip up a fettuccini with seafood sauce that can make the chilling memories of a long day of ice floes and subzero temperatures melt away.

"This is the most comfortable Arctic expedition I've ever been on," says Bill Bosworth, a physical science technician with the US Army Corps of Engineers Cold Regions Research and Engineering Laboratory (CRREL) in Hanover, N.H., who is here to help set up ice-measuring equipment for a science team from the lab.

The scientists are here for the Surface Heat Budget of the Arctic Ocean (SHEBA) project - a 13-month effort to study the transfer of heat between the ocean, ice, and atmosphere in order to improve climate-forecasting models. And during its planning stages, researchers debated whether to use an icebreaker as a base or set up a more-traditional camp in a cluster of temporary huts.

"We wanted an experiment that measured an annual cycle," says Richard Moritz, SHEBA's director. "The only way to do that is to start in the fall when ice is building and follow through from there."

That means working from a "platform" that can go the distance. In the end, the icebreaker won out because it had the capability to support 50 scientists and their equipment, and it could provide added security should the ice around the camp break up.

A polar hotel

Thus, the ship Des Groseilliers has become the 322-foot-long, Swiss Army knife of Arctic hostelries. The ship supplies electricity for experiments in the huts on the ice, serves as a platform for some of the atmospheric remote-sensing gear, provides communications links with the mainland, and even acts as a Paris of the Arctic - a place for researchers to pick up a smattering of French phrases in their spare time.

SHEBA's takeover of the ship is extensive. The hobby room houses a biology lab, and a section of the ship's propulsion area has become the headquarters for a team launching weather balloons. The officers' ward room and dining area have been turned into the project's data processing center and study, while the officers' pantry has become another lab for biology researchers. A plastic rat, tucked into one of the biologists' equipment boxes as a practical joke, sits atop a counter.

"Actually, I like this arrangement," says the Des Groseilliers's chief engineer, Robert Grimard, of SHEBA's takeover of the officers' areas. Instead of having their food come up from the galley in a dumb waiter, the officers now stand in a cafeteria line like everyone else. "Now we actually get to see the choices of food we're going to eat before we make them," he grins.

Staying in touch

Although "snail mail" service awaits the arrival of the Twin Otter aircraft from Deadhorse, Alaska, the scientists and crew have other means of getting news and reports from the folks at home. A satellite provides phone and fax links, and the stereo in the crew's lounge readily picks up the Barrow radio station, which carries news.

And, when it works, there's e-mail via satellite, which will be the researchers' prime means of communicating with the hardy few who will rotate in and out of SHEBA camp during the winter.

At the moment, little attention is focused on entertainment; people are working hard to see that their equipment is installed and working properly and instructions are written for those who will tend the gear after most researchers leave around Nov. 1.

For those who stay behind, the ship has a small library of books in English and French. The data processing room also has movies on video, as well as video games. Even now, a fierce competition is building among some members of the contingent from CRREL to see who can capture the Tetris title.
AMD to Invest $5.8B USD into Dresden Fabs
Tuan Nguyen (Blog) - April 5, 2006 6:07 PM
AMD prepares for a major ramp-up this year leading all the way through to 2008
AMD's position as a force to be reckoned with is no longer under question. The company has invested significantly over the last several years and made very good bets on its strategy that are now bearing fruit.

We previously reported earlier in the year that AMD was putting in a rough sum of $500M USD to upgrade Fab 36 in Germany. The money was laid out to upgrade the facility to produce larger wafers with smaller lithography processes. Since then, Fab 36 has been operating at a good pace and is now producing a good amount of revenue.

Despite the initial investment, competition often sparks innovation, and innovation doesn't stand still. According to German reports (English), AMD is in the process of rounding up another $5.8B USD, which is to be spent gradually up to 2008. The company is using the money to further invest into its Dresden location. If this plan goes through successfully, AMD will have made one of the largest expansions in its history -- or, for that matter, in the history of any semiconductor company.

The money is initially slated for Fab 36 and Fab 30 in Dresden, but the German report indicates much of the earmarked money will also go towards a third AMD foundry in the Dresden area, with work beginning this summer. Some of the funds will also be used for employee salaries at the new and upgraded facilities.

The year so far has been exciting, with many announcements from both sides of the camp. Thanks to first impressions of Intel's upcoming Conroe processor, many analysts are now looking at AMD with a great deal of anticipation to see what the company will come out with next, specifically the K8L architecture. AMD is currently on track in terms of timely product releases and expects to deliver quad-core desktop solutions in early 2007. The company recently broke its own 3GHz barrier with its Opteron family.
Two plants in Fukushima Prefecture, northeast of Tokyo, each with multiple reactors, are on the verge of meltdown after emergency backup cooling was shut down by loss of power due to flooding. (Source: CNN)
An explosion damaged the roof of one plant, releasing radiation on Saturday. (Source: Reuters)
The plants lie in Fukushima Prefecture, northeast of Tokyo. People are being evacuated from within a 20 km radius. (Source: CNN)
Japanese nuclear disaster is cause for pause, reflection
The Sendai Earthquake struck Japan early Friday morning with unrelenting fury. Measuring 8.9 to 9.1 Mw, the megathrust quake was among the five most severe in recorded history and the worst quake to hit Japan. In the aftermath of this severe disaster, as the nation searches for survivors and contemplates rebuilding, an intriguing and alarming storyline has emerged -- the crisis at the Fukushima Daiichi nuclear plant.

You may recall that a few years back Japan was struck by another quake which cracked the concrete foundation of a nuclear plant, but yielded virtually no damage. By contrast, this time the damage was far worse, creating what could legitimately be called a nuclear disaster.

I. Fukushima Daiichi - a Veteran Installation

Fukushima Prefecture, northeast of Tokyo, is home to two major nuclear power installations. To the south there is the Fukushima "Daini" II plant, which features four reactors -- the first of which went online in 1982. These units produce a maximum of 4.4 GW of power and are operated by the Tokyo Electric Power Company.

To the north lies the Fukushima "Daiichi" I plant, a larger and older installation featuring six reactors, the first of which went online in 1970. Operated by Tepco, the installation offers a combined 4.7 GW of power. It was here that disaster struck.

II. Disaster at Fukushima Daiichi 1

While the Daiichi installation is over four decades old, Japan has been diligent in retrofitting the plant with modern safeguards. Among those is an automatic switch which shuts off the reactor when an earthquake strikes.

The switch performed perfectly when the quake hit Friday morning, shutting off the three reactors that were active at the time. Control rods lowered and the reaction stopped.

The next step was cooling the fuel rods, which contain uranium-235, to prevent them from melting. Cooling water was pumped over the rods for about an hour, but before the rods could be fully cooled, stopping the reaction, the pumps failed. According to the International Atomic Energy Agency, a multinational oversight group, the failure occurred because the backup generators were knocked out by the tsunami flooding.

On Saturday Japanese authorities and power officials tried to use sea-water injections to complete the cooling process, but those plans were stalled when another tsunami warning arrived.

An explosion occurred inside at least one of the reactor buildings. It is believed to be due to the build-up of pressure after the pumps failed, creating hydrogen and oxygen gases, which subsequently combusted from the heat.

Malcolm Grimston, Associate Fellow for Energy, Environment and Development at London's Chatham House, told CNN:

"Because they lost power to the water cooling system, they needed to vent the pressure that's building up inside. My suspicion is that as the temperature inside the reactor was rising, some of the metal cans that surround the fuel may have burst and at high temperature, that fuel cladding can react with water to produce zirconium oxide and hydrogen. That hydrogen then will be part of the gases that need to be vented. That hydrogen then mixes with the surrounding air. Hydrogen and oxygen can then recombine explosively. So it seems while the explosion wasn't directly connected with the nuclear processes, it was indirectly connected, because the hydrogen was only present because of what was going on in the reactor core."

The explosion damaged the roof of the plant and sent billows of smoke up into the air. According to officials, some radioactive material was released into the atmosphere.
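For reference, the cladding and hydrogen chemistry Grimston describes can be summarized in two standard reactions (the stoichiometry below is textbook chemistry added for clarity, not taken from the article):

```latex
% Steam oxidation of the zirconium fuel cladding, releasing hydrogen:
\mathrm{Zr} + 2\,\mathrm{H_2O} \rightarrow \mathrm{ZrO_2} + 2\,\mathrm{H_2}
% Hydrogen later recombining explosively with oxygen in the surrounding air:
2\,\mathrm{H_2} + \mathrm{O_2} \rightarrow 2\,\mathrm{H_2O}
```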
Outside the plant perimeter, levels of radiation measured 8 times higher than normal. Meanwhile, reactors at the newer Fukushima II are also beginning to heat up after their own cooling systems failed.

Japanese officials have evacuated people from an expanding radius around the plants as a precaution. Currently the evacuation zone extends to almost 20 km. They hope to try to continue cooling, but have to work around tsunami alarms from earthquake aftershocks that have continued into Saturday. U.S. Secretary of State Hillary Clinton has announced that the U.S. is sending high-tech coolant to the plants, in an attempt to avert disaster.

III. Can a Meltdown be Avoided?

Without proper cooling, the rods will continue to heat and proceed towards meltdown, releasing clouds of radioactive gas. The first question is thus whether meltdown can be avoided.

At the Fukushima I plant, radioactive cesium was discovered. Cesium lies at the end of the beta decay chain tellurium -> iodine -> xenon -> cesium. It appears roughly 16 hours after an unchecked uranium reaction, and its presence indicates that one of the fuel rods may already have melted down. Once one rod melts, it will be much more difficult to prevent the others from melting down as well.

According to reports, the coolant temperatures inside the reactor have exceeded 100 degrees Celsius. If they reach 540 degrees Celsius, the fuel rods will fully melt down.

The question now becomes what to do. According to reports by Nippon Hoso Kyokai (Japan Broadcasting Corporation), three individuals have already become victims of radiation poisoning (likely plant workers), and radioactivity levels in the plant control room have risen to 1,000 times the normal level.

One option on the table is to vent the reactors, allowing them to blow off the steam and prevent a greater buildup of pressure and heat. However, doing so could release significant levels of radioactivity into the surrounding area.

The alternative is to try last-ditch cooling and hope that, if the rods do melt, the secondary containment will hold. The release of radioactive gases from venting would pale in comparison to what would be released if the secondary containment were breached. Such a scenario would likely result in the modern-day equivalent of Chernobyl.

Whichever course of action is selected, there's a great deal of risk of radiation exposure to those who inhabit the area in the near future. "There's a possibility of cancer in the long term -- that's the main hazard here," said James Acton, an international physicist, in an interview with CNN.
The disaster does illustrate that nuclear fission power is far from failsafe, particularly older reactors -- even if retrofitted with modern controls. Ultimately the international community needs to work towards fusion power, which should be much safer and cheaper.At the same time, it's important to consider that there's a great deal of background radiation released from the burning of fossilized coal and that mining fossil fuels has led to many a great loss of life and resources as illustrated by recent coal and oil disasters.And nuclear power is far less expensive than solar or wind power in base costs, and generally less expensive even after all the red tape that increases plant creation costs by an order of magnitude in the U.S.There's no easy answers here. Oil and coal power emit dangerous nitrogen and sulfur-containing gases and carbon dioxide into the ozone. And their fuel is dangerous to obtain. But they're cheap. Solar and wind power are relatively safe, but they're expensive and offer inconsistent power. Nuclear power is cheap and produces no emissions normally, but it can be a danger in the case of natural disaster or malicious attack.It's important not to turn a blind eye to this disaster, but it's equally important not to overreact.
RE: A bad placement decision on Japan's part....
vol7ron
No, I think there is normally a small circumference of higher radiation surrounding the plants, due to the extreme heat, though these levels are still considered non-fatal, much like the contamination from sitting at your desk (or living in a brick house).All-in-all, this is a very sad situation. I'm somewhat surprised as china and japan have done very well with earthquake-proofing their structures. Parent
Jansen Ng (Blog) - November 22, 2011 12:43 PM
Mass production within 3-4 years
Electronic parts supplier ROHM has announced that it has achieved a wireless data transmission speed of 1.5 gigabits per second in experiments conducted with Osaka University. The technology involved the use of terahertz frequencies at 300GHz, but current plans could result in speeds of up to 30Gbps.
Chances are you haven't heard of ROHM unless you are in the semiconductor industry. The electronic parts supplier based in Kyoto, Japan is one of the top 25 semiconductor firms in the world by sales.
ROHM's breakthrough involves the use of a new micro-antenna that integrates the oscillation device and the detection element onto the semiconductor baseplate. The company plans for mass production of the technology within the next 3-4 years. 30Gbps chips would require production using advanced lithographies, but would result in chip costs of less than $5.
The terahertz range goes from 100GHz to 10THz, and has wave-like and particle-like characteristics. Terahertz radiation is similar to microwave radiation and can penetrate most non-conductive materials, but is stopped by water and metallic substances. Its range is pretty short and requires line of sight, but development for medical and security scanners is already underway.
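For a sense of scale, the wavelengths corresponding to these frequencies follow directly from wavelength = speed of light / frequency (a standard calculation, not a figure from ROHM); a minimal sketch:

```python
# Back-of-the-envelope wavelength calculation for the frequencies quoted above.
C = 299_792_458.0  # speed of light in m/s

def wavelength_mm(freq_hz):
    """Free-space wavelength in millimeters."""
    return C / freq_hz * 1000.0

for label, freq in [("100 GHz", 100e9), ("300 GHz (ROHM demo)", 300e9), ("10 THz", 10e12)]:
    print(f"{label}: {wavelength_mm(freq):.3f} mm")  # ~3 mm down to ~0.03 mm
```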
The most alluring use is in high speed radio transmission, possibly leapfrogging wired connections. Computer networks using terahertz technology could be a reality within this decade, but they will have to compete with the WiGig Alliance and their own 7GHz standard.
Facebook hosts meandering Q&A with George W. Bush
In what was easily one of the more convoluted Facebook events, former President George W. Bush visited the social network’s headquarters for a Q&A session and to promote his book, Decision Points. Right off the bat, Facebook CEO Mark Zuckerberg attempted to lend a little clarity to the situation, asking Bush to explain why he chose to interview at Facebook. “I’m shamelessly marketing,” Bush said, and referenced Facebook’s incredibly dominant hold on the digital world.
Aside from what seemed like endless chuckling and good-natured ribbing between Bush and Zuckerberg, the Q&A focused primarily on the book, which Bush touted as an anecdotal narrative around the notable decisions he made during his presidency. He candidly spoke about the controversies he faced and adamantly defended his decisions, while also shedding light on some personal stories.
He also pointed out more than once that Zuckerberg doesn't have a college degree, and likened the adversity the CEO faces to that which surrounded his own administration.
Bush was asked about the latest WikiLeaks scandal, and said he believed those behind the leak should be prosecuted. He also mentioned he believes the fallout will make it difficult for the US to keep the trust of foreign leaders. Bush compared the situation to one he faced with The New York Times post-9/11, when the publication was going to run information on the government tapping phone calls.
In a somewhat more relevant aside, Bush talked about his tech and gadget experience, citing his administration as “the first to use the e-mail” more heavily than previous presidencies, but that he shied away from the technology. Since he was legally required to archive emails, he didn’t want his words being interpreted incorrectly in the future. He reported that after leaving office, he was a BlackBerry user and since then has been won over by Apple and is a fan of the iPad. He also affirms he utilizes “the Facebook” and no longer uses an iPod.
All in all, the interview was an opportunity for Zuckerberg and Bush to praise each other and for the former president to push his new tome. If anything, the event was a little confusing as the moderator tried to tie Bush’s visit into social networking and digital media, but failed to establish any concrete links. Maybe Facebook is simply looking to expand, and the site’s live events are going to reach out to genres besides its own. | 科技 |
Microsoft applies for patent on 3D electromagnetic stylus
Anna Washenko
Microsoft has applied for a patent on a technology related to a 3D electromagnetic stylus. According to Tom's Hardware, a stylus of this nature would work by recognizing specific configurations of electromagnetic fields in a display device. The fields would be created by a set of transmitting coils underneath the screen. The stylus' presence in the electromagnetic field would be translated and sent to the display, which would be able to analyze the data and know the exact positioning of the pen. The principles at work are similar to those found in a 3D mouse, but Microsoft's paperwork for the U.S. Patent and Trademark Office did not include any examples of specific uses for the 3D data input stylus. The most obvious use would be collaboration between business teams in different locations. In that situation, the stylus would allow a remote employee to remain active in working and brainstorming with teammates. There isn't as clear a need for casual users, but it's entirely possible that Microsoft will be cooking up something amazing in the next few years with this technology.
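The article above doesn't spell out the positioning math, so the snippet below is only a minimal sketch of the general idea: each transmitting coil contributes a field, the pen reports a per-coil signal strength, and the controller infers a position from those readings. The 5x5 coil grid, the inverse-square falloff model, and the weighted-centroid estimate are all assumptions made for illustration, not details from Microsoft's filing.

```python
# Toy model: infer a stylus position from per-coil signal strengths.
# Coil layout, falloff model, and estimator are illustrative assumptions only.
import numpy as np

coils = np.array([(x, y) for x in range(5) for y in range(5)], dtype=float)  # 5x5 grid, cm

def received_strength(pen, coil):
    """Toy field-strength model: stronger when the pen tip is closer to a coil."""
    dx, dy = pen[0] - coil[0], pen[1] - coil[1]
    d2 = dx * dx + dy * dy + pen[2] ** 2   # squared distance, including hover height
    return 1.0 / (d2 + 1e-6)

def estimate_xy(strengths):
    """Coarse estimate: centroid of coil positions weighted by signal strength."""
    w = strengths / strengths.sum()
    return (w[:, None] * coils).sum(axis=0)

pen = (2.3, 1.7, 0.5)                                  # true x, y (cm) and hover height
readings = np.array([received_strength(pen, c) for c in coils])
print("estimated x, y:", np.round(estimate_xy(readings), 2), "  true x, y:", pen[:2])
```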
Different takes on 3D tech have been receiving extra attention among the forward-thinking. Use of 3D in movies has finally become less of a craze, but 3D printing is a field continuing to make large and fascinating strides. And of course, this hypothetical stylus isn’t Microsoft’s first 3D rodeo either. The Kinect, which has the same principle of using 3D data input, was one of the earliest and most successful motion controllers. In the Kinect’s case, the sensors are picking up a human body instead of a stylus. So even though there aren’t any definite applications of a 3D stylus yet, you can bet that the creative people who can turn the Kinect into a Holodeck or a modern art piece will find fun things to do with it.
Image via hfng/Shutterstock | 科技 |
Xbox 360 has sold over 76 million consoles, but Kinect is slowing down
2013 may well be the Xbox 360’s final year, and the end of a remarkable run for Microsoft in one of the most competitive fields of the gaming industry. An American game console hasn’t been a major global success since Nolan Bushnell put the Atari 2600 out into the world at the tail end of the 1970s. The Xbox 360, meanwhile, is approaching its eighth year on the market, a lifespan that is literally twice as long as its predecessor. Based on newly revealed data from Microsoft’s Yusuf Mehdi, the Xbox 360 won’t sit atop the sales record books in the gaming industry, but it has achieved some remarkable feats.
Speaking at the All Things D: Dive Into Media event this week, Mehdi disclosed that global Xbox 360 sales now total 76 million consoles. In September 2012, Microsoft revealed in its financial statements that the Xbox 360 had just broken 70 million console sales, meaning that the company moved around 6 million Xbox 360s over the holidays. An impressive feat for an eight-year-old machine, and twice the Nintendo Wii U’s sales over the same period. The console has maintained its lead in the United States month over month for two straight years as best-selling game console.
Microsoft’s been able to maintain that pace thanks in part to the motion control device Kinect. Microsoft’s now sold 24 million Kinect sensors worldwide. What’s particularly interesting about that figure is that, despite aggressively bundling the two together, the Kinect is not selling in tandem with the Xbox 360. As of November 2012, Microsoft said that it had only sold 20 million Kinect sensors to date, betraying slow growth for the device since global sales sat at 18 million in January 2012. The Xbox 360 was able to sell 6 million consoles over just a few months according to Microsoft’s own data, but it took Kinect to do the same volume over the course of a year.
What does all this data tell us about the future of the Xbox brand? While it’s too early to say that motion control isn’t a key component of Microsoft’s future in the gaming business—24 million peripheral sales in two years is still strong—the data does cement the direction for the Xbox brand broadly. Entertainment, not just games, is the future of Xbox. Entertainment app usage—Netflix, Hulu, Xbox Music, ESPN, etc—grew 57 percent year on year. The Xbox 720 will play high end games, no doubt, but the box will likely be the Trojan horse for Xbox TV. | 科技 |
Panasonic to cut 17,000 employees
Panasonic, Japan's largest maker of consumer electronics, has announced a broad restructuring plan that will see the company shrink itself from five major divisions down to three, as well as cut some 17,000 jobs. The moves are intended to make the company more competitive with electronics manufacturers in South Korea and China, but also reflect Panasonic's efforts to cope with the disruptions and devastation generated by last month's massive earthquake and resulting tsunami, which have significantly dampened demand in an already-sluggish Japanese market.
Panasonic currently employs about 367,000 people, so the 17,000 job cuts represent a reduction of roughly 4.6 percent of the company's workforce. The cuts seem primarily targeted at integrating two recent acquisitions (Sanyo Electric and Panasonic Electric Works) into the larger company. At the same time, Panasonic is going to reduce the number of major business divisions within the company from five areas based on technology platforms to three new sectors based on business models: Consumer, Components & Devices, and Solutions. The Consumer group will comprise climate control and home appliances, along with Panasonic's AV, home entertainment, and networkable products. The Components & Devices group will handle energy, automotive, and (apparently) smartphones, while Solutions will comprise four primary businesses centered on health care systems, environment and energy systems, communications, and factory systems.
In the consumer sector, Panasonic says it plans to increase purchases of LCD panels from other manufacturers and increase its own overseas production to compete with the likes of Samsung and LG, and also plans to reorganize its semiconductor business to lower dependencies on large-scale integrated circuits from other manufacturers.
Panasonic estimates the restructuring will cost about $2 billion, but says the changes should increase revenues by over $700 million a year, particularly as the company ramps up sales of solar cells, lithium-ion batteries, climate control equipment, and LED lighting.
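For context, the sketch below runs the simple arithmetic behind those figures: the share of the workforce being cut, and a best-case payback period for the restructuring cost that treats the quoted revenue lift as if it were pure profit (it is not, so the real payback would take longer).

```python
# Simple arithmetic on the restructuring figures above.
workforce, job_cuts = 367_000, 17_000
cost, annual_gain   = 2_000_000_000, 700_000_000   # restructuring cost vs. yearly revenue lift

print(f"Share of workforce cut   : {job_cuts / workforce:.1%}")
print(f"Best-case payback period : {cost / annual_gain:.1f} years "
      f"(treats the revenue lift as pure profit)")
```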
Nonetheless, Panasonic will face many challenges: the comparative strength of the Japanese Yen has made Panasonic products more expensive than products from competitors, particularly in overseas markets. | 科技 |
Popular device leaker @evleaks calls it quits (to find a more stable job)
Williams Pelegrin
After two years of leaking countless devices, the most recent being several press shots of the Moto 360, Evan Blass, known as @evleaks in every corner of the Internet, is calling it quits.
While his tweet revealing the news wasn't at all specific, the main reason for his retirement was financial instability, according to Blass' interview with The Next Web. He felt as if he needed a more traditional full-time job in order to not only provide for his family, but also fight his multiple sclerosis (MS), which was first revealed in a piece by The Verge detailing Walgreens' blacklisting of some drug buyers.
While Blass attempted to monetize his leaks through his website and sponsored tweets, the effort didn't provide a stable enough income for him to continue. In addition, Blass noted in the interview that his audience tended to use ad-blocking software, which prevented an even flow of revenue.
All good things must come to an end. Thank you for an amazing two years. [RETIREMENT]
— Evan Blass (@evleaks) August 3, 2014
While he noted that the decision is final, with the fate of his website up in the air, he did reveal that his well-known @evleaks Twitter handle will now be for personal use. Blass started off in 2005 as an editor for Engadget before working his way up to senior editor. He then had a three-year stint at Pocketnow as a managing editor before converting his leaking tendencies into a full-time gig in 2012.
As early as 2009, Twitter was a hotbed for leaks. Unfortunately, for these early leakers, legal threats from phone manufacturers and the difficulty of publicizing leaks, as well as the failure to monetize such a path, were enough to ward them away. What made Blass stand out from the rest was not only his success with the latter, but also how accurate his leaks were. This is why outlets such as Engadget and The Verge, and we here at Digital Trends, aren't hesitant to report whatever he leaks.
TK TechNews, a website run by T.K. Connor, launched an Indiegogo campaign to help fund Blass’ MS treatment. The campaign is asking for $100,000, with three people who give $500 getting a leaked phone and two people who give $1,500 getting box seats for a New Jersey Devils hockey game and the chance to meet up with the team. The campaign has raised $1,746 as of this writing.
Microsoft, Yahoo! Join Forces in Web Search Partnership
Two of the biggest names in the software industry, Microsoft and Yahoo!, have announced a 10-year partnership under which Microsoft's new Bing search engine will power Yahoo! online searches, and Microsoft's AdCenter ad sales platform will replace Yahoo!'s Panama. In return, Yahoo! will license its search technology to Microsoft and sell search ads on both Yahoo! and Bing.

The deal is designed to pool the two companies' resources and strengths in order to better compete with market leader Google. In short, Microsoft becomes the technology provider and Yahoo! sells the advertising space. Yahoo! stands to save millions of dollars in technology costs while still keeping 88% of its ad revenue, a significantly higher percentage than usual for such a deal.

IDC analysts Karsten Weide and Susan Feldman issued a statement in which they write: "On the face of it, this is a good deal for Yahoo!. To begin with, it would keep the majority of its search ad revenue, and pay only 12% of it for Microsoft's tech services. Two, it would save millions of dollars on data centers, servers, mass storage and telecommunications costs, as well as on an army of expensive search engineers. Thirdly, adding Microsoft Bing's search traffic market share to Yahoo!'s would increase its market share from 20% to a combined share of 28% (compared to Google's 65%), making it a mildly more attractive offer to advertising clients, especially for those that need massive amounts of inventory. Finally, this will allow Yahoo! to focus on what it is good at: media and advertising; and to drop what it is not so good at: technology. More focus on sales and a more attractive search ad offer could increase the combined revenue, thereby improving both companies' top line and making them more competitive."

In addition to the long-term strategic concerns for Yahoo!, the deal is being investigated by the Department of Justice on antitrust grounds. However, Competitive Enterprise Institute analyst Ryan Young thinks that the Justice Department "should leave the Microsoft-Yahoo search deal alone."
(www.microsoft.com; www.yahoo.com; http://cei.org; www.idc.com) | 科技 |
Simulmedia Launches Open Access Project (May 07, 2013)
Simulmedia, a provider of audience-based television ad targeting, announced it has launched the Open Access Project, an initiative that will allow free access on its website to key parts of its database, insights, and platform tools. This includes access to reach, frequency, and cost performance data on recent national TV campaigns from more than 100 national advertisers. This is a beta release and will be expanded over the next month to include more than 300 national advertisers and two years of campaigns. Within the searchable database at launch will be actual reach and frequency charts for each campaign by month and by different target audiences, share-of-voice comparisons for each market competitor, the reach contributions of each TV network on the campaign, as well as the cost-per-reach points for each network.
(www.simulmedia.com) | 科技 |
The Last Of Us Remastered Now Available For Pre-Load In Europe
The Last Of Us Remastered, the PlayStation 4 version of the highly successful game developed by Naughty Dog, is finally available in North America today, and it will become available in Europe in the next few days.
Online reports have confirmed that European gamers who have pre-ordered the game digitally on PSN can now pre-load it so that they will be able to play as soon as the official release date arrives. The download is quite big, with 8GB for the multiplayer portion alone, so you had better start downloading as early as possible to enjoy the game the moment it launches.
Even though the game was only officially released today in North America, some lucky gamers were able to get their hands on The Last Of Us Remastered last week. A new The Last Of Us Remastered unboxing video was also made available last week, which confirmed that the game requires 50GB of free HDD space.
The Last Of Us Remastered is now available in North America exclusively on PlayStation 4. The game will be released later this week in Europe. | 科技 |
As Coastal Populations Grow, So Do Hurricane Risks
Populations along the coasts are growing rapidly, which could mean problems when it comes to hurricane season.
According to 2012 United States Census Bureau data, the nine states with the highest population density are all located along the East Coast. NOAA's State of the Coast research indicates that coastal populations will continue to grow at a faster rate than the rest of the country, with an expected increase of 37 people per square mile for shoreline counties and only an 11 person per square mile increase for the United States as a whole. From 1970 until 2010, coastal populations have risen by 39 percent.
Part of what draws people to coastal cities is rooted in history. When the country was forming, cities were built closer to shores for easy access to shipping routes. Over time, a cycle was created: with more people on the coasts, more development was needed and more jobs were created, which led to more people moving to coasts to get those jobs, which led to further development. As a result, some of the country's biggest centers for trade and industry are located in densely populated East Coast cities. New York City has a population of 8,336,697, with 27,543.1 people per square mile.
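As a quick sanity check on the census figures above, the snippet below derives New York City's implied land area from its quoted population and density, and compares the projected densification rates for shoreline counties against the national figure. These are simple arithmetic restatements of the numbers already cited, not new data.

```python
# Quick arithmetic on the population figures quoted above.
nyc_population, nyc_density = 8_336_697, 27_543.1     # people, people per square mile
print(f"Implied NYC land area: {nyc_population / nyc_density:,.0f} square miles")

coastal_gain, national_gain = 37, 11                   # projected added people per sq mile
print(f"Shoreline counties are projected to densify about "
      f"{coastal_gain / national_gain:.1f}x faster than the U.S. as a whole")
```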
Some residents are also drawn to coasts for leisure aspects, such as vacationers purchasing summer homes on beaches or water enthusiasts building their permanent residencies in shoreline communities.
The numbers of people are expanding, but the size of the land is not, and the imbalance wears on resources. Aquatic wildlife populations are affected as building and expansion disrupt their natural habitats. Air pollution increases as more traffic is brought to a more concentrated area. The shores themselves face higher rates of erosion.
While some erosion occurs naturally, waves often slowly replace the sands they initially pulled away. However, when erosion occurs at an accelerated rate, as it does in highly populated areas, the rate of natural replacement is not enough to keep up.
Beach erosion increases the risks when hurricanes hit. Beach elevations are lowered, making them more prone to flooding. Dune erosion diminishes the protective barrier that dunes offer coastal homes, making those homes more susceptible to greater damage. Then, when a strong storm does come through, it erodes beaches even further, making the towns and cities even more vulnerable to future weather effects.
To combat the issue, many coastal governments are implementing building regulations. Permits may be required that demand a reinforcing structure is created before building near beaches. The most common of these are seawalls, walls built out of stone or concrete to hold back bluffs or sand dunes from waves, and groins, which are similar to jetties, but serve a different purpose. Groins are set up to try to "trap" sand that is being pulled by the ocean.
There are problems with these solutions, however. Seawalls protect the land behind them, but increase erosion on the parts of the beach that lie in front of them. Waters hit the wall and are sent back at a more rapid rate than if they were to creep up the beach and then roll away.
Another way to prevent accelerated erosion is to decrease beach and water pollution. Coral reefs are natural protectors against rapid erosion, but they are delicate and extremely sensitive to changes in water acidity and temperature. By maintaining the natural state of the ocean, coral reefs remain healthy and alive to control the levels of sand that are washed away.
The Conduit Screenshots
Hoffman Estates, IL – June 10, 2008 – High Voltage Software, Inc., one of the world's largest independent developers, today officially announced The Conduit, a stunning first-person action shooter that promises to provide some of the best visuals seen to date on Nintendo's Wii gaming console, for Q1 2009 release.

Using their proprietary Quantum 3 Game Engine technology, the team at High Voltage Software is fusing fast and fluid gameplay with an advanced set of visuals intended to squarely establish the Wii as a true next-generation gaming platform. The first-person shooter boasts a competitive online multiplayer component and a thrilling single-player campaign. Advanced graphical features include dynamic environment mapping, interactive water with real-time reflection, and four-stage texture composition including gloss, diffuse, and bump mapping.

"Too many Wii owners have been told that the Wii is a casual platform with no room for serious games or top-tier graphics," said Eric Nofsinger, Chief Creative Officer at High Voltage Software. "With The Conduit, we intend to prove that theory wrong by providing gamers with the kind of title they imagined back when the platform was first announced."

Featuring terrifying alien creatures, an advanced Artificial Intelligence System, and a rich, conspiracy-laden story, The Conduit has more than a pretty face. "The Wii has tremendous potential," said Kerry J. Ganofsky, CEO and founder of High Voltage Software. "We've invested heavily in our technology to make certain that potential is realized with The Conduit."

The Conduit's control scheme was carefully designed from the ground up with the Wii in mind, as were each of the weapons in the extensive armory, providing gameplay not possible on other platforms. As Secret Service Agent Ford, you must learn to master each of these weapons as well as the special "All Seeing Eye" device to defeat enemies, solve puzzles, and unravel the game's mysteries.

"Conduit was especially designed for the Nintendo Wii," states Matt Corso, Creative Director at High Voltage Software. "From the weapons to armory and graphics, we tailored everything to the Wii, which sets this gameplay apart from anything else on the market right now."

Publisher excitement and interest in The Conduit has been overwhelming. High Voltage Software will continue to search for the perfect partner to ensure The Conduit's success. The Conduit is tentatively scheduled for release in Q1 2009 exclusively for the Nintendo Wii.
More information about The Conduit | 科技 |
XRite Promotes Vacchiano To CEO
| By Anne Bond Emrich |
GRANDVILLE — Michael C. Ferrara will retire as CEO of X-Rite Inc. and give up his seat on its board of directors on Oct. 1, but he will remain as a consultant through the end of the year as the company transitions to the leadership of succeeding X-Rite President and CEO Thomas J. Vacchiano. As part and parcel to the promotion, Vacchiano will become a member of X-Rite’s board.
Vacchiano was formerly president and CEO of Amazys Holding AG, the Swiss color technology company known for designing, developing and marketing color management solutions under the GretagMacbeth brand. Vacchiano served in that capacity from early 2001 up until X-Rite’s acquisition of Amazys this past July. He assumed the role of president and COO of X-Rite right after the deal closed.
In a three-year transaction agreement negotiated with Vacchiano in January, X-Rite stated its intention to promote him to the position of CEO within 18 months of completing the purchase of Amazys.
At the same time, the company modified provisions of its employment agreement with Ferrara, which extended to Dec. 31, 2008. In an amendment to the employment contract, both parties agreed that the company could terminate Ferrara's role as CEO at any time prior to Dec. 31, 2008, simply by giving written notice as to the effective date of termination.
Vacchiano doesn’t find it that surprising that he was promoted to CEO just three months after the Amazys acquisition was finalized.
“The interesting thing is that we really kind of think of it as eight or nine months later, because we’ve been working together on the integration since Jan. 31,” he remarked. “We’ve been so active from February through September that it feels a lot longer, so it feels pretty comfortable.”
X-Rite Chairman John Utley credited Ferrara with refocusing the company on the color business. Ferrara said his past five years with X-Rite have been the most rewarding and challenging of his career.
“With the integration of the two companies firmly on track, I am very optimistic about the future of the company under Tom’s leadership and the opportunity for increased shareholder value,” Ferrara said.
Prior to heading up Amazys, he was president and CEO of Xerox Engineering Systems, a Xerox subsidiary that develops, markets and supports hardware, software and services for the engineering and graphic arts market. Before that, he spent four years as vice president of sales operations and planning for Digital Equipment Corp., an American pioneer in the computer industry. Prior to his stint at Digital Equipment Corp., Vacchiano served 17 years in positions in finance, sales, marketing and general management for NCR, a technology company specializing in solutions for the retail and financial industries. His last position with NCR was as vice president of marketing for the company's Personal Computer Business division.
“His extensive experience in technology companies together with his current track record of creating exceptional value for Amazys shareholders over the last several years is impressive,” Utley stated. “This board believes that his vision and leadership are well aligned with the challenges of leading X-Rite to the next level of performance.”
Vacchiano believes that his nearly 28 years in the field of business-to-business, technology-based solutions have prepared him well to lead X-Rite. He said he has spent the last five and a half years successfully leading Amazys — which was basically the fastest growth company in the industry and a very profitable one — and he’s no newcomer to the industry, so that should be “very, very helpful” to him going forward with X-Rite. “To be honest, I have competed very respectfully against X-Rite, and I know X-Rite’s strengths and weaknesses from a competitive standpoint, and I think that’s going to be really helpful now that we’re all one company.”
As far as his leadership style, Vacchiano describes himself as “one part strategist, one part team builder and one part operator.”
“I think that three-part blend is probably pretty accurate in terms of describing my background and what has helped me be successful over the years,” he added. “You need to have some sort of view or vision about where the industry and company need to go. You need to have an ability to build a really high-performance team — folks that like being with the company that are motivated and talented. Third, the operator part is making sure that you can execute well for your customers, for your partners and for your other stakeholders. I think you need to constantly do better every day, and if you can get those three things in pretty good shape, usually your company is successful.”
He said when he takes over the reins of the company Oct. 1, his No. 1 priority is “to implement a very successful integration of the two companies, which is a job X-Rite will be working on for the next several months.”
As a condition of his employment as CEO of X-Rite, Vacchiano must establish his principal residence in Michigan. He, his wife of 22 years and their two children have been living the past eight years in New Canaan, Conn. He has been commuting to X-Rite since being named president and COO July 5.
Vacchiano is originally from Dayton, Ohio, and his wife is from Evansville, Ind. The family will move to Michigan either at the end of this year or early next year, he said.
Climate Change 2001:
Working Group I: The Scientific Basis
The Intergovernmental Panel on Climate Change (IPCC) was jointly established by the World Meteorological Organization (WMO) and the United Nations Environment Programme (UNEP) in 1988. Its terms of reference include (i) to assess available scientific and socio-economic information on climate change and its impacts and on the options for mitigating climate change and adapting to it and (ii) to provide, on request, scientific/technical/socio-economic advice to the Conference of the Parties (COP) to the United Nations Framework Convention on Climate Change (UNFCCC). From 1990, the IPCC has produced a series of Assessment Reports, Special Reports, Technical Papers, methodologies and other products that have become standard works of reference, widely used by policymakers, scientists and other experts.
This volume, which forms part of the Third Assessment Report (TAR), has been produced by Working Group I (WGI) of the IPCC and focuses on the science of climate change. It consists of 14 chapters covering the physical climate system, the factors that drive climate change, analyses of past climate and projections of future climate change, and detection and attribution of human influences on recent climate. As is usual in the IPCC, success in producing this report has depended first and foremost on the knowledge, enthusiasm and co-operation of many hundreds of experts worldwide, in many related but different disciplines. We would like to express our gratitude to all the Co-ordinating Lead Authors, Lead Authors, Contributing Authors, Review Editors and Reviewers. These individuals have devoted enormous time and effort to produce this report and we are extremely grateful for their commitment to the IPCC process. We would like to thank the staff of the WGI Technical Support Unit and the IPCC Secretariat for their dedication in co-ordinating the production of another successful IPCC report. We are also grateful to the governments, who have supported their scientists' participation in the IPCC process and who have contributed to the IPCC Trust Fund to provide for the essential participation of experts from developing countries and countries with economies in transition. We would like to express our appreciation to the governments of France, Tanzania, New Zealand and Canada who hosted drafting sessions in their countries, to the government of China, who hosted the final session of Working Group I in Shanghai, and to the government of the United Kingdom, who funded the WGI Technical Support Unit.
We would particularly like to thank Dr Robert Watson, Chairman of the IPCC, for his sound direction and tireless and able guidance of the IPCC, and Sir John Houghton and Prof. Ding Yihui, the Co-Chairmen of Working Group I, for their skillful leadership of Working Group I through the production of this report.
G.O.P. Obasi
Secretary-General, World Meteorological Organization

K. Töpfer
Executive Director, United Nations Environment Programme, and Director-General, United Nations Office in Nairobi
Cardboard Bicycle Fund-Raising Campaign Fails (By $1.95 Million) (VIDEO)
Michael Rundle
HuffPost UK Technology Editor
The inventors of a cardboard bicycle have pulled their campaign to raise more than $2 million to develop a sustainable production line for their unusual method of transport.
The Cardboard Bike - made of recycled materials - was intended to "change the world" by providing a new, clean-energy use for, well, garbage.
The inventors had claimed it could be a transformative invention.
"Imagine a time when every plastic or cardboard product that is thrown into the recycle bin will contribute to the creation of a bicycle, wheelchair or toy," said Israeli engineer and inventor of the cardboard bike, Izhar Gafni, in a press release.
"Basically the idea is like Japanese origami, but we don't compress the cardboard and we don't break its structure. We overcome the cardboard's failure points, by spreading out the weight to create durability."
But after the campaign managed to raise just $41,000, the plan has been quietly shelved for now.
"We listened to you, our greatest supporters, and will re-open our Indiegogo campaign once the factory is up and running and can offer lower prices and a three-month delivery time. That is the beauty of crowdfunding," said the team on Indiegogo.
So why did it fail? In short, because no one bought the bikes. Only 24 pledges of $290 were received by July 1, which would have guaranteed the backers a bike of their own. After the bike was reduced to $135, more backers came on board - but only another $17,000 worth.
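Tallying those pledges makes the shortfall plain. The figures below come straight from the reporting above; the $2 million goal is treated as a round number, and smaller perk tiers that may have made up the rest of the total are not broken out.

```python
# Rough tally of the pledge figures reported above.
early_bike_pledges = 24 * 290      # 24 backers at the original $290 bike tier
later_bike_pledges = 17_000        # roughly $17,000 more after the cut to $135
campaign_total     = 41_000
goal               = 2_000_000     # "more than $2 million", treated here as a round figure

print(f"Bike-tier pledges  : ${early_bike_pledges + later_bike_pledges:,}")
print(f"Campaign total     : ${campaign_total:,}")
print(f"Shortfall vs. goal : ${goal - campaign_total:,}")
```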
Over at Triple Pundit, Raz Godelnik, co-founder of Eco-Libris, writes: "The thing is that the market value isn't as important for many people as the value they perceive from the gift. That's where the cardboard bike failed - it just didn't generate enough value to justify the $290 price tag. And ultimately customers weren't interested in subsidising the cost of the factory."
As for the cardboard bike? The team behind it are now looking to develop the idea and factory on their own, before returning to crowd-funding for another go on the merry-go-round.
MISR Stereo Imaging Distinguishes Smoke from Cloud

These views of western Alaska were acquired by MISR on June 25, 2000 during Terra orbit 2775. The images cover an area of about 150 kilometers x 225 kilometers, and have been oriented with north to the left. The left image is from the vertical-viewing (nadir) camera, whereas the right image is a stereo "anaglyph" that combines data from the forward-viewing 45-degree and 60-degree cameras. This image appears three-dimensional when viewed through red/blue glasses with the red filter over the left eye. It may help to darken the room lights when viewing the image on a computer screen.

The Yukon River is seen wending its way from upper left to lower right. A forest fire in the Kaiyuh Mountains produced the long smoke plume that originates below and to the right of image center. In the nadir view, the high cirrus clouds at the top of the image and the smoke plume are similar in appearance, and the lack of vertical information makes them hard to differentiate. Viewing the righthand image with stereo glasses, on the other hand, demonstrates that the scene consists of several vertically-stratified layers, including the surface terrain, the smoke, some scattered cumulus clouds, and streaks of high, thin cirrus. This added dimensionality is one of the ways MISR data helps scientists identify and classify various components of terrestrial scenes.

MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the California Institute of Technology.
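For readers curious how a red/blue anaglyph like the one described above is composed, the snippet below shows the generic technique: take the red channel from the view intended for the left eye and the green and blue channels from the other view. The file names are placeholders and the code is a simple illustration, not MISR's actual processing chain.

```python
# Generic red/blue (red-cyan) anaglyph composition from two viewing angles.
# File names are placeholders; both images must share the same dimensions.
import numpy as np
from PIL import Image

left_view  = np.asarray(Image.open("view_45deg.png").convert("RGB"))
right_view = np.asarray(Image.open("view_60deg.png").convert("RGB"))

anaglyph = np.zeros_like(left_view)
anaglyph[..., 0]  = left_view[..., 0]    # red channel from the left-eye view
anaglyph[..., 1:] = right_view[..., 1:]  # green and blue channels from the other view

Image.fromarray(anaglyph).save("anaglyph.png")
```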
Mission: Earth Observing System (EOS) | Target: Earth | Spacecraft: Terra | Instrument: Multi-angle Imaging SpectroRadiometer (MISR) | Views: 3,315
Full-Res JPG: PIA02625.jpg (0.14 MB)
Image credit: NASA/GSFC/JPL, MISR Team
Hot Online: Report says new iPhone coming in August
A report from AppleInsider says that two new iPhones will be coming out in August. The iPhone 5S will hit the markets in the U.S., and a cheaper version of the iPhone will hit Chinese markets.
A student at the University of Delaware hit the ATM jackpot when it gave him $1,800. The student hadn’t even entered his card into the ATM when it spit out the cash. He decided to do the right thing and contacted authorities to give back the money.
Hot video: ‘Modern Family’ cast members stuck on elevator
Three members of the cast of ABC's "Modern Family" were stuck in an elevator for more than an hour in a hotel in Kansas City, Mo., this weekend. Julie Bowen, Eric Stonestreet and Tyler Ferguson shared all the details of their unfortunate situation on the social media websites Twitter and Vine. Kristina Behr has the story. | 科技 |
Possible fossils found in meteorite fragments?
Well, here’s something potentially interesting: researchers at Cardiff University think they have found fossils in meteorite fragments from Sri Lanka.
The most startling claims, however, are based on electron microscope images of structures within the stones (see above). Wallis and co say that one image shows a complex, thick-walled, carbon-rich microfossil about 100 micrometres across that bears similarities with a group of largely extinct marine dinoflagellate algae.
They say another image shows well-preserved flagella that are 2 micrometres in diameter and 100 micrometres long. By terrestrial standards, that’s extremely long and thin, which Wallis and co interpret as evidence of formation in a low-gravity, low-pressure environment.
Gotta take this with a massive grain of salt, but it will be interesting to see how this one plays out.
Update: One of the authors of this study holds some unusual views about life on Earth.
On May 24, 2003, The Lancet published a letter from Wickramasinghe, jointly signed by Milton Wainwright and Jayant Narlikar, in which they hypothesized that the virus that causes Severe Acute Respiratory Syndrome (SARS) could be extraterrestrial in origin rather than having originated from chickens.
Wickramasinghe and his mentor Fred Hoyle have also used their data to argue in favor of cosmic ancestry, and against evolution.
Like I said, big grain of salt. (thx, onno)
Sony digital camera reviews
Sony Alpha 55 review : Sony is actually just a rookie in terms of DSLR cameras. Although they have vast experience, stemming from their takeover of Konica Minolta, it was only a few years ago that the first Sony Alpha camera was brought onto the market. With the introduction of the Sony Alpha 55, Sony seems to be making progress in leaps and bounds. The camera distinguishes itself from the competition with innovative techniques and features with which Sony proves itself as a DSLR manufacturer. The Sony A55 is the first Alpha camera based on the so-called Translucent Mirror technique. This is in fact an old technique, modernized and given a facelift.

Sony A580 review
Sony Alpha 580 review : There is an interesting development occurring in the Sony camera division. While last year we could only choose between CyberShot and Alpha DSLR cameras, the range has been significantly extended with two new types of cameras. First of all, of course, the very compact NEX system cameras, which are very quickly becoming a sales hit. Secondly, the Sony Alpha 55 and A33 cameras with Translucent Mirror technology, a new type of camera with an interchangeable lens, but now with an electronic viewfinder and a translucent, fixed mirror. In fact, it is no longer a real 'reflex'. With the introduction of the Sony Alpha 580, the traditional DSLR is equipped with the most recent technology.
Sony Cyber-Shot review
Sony Cyber-Shot DSC-HX5 review : Although its appearance doesn't betray it, the Sony Cyber-Shot DSC-HX5 is unique in its kind. In the first place, the Sony HX5 distinguishes itself with its image sensor. While 99% of compact cameras are equipped with a standard CCD sensor, the Sony DSC-HX5 is equipped with a so-called back-illuminated CMOS sensor. Additionally, this compact camera has a 10x optical zoom lens with a 25mm wide angle, optical SteadyShot, AVCHD Full HD video (1920x1080 pixels) and the impressively performing Sweep Panorama function. • Read our full Sony CyberShot DSC-HX5 review.
Sony NEX review
Sony NEX-5 system camera review : Following Panasonic, Olympus and Samsung, Sony has now also taken the road of small system cameras. Interest in such compact camera systems is big, but strangely enough, an answer to this success from big names such as Canon, Nikon and Sony was a long time coming. We had to wait until the start of this year, when Sony showed the press a prototype camera, and then we knew that Sony, too, had joined the party. Expectations of what Sony was to produce were also big; in any case, it had to make us forget about the previous small camera systems. • Read our full Sony NEX review.
Sony CyberShot review
Sony CyberShot DSC-HX5 photo gallery : The Sony CyberShot HX5 is a small digital camera equipped with the newest technologies found in the current Sony CyberShot generation. The Sony CyberShot DSC-HX5 is equipped with GPS and compass functionality, a feature that we strangely enough see only rarely in a digital camera. This is already a standard specification in modern smartphones, but digital SLR and compact cameras have yet to discover GPS. There are some glitches to integrating it, such as interference with the LCD display and the image sensor, components that you obviously really need. Sony has managed to avoid these impediments, making the Sony CyberShot DSC-HX5 digital camera pretty unique.
Sony NEX-5 review : Since the introduction of the Micro Four Thirds System, changes have been taking place on the DSLR market. A couple of manufacturers, Olympus and Panasonic, have set out a new course, which has obviously become a hit with consumers. There is a lot of enthusiasm for Micro Four Thirds, causing others to assign their R&D departments to create a similar concept. That includes Sony, which has been making headway since 2006 with its Alpha DSLR system and also sees a future in the small-format system camera. During the European launch in Croatia, we were witness to Sony's answer to the Micro Four Thirds System. The introduction of the Sony NEX-5 and Sony NEX-3 Alpha is now a fact.
4K movies produced by the Cyber-Shot RX10 III
Sony DSC-RX10 III
Sony CyberShot DSC-HX80
Sony A6300 Mirrorless digital camera
Sony Alpha 6300 records 4K UHD video
Sony RX1R II Premium Compact camera
Sony A7S II full-frame camera
Sony Cyber-shot RX compact camera series
Sony high zoom compact cameras
Sony Alpha A7 II
Sony A5100 system camera
Sony A7s
Sony CyberShot HX400V
Sony CyberShot H400 | 科技 |
Report: Ericsson acquiring Nokia Siemens business support
updated 03:51 pm EDT, Mon September 3, 2012
Sale could fetch up to $377 million
The world's largest network infrastructure supplier, Ericsson, is the primary suitor to purchase the business support systems (BSS) unit of struggling competitor Nokia Siemens Networks. Sources familiar with the matter report that the BSS unit was among the assets Nokia was looking at selling, but a deal could not be confirmed. Nokia Siemens Networks is in the process of cutting costs, and is expected to slash 17,000 jobs from its workforce in an effort to improve its finances. Ericsson has been expanding rapidly with consistent quarterly profits, and is viewed as the leading candidate for the purchase of the BSS unit. Other potential buyers include US-based Amdocs Ltd as well as private equity firms. The business is expected to sell for as much as $377 million.
"As we announced in November, Nokia Siemens Networks believes that the future of our industry is in mobile broadband and we are intensifying our strategic focus on that area," Nokia Siemens spokesman Ben Hunt said yesterday. Hunt declined to comment specifically on the sale of the BSS unit.
If completed, the sale of the unit would be the most recent in a string of asset sales. Nokia Siemens completed the sale of its microwave transport business in June, and finished the sale of its broadband access business in May.
Hitchhiker's Guide to the Galaxy game on the way
Wednesday, May 25th 2011 at 11:39AM BST
A brand new game based on The Hitchhiker's Guide to the Galaxy is in development. The small amount of information available can be found on the official website. It is being developed by Hothead Games, the team behind RPG title Deathspank and the Penny Arcade Adventures. There's no word yet on which platforms it will appear on or when it will be released. "Megadodo Publications welcomes you to this new edition of the Hitchhiker's Guide to the Galaxy," the brief blurb reads. "We have updated The Guide's content to provide compatibility with such quaint locations as Earth. It will soon be available for all the latest primitive devices on your planet." A text-based adventure game based on the licence was released in the early '80s, while two mobile games were released last decade to accompany the release of the 2005 Touchstone Pictures film.
New, improved IOP dissipation system survives summer debut
The wave dissipation system is helping slow erosion in front of the Seascape Villas on Isle of Palms, inventor Deron Nettles says.
After the wave dissipation system withstood its first summer on Isle of Palms, its creators say the wall is working according to plan.
“It's working great,” said Deron Nettles, inventor of the dissipation wall protecting the Seascape Villas on Isle of Palms. “We saw the nor'easter come through earlier this summer. There's only a foot or two of the system exposed now. The system is such a small footprint and is allowing the beach's natural ebb and flow.”The 144-foot portable wall in place at the Wild Dunes resort is the second version of the system installed at the site and serves as an improved edition of its 88-foot trial predecessor.Structures like seawalls have been banned under state beach management laws since 1988, but on June 6, South Carolina Governor Nikki Haley signed bill S.1032 to “allow the use of pilot projects to address beach or dune erosion and to allow continued use of these projects under certain circumstances.”Nettles teamed up with Citadel engineering professor Tim Mays to tweak the system and collect results to verify its usefulness. The new wall is stronger, embedded up to 15 feet deep, while the original was closer to six. Nettles expects to release the summer's numerical results from the improved system later this week.“First we confirm the structure hasn't been damaged and it hasn't,” Mays said. “We use surveys to look at the volumes of sand in front of the structure, charts and plot graphs to analyze the data. I could not be happier with the results.“We can actually see the beach is lower around the sandbags and quickly goes up to the other side of the structure.”The polyvinyl and polyethylene system is designed to break up the energy of storm waves that are causing the erosion while allowing water and fine sand to pass back through the wall.The flow is intended to mimic that of a natural, unassisted beach. It also allows sand to slowly build behind the wall, another useful step in helping to preserve the eroding shores.“Realistically, the beach will always need nourishment, but at times, that resource of sand isn't available for certain areas,” Nettles said. “If the resource isn't available, you can install the system to protect the structure until it is. This isn't a permanent alternative, but a tool to work in conjunction with nourishment. And a better alternative than sandbags.”Following Hurricane Arthur and Tropical Storm Bertha that moved up the East Coast this summer, several of the sandbags that neighbor the dissipation system could be found empty, stuck in the sand and shoreline, while the dissipation system worked properly. Some who helped clean up the bags even placed them behind the dissipation wall, where they'd be protected from the ocean.“I grew up on these beaches my whole life and I want to protect the area by whatever mean necessary,” Nettles said. “It gets to the point, seeing these bags out there all the time, where it's pretty discouraging. That's one of the reasons I came up with the system.”The tip of Isle of Palms' extreme vulnerability to beach erosion made it an ideal test site for the system. Ocean and Coastal Resource Management is now encouraging the installation of the system in different areas to collect more data.Nettles has begun marketing the system in target areas with erosion issues. Garden City, Pawleys Island and Daufuskie Island are among the possibilities. Nettles says one of the system's greatest features is its ability to conform to most any area's needs from the dunes to underwater.“We're trying to allow homeowners in the state of South Carolina to have one more option to protect their properties,” Nettles said. Latest Videos | 科技 |
LEEDCo ready to begin demonstration project that will place wind turbines in Lake Erie
The Lake Erie Energy Development Corp. is ready to begin a demonstration project to develop five to nine wind turbines off the coast of Cleveland in Lake Erie.
LEEDCo's "Icebreaker" project has been awarded $4 million in U.S. Department of Energy funding through Feb. 15, 2014. Private partners have committed an additional $1 million in cost share for this portion of the project.
"In one year, we will provide the federal government with a strong plan to provide clean, affordable and reliable offshore wind power to the electric grid and Northeast Ohio customers," LEEDCo President Lorry Wagner said in a statement. "I am convinced that we are poised to become the first freshwater wind farm in North America, which will spawn a new wind power industry in Northeast Ohio."
LEEDCo is a regional non-profit corporation leading efforts to create an offshore wind energy industry in Northeast Ohio.
The agency was founded in 2009 and members include Ashtabula, Cuyahoga, Lorain and Lake counties, city of Cleveland, The Cleveland Foundation and NorTech.
As a public-private partnership, the agency represents Northern Ohio's public interest in offshore wind and works to develop an initial 20 to 30 megawatt project in Lake Erie seven miles offshore Cleveland with a 1,000 MW target by 2020.
The Department of Energy will evaluate results of LEEDCo's demonstration project and the organization could be selected to receive an additional $46 million in funding during a four-year period.
To compete to win the next round of funding, LEEDCo aims to:
* Address critical technical objectives, including evaluating and selecting the optimal turbine foundation design; reviewing installation, operations and maintenance methodologies; researching the challenges and solutions for icing conditions; and assessing the technical and financial feasibility of the overall project.
* Complete necessary permit applications.
* Secure power purchase agreements with potential customers, and address initial interconnectivity considerations
Team members from around the world gathered in Cleveland recently to kick off the project, discuss their approach and how they will collaborate on breakthrough technologies and materials.
"Think of Icebreaker in Lake Erie as the flagship of projects that could be installed in the Great Lakes, which has enormous offshore wind potential," said Walt Musial, manager, Offshore Wind and Ocean Power Systems at the National Renewable Energy Laboratory in Golden, Colo. | 科技 |
Climate change
France Welcomes India's Move to Ratify Climate Change Agreement
France has welcomed the Indian government’s decision to ratify the Paris climate change agreement, aimed at containing global warming, on October 2, the birth anniversary of Mahatma Gandhi.
UN Chief Ban Ki-Moon Lauds India's Decision to Ratify Paris Pact on Climate Change
A total of 60 countries so far have deposited their instruments of ratification for the agreement, representing more than 47.5 per cent of global greenhouse gas emissions.
Ban Ki-Moon Hopes India, Other Nations Will Ratify Climate Deal
UN Secretary General Ban Ki-Moon has expressed hope that India and other nations will soon ratify the Paris Climate Change agreement.
No Decision Yet on Ratifying Paris Climate Change Pact: India
During the Paris climate meet in 2015, more than 190 nations had agreed on setting ambitious goals for capping global warming and funnelling trillions of dollars to poor countries facing climate catastrophe.
G20 A Success For China, But Hard Issues Kicked Down The Road
There was even a joint announcement by China and United States that they would ratify the Paris climate change agreement, a significant step for the world's two biggest emitters of greenhouse gases.
Farmer's Son Witnesses Climate Change in the Sunderbans, Turns the Island's Fate for the Better
A farmer’s son in the Sunderbans has been single handedly trying to change the fate of one of the biggest mangroves in the world. Runa Mukherjee Parikh, August 30, 2016, 5:19 pm
Rich Nations Not Helping to End Poverty and Inequality: Report
Scoring at the bottom of the list were the Central African Republic and Liberia, while Sweden, Denmark, Norway, Finland and Switzerland topped the list.
Modi Demonstrated Indian Leadership on Climate Change: White House
Obama had an opportunity to meet Modi and the rest of the Indian delegation to those negotiations, to talk over what role India could play and what commitments India could make.
India, 170 Other Countries Sign Paris Climate Pact
Environment Minister Prakash Javadekar signed the agreement in the UN General Assembly hall at a high-level ceremony hosted by UN Secretary-General Ban Ki-moon.
WTO ruling on solar power: China supports India's stand
Environment Minister Prakash Javadekar said it was "unfortunate" that such a ruling had been given by WTO when India launched a 175 GW renewable energy programme and that India would file an appeal against it soon. April 7, 2016, 9:33 pm
Over 120 nations to sign climate deal in April: France
The deal only comes into force, however, if at least 55 countries responsible for at least 55% of global greenhouse gas emissions ratify the accord.
Satellite technology crucial in monitoring greenhouse gas emissions
The idea is to bring together all these ideas about satellite projects from different agencies to measure carbon and methane emissions in order to eventually achieve 'global coordination'.
Google, Apple, Microsoft back Obama's 'clean power' plan
Google, Apple, Microsoft and Amazon filed a brief with the DC Circuit court in support of the program, noting that collectively they are among the biggest US consumers of electricity.
In pics: Arctic ice cover shrinks to a record low, thanks to global warming
Bodhisattva Sen Roy, March 30, 2016, 8:40 pm
NASA to send scientists on an around-the-globe mission to study climate change
While Earth science field experiments are nothing new for NASA, the next six months will be a particularly active period with eight major new campaigns taking researchers around the world on a wide range of science investigations.
White House launches open government initiative
The administration launched a long-awaited open government initiative on Thursday, which included facelifts to federal Web sites, and garnered encouraging reviews from government transparency activists.
Details of the plan, which was first announced the day after President Obama took office, included three phases of public participation via cyberspace, e-mail and traditional mail.
The White House site called on citizens to brainstorm ideas for creating a more transparent, collaborative and participatory government using technology, and then vote on the concepts.
Online discussions are scheduled to start on June 3 based on what the White House identifies as the most compelling ideas. On June 15, the public will be able to collaborate on more formal recommendations through a wiki, a Web page that allows users to add and edit content.
Afterward, the White House will undertake a more traditional review of what the public engagement process yielded, said Beth Noveck, White House deputy chief technology officer for open government.
Government transparency groups had expected to see on Thursday a list of agency recommendations for establishing an open government. President Obama's announcement on Jan. 21 ordered the federal chief technology officer, along with the Office of Management and Budget director and the administrator of the General Services Administration, "to coordinate the development by appropriate executive departments and agencies, within 120 days, of recommendations for an open government directive."
Noveck said the launch is a reflection of what agency officials recommended. "It is not a finished draft on which people are being required to submit comments, as is the normal regulatory process," she said. "We did not request a formal agency statement of position."
Until this week, open government advocates had been criticizing the White House for dropping the ball on the initiative, because Noveck said in March that a public participation Web site was under construction.
She said the White House delayed the kick-off because of the late appointment of a key team member. Former Virginia Technology Secretary Aneesh Chopra, the federal CTO-designate, was not named until April 18.
"We are very much appreciative of the activists in the transparency community who push us and who keep us honest and make these demands," Noveck said. "It has long been in the works to make this an open process."
Officials with the Office of Science and Technology Policy, GSA and OMB wanted Chopra to be in Washington for the unveiling, she added. But Chopra, who has yet to be confirmed by the Senate, was not part of Thursday's announcement. Instead, Noveck, Chief Information Officer Vivek Kundra and the president's senior adviser Valerie Jarrett are listed on the site as leading the initiative.
Also on Thursday, Kundra debuted the Web site Data.gov, which is intended to provide the public with access to raw federal data in formats they can download and manipulate. Such information, including data sets on economic, health care and environmental topics, was previously available only in incompatible formats, scattered across multiple agency sites.
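For a sense of what "download and manipulate" means in practice, here is a minimal sketch of pulling one such data set into an analysis tool. It is purely illustrative: the URL is a placeholder, not an actual Data.gov endpoint, and pandas is simply one common choice of tool.

    # Illustrative only: the URL is a placeholder, not a real Data.gov address.
    import pandas as pd

    url = "https://example.gov/sample-dataset.csv"  # hypothetical downloadable CSV
    df = pd.read_csv(url)        # load the raw data into a table
    print(df.describe())         # quick summary statistics for each column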
While the offerings on the site are limited, some Web standards specialists expect the government to provide extensive live data feeds.
In addition, Regulations.gov, the official site for submitting comments electronically, proposed a redesign, with a new homepage and an enhanced search capability. The public can voice their opinions on the model until June 21.
Noveck said she does not expect a large amount of feedback on the scale of Obama's March online town hall, where the public submitted 104,000 questions about the economy and cast 3.6 million votes for the most popular questions.
"This is a process that is focused on a specific set of themes and topics," she said. "We'll see what kind of volume we'll get. Whereas, I hope we'll get a lot of participation, it might not be as much as we've had [in the past]."
Sean Moulton, director of federal information policy at OMB Watch, an open government group, said, "This is certainly a good step forward."
The White House spells out the process and offers a direction where it is headed, he added. "The roadmap seems pretty fast, which makes me think they have moved pretty far down the road in developing the ideas themselves," Moulton noted. "I appreciate them giving the public the chance to bring up some things they may have missed."
He said he wants to see what the agencies had to say in their suggestions.
A May 18 letter to Noveck signed by more than 60 government transparency groups, including OMB Watch, states, "Given President Obama's determination to create 'an unprecedented level of openness in government,' we ask you make publicly available comments received from agencies, agency employees, or the public related to the development of an open government directive."
Kevin Novak, co-chairman of the e-government interest group at the World Wide Web Consortium (W3C), an organization that encourages standardized and improved Web programming language, said the open government strategy provides "a great opportunity" for W3C and others in the technical community to offer help.
Technical specialists can assist agencies in modifying their information technology systems to comply with existing international standards, "so that they do not have to recreate the wheel," he said.
Contact: Susan Buchanan, 301-427-9000
Below-normal Atlantic hurricane season ends; active eastern and central Pacific seasons shatter records
December 1, 2015
The 2015 Atlantic hurricane season ended with a below-normal 11 named storms, four of which became hurricanes. (Credit: NOAA)
The Atlantic, eastern and central Pacific hurricane seasons officially ended yesterday, and as predicted, the Atlantic season stayed below normal with 11 named storms, while the eastern and central Pacific were above normal with both regions shattering all-time records.
Overall, the Atlantic hurricane season produced 11 named storms, including four hurricanes (Danny, Fred, Joaquin and Kate), two of which, Danny and Joaquin, became major hurricanes. Although no hurricanes made landfall in the United States this year, two tropical storms – Ana and Bill – struck the northeastern coast of South Carolina and Texas, respectively. Ana caused minor wind damage, beach erosion and one direct death in North Carolina, and Bill produced heavy rain and flooding while it moved across eastern Texas and Oklahoma. Hurricane Joaquin is the first Category 4 hurricane since 1866 to impact the Bahamas during the month of October.
NOAA scientists credit El Niño as the leading climate factor influencing both the Atlantic and Pacific seasons this year.
“El Niño produces a see-saw effect, suppressing the Atlantic season while strengthening the eastern and central Pacific hurricane seasons,” said Gerry Bell, Ph.D., lead seasonal hurricane forecaster at NOAA’s Climate Prediction Center. “El Niño intensified into a strong event during the summer and significantly impacted all three hurricanes seasons during their peak months.”
Bell said El Niño suppressed the Atlantic season by producing strong vertical wind shear combined with increased atmospheric stability, stronger sinking motion and drier air across the tropical Atlantic, all of which make it difficult for tropical storms and hurricanes to form and strengthen. However, El Niño fueled the eastern and central Pacific seasons this year with the weakest vertical wind shear on record.
The 2015 eastern Pacific hurricane season ended with an above-normal 18 named storms, 13 of which became hurricanes. (Credit: NOAA)
Active Eastern and Central Pacific seasons
The eastern Pacific saw 18 named storms, including 13 hurricanes, nine of which became major. This is the first year since reliable record keeping began in 1971 that the eastern Pacific saw nine major hurricanes. Hurricane Patricia was the strongest hurricane on record in the Western Hemisphere in terms of maximum wind speed at 200 miles per hour and lowest air pressure at 879 millibars. Hurricane Sandra, which formed at the tail end of the season, was the strongest hurricane in the eastern Pacific so late in the year, with a maximum sustained wind speed of 145 miles per hour.
The central Pacific shattered its records too, with 14 named storms, including eight hurricanes, five of which became major hurricanes, the most active season since reliable record-keeping began in 1971. Three major hurricanes (Ignacio, Kilo and Jimena) churned at the same time east of the International Dateline, the first time that was ever recorded.
Hurricane research
The Atlantic hurricane season provided opportunities for NOAA to conduct research to benefit future forecasts. Highlights include:
- More than 15 successful manned and unmanned aircraft missions into Hurricane Danny and Tropical Storm Erika to collect and provide real-time data to NOAA's National Hurricane Center and evaluate forecast models.
- Using the tail Doppler radar aboard the NOAA P-3 hurricane hunter aircraft, researchers documented high levels of wind shear across the Caribbean, a major factor contributing to the dissipation of both Erika and Danny.
- Researchers from NOAA's Atlantic Oceanographic and Meteorological Laboratory tested new instruments such as a wind LIDAR that complements radar observations by measuring wind velocity in regions without rain.
- NOAA's use of unmanned systems advanced this season with the first transmission of real-time data into operational hurricane models from NASA's Global Hawk, part of NOAA's Sensing Hazards with Operational Unmanned Technology (SHOUT) project.
- Below the ocean's surface, two underwater gliders collected and transmitted real-time data on Tropical Storm Erika's interaction with the upper ocean as the storm passed through the Caribbean.
- Hurricane hunter aircraft flew a total of 96 missions during the 2015 season; the U.S. Air Force Reserve 53rd Weather Squadron flew 75 missions and NOAA's Aircraft Operations Center flew 21 missions.
NOAA's mission is to understand and predict changes in the Earth's environment, from the depths of the ocean to the surface of the sun, and to conserve and manage our coastal and marine resources.
Hands on With Optoma's Pocket Pico Projector
This year's International Consumer Electronics Show saw the debut of some of the first prototype video projectors that are as small as cell phones. Now, just 10 months later, the first commercial products based on this cool technology are ready and so I settled down to watch a movie on one of them and came away impressed.
Optoma's Pico Projector really fits in your hand. It's no exaggeration to say it's the same size as a cell phone or, to compare it another way, about the same length as an iPod 5G but a little narrower and taller. The precise measurements are 10 centimeters long by 5 centimeters wide by 1.5 centimeters tall, and it weighs 114 grams. There are just two controls on the device, a power switch and a focus control. Start-up takes a couple of seconds and then it's ready to go. The LED light is not only the secret to its small size but also means there is no waiting around for it to warm up.
Video and audio are fed in through a three-terminal 2.5mm jack plug. Optoma supplies a cable with the jack plug on one end and female RCA connectors on the other for connecting up to anything that can supply a standard-definition PAL or NTSC composite video signal. I tried it with an iPod, set-top box and DVD player and it worked fine with all of them.
The unit can be set down on a surface, propped up with a book or attached to a tripod. The underside has a small screw in which a supplied adapter can be attached so it will work with a standard tripod.
Perched on my desk and projecting its image against a sheet of paper it delivered a bright and crisp image that was easy to enjoy. So much so that it wasn't until I got to the end of my second episode of "The Simpsons" that I remembered I was supposed to be working. And good timing too because the battery died half-way through the third episode.
After a recharge -- which takes 4 hours -- I selected half-brightness mode and was able to watch a 92 minute movie and 12 minutes of a TV show before the light went out. For the recharge, power is supplied via a mini USB connector but that's all the connector can be used for.
Audio is played through a small speaker built into the unit which, when it was on my desk, was just fine for a TV show but I think I'd want to hook it up to something more substantial if I was sitting back and watching a larger screen image, especially for a movie or music video. If you take it traveling with you then getting it to work on an iPod with a pair of headphones might be a hassle. The video and audio are combined on the iPod's headphone jack on older models so you'll have to come up with some patch leads to break off the video signal while still running audio to your headphones. This isn't a fault of Optoma's -- you'll have a problem using headphones when any other video display is hooked-up to an older iPod -- but something to watch out for. Newer iPods run the video signal out of the dock connector leaving the headphone socket free.
So, all-in-all it's an impressive product, especially if you're looking for something to make presentations to a small group of people while on the road. While it's bright enough to be seen at short range you'll have to turn the lights down if you want to get the full 60-inch image that the projector is capable of and won't replace a traditional projector for a room of people. For personal entertainment it works well too with the speaker but you'll have to figure out a cable if you want to use it on the move, say in a train or plane.
It will be available in major international markets from December.
The Optoma projector isn't the only one on the market. 3M, which was demonstrating the technology at CES, also has its own, the "MPro 110." In addition to the video jack it also has a VGA port, but it offers a shorter battery life of between 40 and 60 minutes, according to 3M.
Detection of cosmic effect may bring universe's formation into sharper focus
Posted March 20, 2012, 8:00 a.m., by Morgan Kelly
The first observation of a cosmic effect theorized 40 years ago could provide astronomers with a more precise tool for understanding the forces behind the universe's formation and growth, including the enigmatic phenomena of dark energy and dark matter.
A large research team from two major astronomy surveys reports in a paper submitted to the journal Physical Review Letters that scientists detected the movement of distant galaxy clusters via the kinematic Sunyaev-Zel'dovich (kSZ) effect, which has never before been seen. The paper was recently posted on the arXiv preprint database, and was initiated at Princeton University by lead author Nick Hand as part of his senior thesis. Fifty-eight collaborators from the Atacama Cosmology Telescope (ACT) and the Baryon Oscillation Spectroscopic Survey (BOSS) projects are listed as co-authors.
A large research team from two major astronomy surveys reports the first detection of the kinematic Sunyaev-Zel'dovich (kSZ) effect, which has a unique ability to pinpoint velocity and could be useful in understanding the expansion of the universe. As cosmic microwave background radiation left over from the Big Bang (top) moves through the universe, it becomes slightly redder and cooler if it passes through a galaxy cluster moving away from Earth (left). The radiation becomes bluer and hotter if it passes through a cluster moving toward Earth (right). The Atacama Cosmology Telescope (ACT) in Chile (bottom) detected the radiation background. The researchers combined data from the ACT project with data from the Baryon Oscillation Spectroscopic Survey (BOSS), which revealed galaxy cluster locations and changes in light due to movement. The study was initiated at Princeton and included 58 co-authors from ACT and BOSS. (Image by Sudeep Das, University of California-Berkeley)

Proposed in 1972 by Russian physicists Rashid Sunyaev and Yakov Zel'dovich, the kSZ effect results when the hot gas in galaxy clusters distorts the cosmic microwave background radiation — which is the glow of the heat left over from the Big Bang — that fills our universe. Radiation passing through a galaxy cluster moving toward Earth appears hotter by a few millionths of a degree, while radiation passing through a cluster moving away appears slightly cooler.
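In its simplest form, the size of the shift can be written in one line (a standard back-of-the-envelope expression; the numbers plugged in below are illustrative assumptions, not values taken from the study):

    \Delta T / T_{\mathrm{CMB}} \approx (v_r / c)\,\tau

where v_r is the cluster's line-of-sight velocity, c the speed of light, and \tau the small probability that a CMB photon scatters off an electron while crossing the cluster gas. With, say, v_r of about 600 km/s (v_r/c roughly 2 x 10^-3) and \tau of order 10^-3, the fractional shift is a few parts per million of the 2.7 K background, i.e. a few millionths of a degree, hotter or cooler depending on whether the cluster moves toward or away from us.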
Now that it has been detected, the kSZ effect could prove to be an exceptional tool for measuring the velocity of objects in the distant universe, the researchers report. It could provide insight into the strength of the gravitational forces pulling on galaxy clusters and other bodies. Chief among these forces are the still-hypothetical dark energy and dark matter, which are thought to drive the universe's expansion and the motions of galaxies.
In addition, the strength of the kSZ effect's signal depends on the distribution of electrons in and around galaxies. As a result, the effect also can be used to trace the location of atoms in the nearby universe, which can reveal how galaxies form.
The benefits of the kSZ effect stem from a unique ability to pinpoint velocity, said Hand, a 2011 Princeton graduate who is now a graduate student in astronomy at the University of California-Berkeley. The researchers detected the motion of galaxy clusters that are several billion light years away moving at velocities of up to 600 kilometers (372 miles) per second.
"Traditional methods of measuring velocities require very precise distance measurements, which is difficult. So, these methods are most useful when objects are closer to Earth," Hand said.
"One of the main advantages of the kSZ effect is that its magnitude is independent of a galaxy cluster's distance from us, so we can measure the velocity of an object's motion toward or away from Earth at much larger distances than we can now," Hand said. "In the future, it can provide an additional statistical check that is independent of our other methods of measuring cosmological parameters and understanding how the universe forms on a larger scale."
Pedro Ferreira, an astrophysics professor at the University of Oxford, called the paper a "beautiful piece of work" that neatly demonstrates an accurate method for studying the evolution of the universe and the distribution of matter in it. Ferreira had no role in the research but is familiar with it.
"This is the first time the kSZ effect has been unambiguously detected, which in and of itself is a really important result," Ferreira said.
"By probing how galaxies and clusters of galaxies move around in the universe, the kSZ effect is directly probing how objects gather and evolve in the universe," he said. "Therefore it is hugely dependent on dark matter and dark energy. You can then think of the kSZ effect as a completely new window on the large-scale structure of the universe."
Combining fundamentally different data

To find the kSZ effect, the researchers combined and analyzed data from the ACT and BOSS projects. The kSZ effect is so small that it is not visible in the interaction of an individual galaxy cluster with the cosmic microwave background (CMB), but it can be detected by compiling signals from several clusters, the researchers discovered.
ACT is a custom-designed 6-meter telescope in Chile built to produce a detailed map of the CMB using microwave frequencies. The ACT collaboration involves a dozen universities with leading contributions from Princeton and the University of Pennsylvania, and includes important detector technology from NASA's Goddard Space Flight Center, the National Institute of Standards and Technology, and the University of British Columbia.
BOSS, a visible-light survey based at the Apache Point Observatory in New Mexico, has captured spectra of thousands of luminous galaxies and quasars to improve understanding of the large-scale structure of the universe. BOSS is a part of the Sloan Digital Sky Survey III, the third phase of the most productive astronomy project in history, and a joint effort among 27 universities and institutions from around the world.
For the current project, researchers from ACT compiled a catalog of 27,291 luminous galaxies from BOSS that appeared in the same region of sky mapped by ACT between 2008 and 2010. Because each galaxy likely resides in a galaxy cluster, their positions were used to determine the locations of clusters that would distort the CMB radiation that was detected by ACT.
Hand used the 7,500 brightest galaxies from the BOSS data to uncover the predicted kSZ signal produced as galaxy clusters interacted with CMB radiation. ACT collaborator Arthur Kosowsky, an associate professor of physics and astronomy at the University of Pittsburgh, suggested a particular mathematical average that reflects the slight tendency for pairs of galaxy clusters to move toward each other due to their mutual gravitational attraction, which made the kSZ effect more apparent in the data.
The overlap of data from the two projects was essential because the amplitude of the signal from the kSZ effect is so small, said ACT collaborator David Spergel, professor and department chair of astrophysical sciences at Princeton, as well as Hand's senior thesis adviser. By averaging the ACT's CMB maps with thousands of BOSS galaxy locations, the kSZ signal got stronger in comparison to unrelated signals and measurement errors, Spergel said.
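To see why averaging over thousands of positions pulls such a faint signal out of the noise, consider a toy numerical sketch (the numbers are invented for illustration, and the actual analysis used a pairwise velocity estimator rather than a simple average):

    import numpy as np

    rng = np.random.default_rng(0)
    n_clusters = 7500        # roughly the number of bright BOSS galaxies used
    signal_uK = 1.0          # assumed per-cluster kSZ amplitude, in microkelvin
    noise_uK = 30.0          # assumed per-cluster map noise, in microkelvin

    measurements = signal_uK + noise_uK * rng.standard_normal(n_clusters)
    stacked = measurements.mean()                 # averaging beats the noise down
    uncertainty = noise_uK / np.sqrt(n_clusters)  # about 0.35 microkelvin here
    print(f"stacked estimate: {stacked:.2f} +/- {uncertainty:.2f} microkelvin")

A single measurement is hopeless (the assumed noise is 30 times the signal), but the uncertainty on the average shrinks as one over the square root of the number of clusters, so a roughly 1-microkelvin signal stands out once a few thousand clusters are combined.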
"The kSZ signal is small because the odds of a microwave hitting an electron while passing through a galaxy cluster are low, and the change in the microwave's energy from this collision is slight," said Spergel, the Charles A. Young Professor of Astronomy on the Class of 1897 Foundation. "Including several thousand galaxies in the dataset reduced distortion and we were left with a strong signal."
In fact, if analyzed separately, neither the ACT nor the BOSS data would have revealed the kSZ effect, Kosowsky said. "This result is a great example of an important scientific discovery relying on the rich data from more than one large astronomy survey," he said. "The researchers of the ACT and BOSS collaborations did not have this in mind when they first designed their experiments."
That is because the ACT and BOSS projects are fundamentally different, which makes the researchers' combination of data unique, said SDSS-III scientific spokesman Michael Wood-Vasey, an assistant professor of physics and astronomy at the University of Pittsburgh. The projects differ in the cosmic objects studied, the method of gathering data, and even the wavelengths in which they operate — microwaves for ACT, visible-light waves for BOSS.
"Collaborations between projects of this scale aren't common in my experience," Wood-Vasey said. "This also was a collaboration after the fact in the sense that the data-acquisition strategies for these projects was already set without thinking about this possibility. The insight of the key researchers on this project allowed them to combine the two datasets and make this measurement."
The paper was posted March 19 to the arXiv preprint database maintained by Cornell University. Support for ACT comes primarily from the National Science Foundation. ACT operates in the Chajnantor Science Preserve in northern Chile under the auspices of the Comisión Nacional de Investigación Científica y Tecnológica. Funding for SDSS-III has been provided by the Alfred P. Sloan Foundation, the National Science Foundation, the U.S. Department of Energy Office of Science and participating institutions.
What does the lag of CO2 behind temperature in ice cores tell us about global warming?
Filed under: FAQ
Paleoclimate — group @ 3 December 2004

This is an issue that is often misunderstood in the public sphere and media, so it is worth spending some time to explain it and clarify it. At least three careful ice core studies have shown that CO2 starts to rise about 800 years (600-1000 years) after Antarctic temperature during glacial terminations. These terminations are pronounced warming periods that mark the ends of the ice ages that happen every 100,000 years or so. Does this prove that CO2 doesn't cause global warming? The answer is no.
The reason has to do with the fact that the warmings take about 5000 years to be complete. The lag is only 800 years. All that the lag shows is that CO2 did not cause the first 800 years of warming, out of the 5000 year trend. The other 4200 years of warming could in fact have been caused by CO2, as far as we can tell from this ice core data. The 4200 years of warming make up about 5/6 of the total warming. So CO2 could have caused the last 5/6 of the warming, but could not have caused the first 1/6 of the warming.
It comes as no surprise that other factors besides CO2 affect climate. Changes in the amount of summer sunshine, due to changes in the Earth’s orbit around the sun that happen every 21,000 years, have long been known to affect the comings and goings of ice ages. Atlantic ocean circulation slowdowns are thought to warm Antarctica, also.
From studying all the available data (not just ice cores), the probable sequence of events at a termination goes something like this. Some (currently unknown) process causes Antarctica and the surrounding ocean to warm. This process also causes CO2 to start rising, about 800 years later. Then CO2 further warms the whole planet, because of its heat-trapping properties. This leads to even further CO2 release. So CO2 during ice ages should be thought of as a “feedback”, much like the feedback that results from putting a microphone too near to a loudspeaker. In other words, CO2 does not initiate the warmings, but acts as an amplifier once they are underway. From model estimates, CO2 (along with other greenhouse gases CH4 and N2O) causes about half of the full glacial-to-interglacial warming.
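A rough consistency check on that last figure, using commonly quoted round numbers rather than anything from the ice cores themselves: take an equilibrium warming of about 3°C per doubling of CO2 and a glacial-to-interglacial CO2 rise from roughly 190 to 280 ppmv. Then

    \Delta T_{\mathrm{CO_2}} \approx 3\,^{\circ}\mathrm{C} \times \ln(280/190)/\ln 2 \approx 1.7\,^{\circ}\mathrm{C}

which is a substantial fraction of the roughly 4 to 7°C global-mean difference between glacial and interglacial climates; adding the smaller contributions of CH4 and N2O pushes the greenhouse-gas share toward the "about half" quoted above. All of the input numbers here are assumptions chosen for illustration.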
So, in summary, the lag of CO2 behind temperature doesn’t tell us much about global warming. [But it may give us a very interesting clue about why CO2 rises at the ends of ice ages. The 800-year lag is about the amount of time required to flush out the deep ocean through natural ocean currents. So CO2 might be stored in the deep ocean during ice ages, and then get released when the climate warms.]
To read more about CO2 and ice cores, see Caillon et al., 2003, Science magazine
Guest Contributor: Jeff Severinghaus
Professor of Geosciences
University of California, San Diego.
Update May 2007: We have a fuller exposition of this on a more recent post.
4 Responses to "What does the lag of CO2 behind temperature in ice cores tell us about global warming?"
1 alex says: 12 Dec 2004 at 3:01 AM
Although I don't have data to support the following claims, and can't recall specific papers on the topic (undoubtedly my following thoughts are some sort of compilation of papers I have read), a reasonable guess for the lag has to be based on the biological response of phytoplankton to changing environmental conditions. Offhand, it seems reasonable that as the Milankovitch (orbital) forcings initiate the onset of the glacial termination there may be some sort of decrease in the zonal sea surface temperature gradient which would in turn lead to less zonal winds. Less zonal winds would seem to reduce vertical mixing in the surface ocean and induce a subsequent stratification of surface waters. The surface waters may be additionally stratified by an increase in melt water remaining at the surface (reducing the salinity of surface water, consequently increasing water column stability). The combination of these factors does not bode well for carbon fixers such as phytoplankton, who are dependent on intense vertical mixing as a source of upwelled nutrients. Less carbon fixation would result in more CO2 remaining in the atmosphere as the glacial termination proceeds, acting as a positive feedback to the initial forcing. Of course, this sort of large scale response would take a long time to occur, and may in fact not happen until well into the glacial termination. Just some thoughts. Perhaps someone with more study in the field could present the conventional wisdom on this topic.
2 Ferdinand Engelbeen says: 12 Dec 2004 at 2:12 PM
May I disagree with this article?
The correlation between CO2 and temperature in the pre-industrial 420,000 years, according to the Vostok ice core, is surprisingly linear (some 8 ppmv for 1°C, see: http://home.scarlet.be/~ping5859/correlation.html ) and includes shorter lags (~800 years) for glacial-interglacial transitions and longer ones (up to thousands of years) for CO2 vs. temperature for interglacial-glacial transitions. See e.g. the previous interglacial at: http://home.scarlet.be/~ping5859/co2_temp_ice.html
While the fast glacial-interglacial transitions may hide which leads what and to what extent, the much slower (depending of the length of the interglacial) interglacial-glacial transitions make it clear. When the temperature decreases (7°C), CO2 levels remain high. When CO2 levels fall (some 50 ppmv), there is no measurable effect on temperature at all. This contradicts the possibility that CO2 plays an important role in the onset of both glaciations and deglaciations.
[Response: The above statement is incorrect. The correlation being discussed is between CO2 and deuterium/hydrogen isotope ratios in snow (as archived in an ice core), an imperfect measure of temperature. It has been shown that when the deuterium/hydrogen ratios are corrected for the influence of temperature changes at the ocean surface (from which the water that fell as snow originally evaporated), the correlation becomes even more remarkable, and the apparent drop in temperature thousands of years before the drop in CO2 disappears. See Cuffey and Vimeux, Science, 2002.]
Further, even if we assume that the average of current climate models is right, the temperature increase from a CO2 doubling is around 3°C, or ~1°C for a 100 ppmv rise. Which is 1/10th of the >10°C rise seen in the last deglaciation.
Moreover, the Taylor Dome ice core also reveals a lag of 1200 +/- 700 years between CO2 and temperature on shorter time scales (D-O events). See: http://www.ngdc.noaa.gov/paleo/taylor/indermuehle00grl.pdf
Rapid temperature swings like the end of the Younger Dryas (probably less than a few decades) are followed by CO2 changes.
Even the past 1,000 years showed a lag of ~50 years of CO2 vs. temperature for the Law Dome ice core, but the temperature data disappeared from the Internet…
And since the industrial revolution, sea surface temperature changes, like El Niño, induce peaks in the CO2 increase rate, some 6 months after the onset of the event…
Thus all together, all historic data point to a lag of CO2 after temperature changes, without much influence of CO2 on temperature when that happens…
3 David Holland says: 13 Dec 2004 at 11:23 AM
Wow! Are you really saying that we have no idea what starts to warm up our world from an ice age but know with near certainty what has caused the warming of the last three decades?
From my now somewhat distant scientific education I recall that it takes some 80 times more heat to turn the ice to water than to raise its temperature by a single centigrade degree. With sea levels 125m or so lower a significant proportion of the planet's water must have been in the form of ice. The 'unknown process' you refer to would have had to supply far more extra heat than the CO2 feedback, which was able to take over some 800 years later.
4 tribe.net: www.realclimate.org says: 8 Feb 2005 at 1:06 PM
Re: Climate Change Deni
"www.commondreams.org/headlin…/0922-02.htm"
http://www.cnn.com/2003/TECH/sc……..
Meeting Aichi Biodiversity Targets For Protected Areas
PLOS

Habitat loss is a primary driver of biodiversity loss – so it isn't surprising that optimising the amount of protected land is high on policy-makers' priority lists. However, according to research to be published in the Open Access journal PLOS Biology on June 24 by Oscar Venter and colleagues, many protected areas are established in locations of low economic value, failing to protect the imperilled biodiversity found on more valuable land. More of the earth's land surface is set to be protected in the next decade, but the trend of using poor-quality land seems set to continue. How can we optimise the number of species protected in the most cost-efficient way? Venter & colleagues believe they have discovered part of the answer.
In 2010 the Convention on Biological Diversity adopted a new set of goals for the next decade – the Aichi Targets. These include the ambitious Target 11, to expand the global protected area network from the current 13% of the earth's land surface (not including Antarctica) to 17%, and Target 12, to halt the extinction of species already threatened by 2020. It follows that protecting habitats should also protect endangered species, so Target 11 and Target 12 should be interrelated. However, Venter and colleagues found that if the current methods of selecting new protected areas were continued (i.e. targeting land with little potential for agriculture), then achieving the 17% Aichi land preservation target would protect a mere 249 more threatened vertebrate species than are covered by current networks. This will do little to achieve Target 12.
"Our study shows that existing protected areas are performing very poorly in terms of protecting the world's most threatened species," said Dr. Oscar Venter, lead author of the study. "This is concerning, as protected areas are meant to act as strongholds for vulnerable species, which clearly they are not."
Venter & colleagues estimated that the total cost implication of protecting all 4,118 of the vertebrates they considered in their study would be $43 billion in 'lost opportunity' costs (in terms of not using the land for agriculture) – around 750% more than carrying on with the current strategy. How then can we reconcile the current low-cost but low-conservation-benefit scenario with the exceedingly high economic cost of achieving Aichi Target 11's 17% land protection goal? Importantly, the study highlighted that small increments of higher lost-opportunity cost lead to proportionately larger increments of adequate protection of threatened species. For instance, achieving a 400% increase in the adequate protection of threatened species only costs 50% more, in terms of lost-opportunity cost, than the 'business-as-usual' strategy.
There are caveats to the study; for example only mammals were considered and many species are threatened by processes other than habitat loss. However, their results point to the possibility of a 'happy medium' where countries can gain significant biodiversity benefits with minimal lost-opportunity costs. They provide a piece of a better road map towards making these goals achievable - where other conservation targets have failed.
Hot Seat: Rallying Around Radio Spectrum
Michael Sherman, president and CEO of AES Corp., a provider of wide area wireless mesh communications equipment, discusses three bills under consideration in Congress that could potentially result in the auctioning of radio spectrum used by the alarm industry.
By Rodney Bosch
· June 14, 2011
Michael Sherman, President and CEO, AES Corp.
Three bills under consideration in Congress could potentially result in the auctioning of radio spectrum used by the alarm industry. A House bill specifically calls for auctioning of the 450 to 470MHz spectrum, which is used to transmit signals from homes and businesses to monitoring centers. Two Senate bills would auction unspecified spectrum to finance a public safety network or other programs. Michael Sherman, president and CEO of AES Corp., a provider of wide area wireless mesh communications equipment, discusses the subject. The company’s AES-IntelliNet solution is a primary user of a portion of the spectrum being considered for auctioning.
Are you confident the alarm industry can safeguard the frequencies it uses?
I am totally confident that the frequencies in question will remain as they are for the security industry. Basically, the frequencies were selected in error by a consultant to Congress. What may not be understood is that there are many millions of pieces of equipment on these bands - from the millions of drive-through establishments to the railroad industries that rely on the availability of these frequencies. The use of this band by the alarm industry represents less than 1 percent of the total usage of the frequency band in question. The users that represent the other 99 percent are also dedicated and focused on their continued use of this frequency.
These frequencies represent almost every American institution in the country. To give you an indication, here are some of the constituents that would be affected by selling off these frequencies: all public safety agencies; the entire forest industry; hundreds of fire and police departments for cities and towns across the nation; the utility industry; the highway departments in many cities and towns; the petroleum industry; 800,000 radio amateurs; plus, the McDonald's and Dunkin' Donuts drive-throughs use these frequencies.
To remove these frequencies from use would be an attack on the fabric and character of the American experience. Also, there are already indications that these frequencies are off the list for all the above reasons and more. Politics being what it is, no politician can attack such a broad cross section of America especially if other options are available, and there are many other options available.
What can installing security contractors do to help oppose the auctioning?
The CSAA [Central Station Alarm Association] and all the other alarm industry institutions have mobilized their membership to write their individual members of Congress and senators to notify them that we are not in favor of this bill and that it should be withdrawn from consideration. This effort has been going on for months and should not stop until the bill and/or the provision for these frequency bands is permanently removed. This is and will continue to be a grassroots effort — from the smallest state alarm associations to the national alarm associations.
If an auction was to proceed, how soon would its impact be felt?
First, one must know that the bands will not just disappear; they will be relocated to a different part of the radio spectrum. The FCC has done this before when it moved the broadcasters out of the 1.9GHz band in favor of PCS radio, the forerunner of cellular. In order to take over these frequencies the following was required to happen first: A new piece of spectrum that was unused was located; the users of the frequency were given a few years to relocate; the cell company had to pay the broadcasters the full cost to relocate onto the new spectrum. This included the cost of the equipment, labor and all costs associated with the move. This worked for this band as moving a few hundred broadcasters was cheap relative to the value of this new frequency.
What is of key importance here is that if a move was to become a reality, then the auction winner would be required to pay the alarm companies (as well as the millions of other users) their full cost to move to the new band.
This is not what happened in the past with the AMPS cellular sunset. In that sunset the cellular companies turned off the cell service and forced the alarm companies to replace all the equipment at the alarm companies’ expense. This will again happen when the GSM networks are replaced with the new high speed LTE [4G] networks. This time the industry will be stuck replacing millions of GSM radios at their cost. If the FCC can find 40MHz of unused frequency to move the millions of users to this new piece of spectrum, then why don’t they just use this new spectrum from the very start? This is why it is becoming clear to everyone that moving millions of users is not politically or financially a good idea.
Will the demise of POTS allow for new opportunities in the alarm industry?

This is an opportunity for the alarm industry to embrace and add new technologies to its service offerings. There is pressure on the industry to do more than it has done in the past. I do believe that the need to deal with the changing communication infrastructure will require the dealers to rethink how they do things and perhaps stretch a little more when it comes to moving away from their classic mode of operation.
Rodney Bosch
Although Bosch’s name is quite familiar to those in the security industry, his previous experience has been in daily newspaper journalism. Prior to joining SECURITY SALES & INTEGRATION in 2006, he spent 15 years with the Los Angeles Times, where he performed a wide assortment of editorial responsibilities, including feature and metro department assignments as well as content producing for latimes.com. Bosch is a graduate of California State University, Fresno with a degree in Mass Communication & Journalism. In 2007, he successfully completed the National Burglar and Fire Alarm Association’s National Training School coursework to become a Certified Level I Alarm Technician.
Envisat Image: Siberia As Seen From Space
From: European Space Agency. Posted: Friday, November 30, 2012.

North central Siberia is pictured in this Envisat image from 5 March 2012. An enormous area in north Asia, Siberia spreads from the Urals in the west to the Okhotsk Sea in the east, from the Arctic Ocean in the north to the borders of Kazakhstan, Mongolia and China in the south.
In the lower-left corner we can see the Yenisei river, which flows north into the Kara Sea (not pictured). The Yenisei is considered to be the boundary between eastern and western Siberia. The majority of the area pictured lies above the Arctic Circle. This is also an area of continuous permafrost, where the soil is at or below freezing throughout the year.
About half of the world's underground organic carbon is found in northern permafrost regions. This is more than double the amount of carbon in the atmosphere in the form of the greenhouse gases carbon dioxide and methane.
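For a rough sense of the quantities involved (ballpark figures quoted here as assumptions, not taken from the image caption): using the standard conversion of about 2.13 gigatonnes of carbon (GtC) per ppm of atmospheric CO2,

    390\ \mathrm{ppm} \times 2.13\ \mathrm{GtC/ppm} \approx 830\ \mathrm{GtC}

in the atmosphere around 2012, while northern permafrost soils are commonly estimated to hold on the order of 1,500-1,700 GtC, which is indeed more than double the atmospheric store.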
The effects of climate change are most severe and rapid in the Arctic, causing the permafrost to thaw. When it does, it releases greenhouse gases into the atmosphere, exacerbating the effects of climate change.
Although permafrost cannot be directly measured from space, factors such as surface temperature, land cover and snow parameters, soil moisture and terrain changes can be captured by satellites. This image also shows part of the Putorana Mountains and the Putoransky State Nature Reserve. In the native language of the Evenks, 'Putorana' means 'the country of lakes with steep banks.'
Listed on the UNESCO World Heritage List, this area contains arctic and subarctic ecosystems, as well as a major reindeer migration route.
150-foot asteroid will buzz Earth, no need to duck
Published February 08, 2013, 7:00 AM | Updated February 08, 2013, 7:54 AM
CAPE CANAVERAL, Fla. (AP) — A 150-foot-wide asteroid will come remarkably close to Earth next week, even closer than high-flying communication and weather satellites. It will be the nearest known flyby for an object of this size.

But don't worry. Scientists promise the megarock will be at least 17,100 miles away when it zips past next Friday.

"No Earth impact is possible," Donald Yeomans, manager of NASA's Near-Earth Object program at Jet Propulsion Laboratory in Pasadena, Calif., said Thursday.

Even the chance of an asteroid-satellite run-in is extremely remote, Yeomans and other scientists noted. A few hundred satellites orbit at 22,300 miles, higher than the asteroid's path, although operators are being warned about the incoming object for tracking purposes.

"No one has raised a red flag, nor will they," Yeomans told reporters. "I certainly don't anticipate any problems whatsoever."

Impossible to see with the naked eye, the asteroid is considered small as these things go. By contrast, the one that took out the dinosaurs 65 million years ago was 6 miles wide.

Yet Asteroid 2012 DA14, as it's known for its discovery date, still could pack a wallop.

If it impacted Earth — which it won't, scientists were quick to add Thursday — it would release the energy equivalent of 2.4 million tons of TNT and wipe out 750 square miles. That's what happened in Siberia in 1908, when forest land around the Tunguska River was flattened by a slightly smaller asteroid that exploded about five miles above ground.

The likelihood of something this size striking Earth is once in every 1,200 years. A close, harmless encounter like this is thought to occur every 40 years.

The bulk of the solar system's asteroids are located between the orbits of Mars and Jupiter, and remain stable there for billions of years. Some occasionally pop out, though, into Earth's neighborhood.

The closest approach of this one will occur next Friday afternoon, Eastern time, over Indonesia.

There won't be much of a show. The asteroid will zip by at 17,400 mph. That's roughly eight times faster than a bullet from a high-speed rifle.

The asteroid will be invisible to the naked eye and even with binoculars and telescopes will appear as a small point of light. The prime viewing locations will be in Asia, Australia and eastern Europe.

Observers in the U.S. can pretty much forget it. Astronomers using NASA's deep-space antenna in California's Mojave Desert will have to wait eight hours after the closest approach to capture radar images.

Scientists welcome whatever pictures they get. The asteroid offers a unique opportunity to observe something this big and close, and any new knowledge will help if and when another killer asteroid is headed Earth's way.

The close approach also highlights the need to keep track of what's out there, if for no other reason than to protect the planet.

NASA's current count of near-Earth objects: just short of 10,000, the result of a concentrated effort for the past 15 years.
That's thought to represent less than 10 percent of the objects out there.

No one has ruled out a serious Earth impact, although the probability is said to be extremely low.

"We don't have all the money in the world to do this kind of work" for tracking and potentially deflecting asteroids, said Lindley Johnson, an executive with the Near-Earth Object observations program in Washington.

Indeed, when asked about NASA's plans to send astronauts to an asteroid in the decades ahead, as outlined a few years ago by President Barack Obama, Johnson said the space agency is looking at a number of options for human explorations.

One of the more immediate steps, planned for 2016, is the launch of a spacecraft to fly to a much bigger asteroid, collect samples and return them to Earth in 2023.

As for Asteroid 2012 DA14 — discovered last year by astronomers in Spain — scientists suspect it's made of silicate rock, but aren't sure. Its shape and precise size also are mysteries.

What they do know with certainty: "This object's orbit is so well known that there's no chance of a collision," Yeomans repeated during Thursday's news conference.

Its close approach, in fact, will alter its orbit around the sun in such a way as to keep it out of Earth's neighborhood, at least in the foreseeable future, Yeomans said.

Johnson anticipates no "sky is falling thing" related to next week's flyby.

He and other scientists urged journalists to keep the close encounter in perspective.

"Space rocks hit the Earth's atmosphere on a daily basis. Basketball-size objects come in daily. Volkswagen-size objects come in every couple of weeks," Yeomans said.

The grand total of stuff hitting the atmosphere every day? "About 100 tons," according to Yeomans, though most of it arrives harmlessly as sand-sized particles.
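As a back-of-the-envelope check on the 2.4-million-ton figure quoted above, the kinetic energy of a rock this size can be estimated from first principles. The density and impact speed below are assumptions chosen for illustration (an object falling to Earth would arrive faster than the 17,400 mph flyby speed), not values from NASA:

    import math

    diameter_m = 45.0                 # about 150 feet
    density = 3000.0                  # kg per cubic meter, assumed stony rock
    radius = diameter_m / 2
    mass = density * (4 / 3) * math.pi * radius**3    # roughly 1.4e8 kg

    impact_speed = 12_700.0           # m/s, assumed; at least Earth's escape speed
    energy_joules = 0.5 * mass * impact_speed**2
    tons_tnt = energy_joules / 4.184e9                # 1 ton of TNT = 4.184e9 joules
    print(f"{tons_tnt / 1e6:.1f} million tons of TNT")  # prints roughly 2.8

With these assumptions the answer lands in the same few-megaton range as the figure in the story; a somewhat lower assumed density or speed brings it closer to 2.4.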
Sony is embarrassed again as Japan blocks Playstation Network restart
Gamers surely can't take any more of this
Asavin Wattanajantra 16 May 2011
JAPANESE GIANT Sony's saga of embarrassment over the hacking of its Playstation Network continues, with Japan refusing to let it switch the network back on until the company sorts it out.
Dow Jones Newswire quoted a Japanese regulatory official, revealing that Sony still hasn't come forward to say it has completed the security measures it promised at the beginning of May. Sony was also asked to come up with something to convince people that it will be safe to use the network now and in the future, and it hasn't done this either.
The Japanese must be a little more strict with Sony than countries in other regions that have already let it start switching the network back on again, such as the US, Europe, Australia, New Zealand and the Middle East.
Last month saw the start of the debacle when a severe security breach of the Playstation Network and Qriocity services compromised the confidential financial information of as many as 77 million users, including names, passwords and possibly credit card information.
Around a week later Sony admitted that 25 million more users with Sony Online Entertainment accounts might have had their information stolen. As if that wasn't bad enough, it doesn't account for the inconvenience and anger of Playstation gamers around the world who were prevented from using the gaming network until now.
The fact that Sony hasn't done what its home country clearly asked it to do shows that even if the firm is learning some lessons from this whole wretched incident, it's learning far too slowly. µ
Project aims to track big city carbon footprints
LOS ANGELES – Every time Los Angeles exhales, odd-looking gadgets anchored in the mountains above the city trace the invisible puffs of carbon dioxide, methane and other greenhouse gases that waft skyward. Halfway around the globe, similar contraptions atop the Eiffel Tower and elsewhere around Paris keep a pulse on emissions from smokestacks and automobile tailpipes. And there is talk of outfitting Sao Paulo, Brazil, with sensors that sniff the byproducts of burning fossil fuels.

It's part of a budding effort to track the carbon footprints of megacities, urban hubs with over 10 million people that are increasingly responsible for human-caused global warming.

For years, carbon dioxide and other greenhouse pollutants have been closely monitored around the planet by stations on the ground and in space. Last week, worldwide levels of carbon dioxide reached 400 parts per million at a Hawaii station that sets the global benchmark – a concentration not seen in millions of years. Now, some scientists are eyeing large cities – with LA and Paris as guinea pigs – and aiming to observe emissions in the atmosphere as a first step toward independently verifying whether local – and often lofty – climate goals are being met.

For the past year, a high-tech sensor poking out from a converted shipping container has stared at the Los Angeles basin from its mile-high perch on Mount Wilson, a peak in the San Gabriel Mountains that's home to a famous observatory and communication towers.

Like a satellite gazing down on Earth, it scans more than two dozen points from the inland desert to the coast. Every few minutes, it rumbles to life as it automatically sweeps the horizon, measuring sunlight bouncing off the surface for the unique fingerprint of carbon dioxide and other heat-trapping gases. In a storage room next door, commercially available instruments that typically monitor air quality double as climate sniffers. And in nearby Pasadena, a refurbished vintage solar telescope on the roof of a laboratory on the California Institute of Technology campus captures sunlight and sends it down a shaft 60 feet below where a prism-like instrument separates out carbon dioxide molecules.

On a recent April afternoon atop Mount Wilson, a brown haze hung over the city, the accumulation of dust and smoke particles in the atmosphere. "There are some days where we can see 150 miles way out to the Channel Islands and there are some days where we have trouble even seeing what's down here in the foreground," said Stanley Sander, a senior research scientist at the NASA Jet Propulsion Laboratory.

What Sander and others are after are the mostly invisible greenhouse gases spewing from factories and freeways below.

There are plans to expand the network. This summer, technicians will install commercial gas analyzers at a dozen more rooftops around the greater LA region. Scientists also plan to drive around the city in a Prius outfitted with a portable emission-measuring device and fly a research aircraft to pinpoint methane hotspots from the sky. (A well-known natural source is the La Brea Tar Pits in the heart of LA, where underground bacteria burp bubbles of methane gas to the surface.)

Six years ago, elected officials vowed to reduce emissions to 35 percent below 1990 levels by 2030 by shifting to renewable energy and weaning the city's dependence on out-of-state coal-fired plants, greening the twin port complex and airports and retrofitting city buildings.
It�s impractical to blanket the city with instruments so scientists rely on a handful of sensors and use computer models to work backward to determine the sources of the emissions and whether they�re increasing. They won�t be able to zero in on an offending street or a landfill, but they hope to be able to tell whether switching buses from diesel to alternative fuel has made a dent. Project manager Riley Duren of JPL said it�ll take several years of monitoring to know whether LA is on track to reach its goal. Scientists not involved with the project say it makes sense to dissect emissions on a city level to confirm whether certain strategies to curb greenhouse gases are working. But they�re divided about the focus. Allen Robinson, an air quality expert at Carnegie Mellon University, said he prefers more attention paid to measuring a city�s methane emissions since scientists know less about them than carbon dioxide release. Nearly 58 percent of California�s carbon dioxide emissions in 2010 came from gasoline-powered vehicles, according to the U.S. Energy Department�s latest figures.In much of the country, coal �usually as fuel for electric power � is a major source of carbon dioxide pollution. But in California, it�s responsible for a tad more than 1 percent of the state�s carbon dioxide emissions. Natural gas, considered a cleaner fuel, spews one third of the state�s carbon dioxide.Overall, California in 2010 released about 408 million tons of carbon dioxide into the air. The state�s carbon dioxide pollution is greater than all but 20 countries and is just ahead of Spain�s emissions. In 2010, California put nearly 11 tons of carbon dioxide into the air for every person, which is lower than the national average of 20 tons per person.Gregg Marland, an Appalachian State University professor who has tracked worldwide emissions for the Energy Department, said there�s value in learning about a city�s emissions and testing techniques. �I don�t think we need to try this in many places, but we have to try some to see what works and what we can do,� he said.Launching the monitoring project came with the usual growing pains. In Paris, a carbon sniffer originally tucked away in the Eiffel Tower�s observation deck had to be moved to a higher floor that�s off-limits to the public after tourists� exhaling interfered with the data. So far, $3 million have been spent on the U.S. effort with funding from federal, state and private groups. The French, backed by different sponsors, have spent roughly the same.Scientists hope to strengthen their ground measurements with upcoming launches of Earth satellites designed to track carbon dioxide from orbit. The field experiment does not yet extend to China, by far the world�s biggest carbon dioxide polluter. But it�s a start, experts say. With the focus on megacities, others have worked to decipher the carbon footprint of smaller places like Indianapolis, Boston and Oakland, where University of California, Berkeley researchers have taken a different tack and blanketed school rooftops with relatively inexpensive sensors. �We are at a very early stage of knowing the best strategy, and need to learn the pros and cons of different approaches,� said Inez Fung, a professor of atmospheric science at Berkeley who has no role in the various projects. | 科技 |
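The "work backward" step described above is, at its heart, an inverse problem: given readings from a handful of sensors and a model of how strongly each candidate source influences each sensor, solve for the source strengths. The toy sketch below illustrates the idea with a simple least-squares inversion; the sensitivity matrix, source categories and readings are invented for the example and are not the project's actual model.

```python
# Toy illustration of "working backward" from sensor readings to emission sources.
# All numbers below are invented; a real inversion uses an atmospheric transport
# model and many more measurements.
import numpy as np

# G[i, j] = modeled contribution of source j (per unit of emission) to sensor i.
G = np.array([
    [0.8, 0.1, 0.3],   # mountain-top spectrometer
    [0.2, 0.9, 0.4],   # rooftop gas analyzer
    [0.5, 0.3, 0.7],   # mobile (vehicle-mounted) sensor
])
sources = ["freeways", "power plants", "ports and airports"]

# Observed enhancements above background at each sensor (arbitrary units).
observed = np.array([14.0, 11.0, 13.0])

# Least-squares estimate of how strongly each source must be emitting
# for the modeled signals to match the observations.
emissions, residuals, rank, _ = np.linalg.lstsq(G, observed, rcond=None)

for name, strength in zip(sources, emissions):
    print(f"{name}: estimated emission strength {strength:.1f}")
```

Real urban inversions add atmospheric transport modeling, prior emission inventories, uncertainty weighting and long time series, but the underlying logic is the same: adjust the estimated sources until the modeled signals match what the sensors record.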
2016-40/3982/en_head.json.gz/16717 | http://www.timesunion.com/opinion/article/Group-views-climate-change-as-an-economic-4983449.php
Group views climate change as an economic opportunity
The following is from a New York Times editorial:
In an effort to compensate for the failure of central governments to address the dangers of climate change with comprehensive national policies, cities, states and regions have developed their own strategies. California's ambitious plan aims to reduce emissions 80 percent by 2050 by requiring cleaner cars, more energy-efficient buildings and renewable fuels. Nine states have joined together to cut power-plant emissions.
Now a Canadian province, British Columbia, has joined California, Oregon and Washington in a new regional scheme called the Pacific Coast Action Plan on Climate and Energy. The agreement is not legally binding and contains no new money, but the overall objective is to step up the adoption of clean energy and link carbon pricing plans. California, like the northeastern states, has a cap-and-trade program; British Columbia has had a carbon tax for five years. The governors of Oregon and Washington, both Democrats, are working on ways to price carbon, though they could face tough going in their state legislatures.
The new coalition has been largely inspired by impatience with Congress and the Canadian Parliament, both of which are in thrall to oil and seem deeply resistant to the reality of climate change. The coalition should have a lot of clout. The three states and British Columbia have a good economy, a population of more than 50 million people, a history of technological innovation, and a front-row seat on the ocean, where climate change will have a profound effect.
The trick will be to smooth out differences and bring Oregon and Washington along, but the idea is sound. This regional coalition rightly sees the task of limiting climate change not as an economic threat but as an economic opportunity, a chance to create new jobs and new industries.
2016-40/3982/en_head.json.gz/16736
Adobe CEO Chizen leaves in surprise move
Boston, November 13, 2007
Adobe Systems said chief executive Bruce Chizen is stepping down, a surprise move that comes as the software maker seeks to develop new ways to tap the Internet.
Shantanu Narayen, a former Apple manager who has been with Adobe for nearly 10 years, will succeed Chizen on December 1, the company said.
Chizen, who transformed Adobe from a maker of publishing and graphics software into a major supplier of diverse design, media, and business tools in his 14 years at the company, will stay on the board through the spring of 2008 and continue in a strategic advisory role through the end of fiscal 2008.
"Nobody ever likes any sort of transition, but this is definitely the guy to take it on," Piper Jaffray analyst Gene Munster said of Narayen. "The business is doing very well. I think this is just a case of a CEO who is retiring."
Adobe, known for its Photoshop picture editing software and its Acrobat programme to create and exchange documents online, also said it expected fourth-quarter revenue to be near the high-end of its target range of $860 million to $890 million.
Wall Street was looking for fourth-quarter revenue of $884 million, according to Reuters Estimates.
While the news surprised some investors, Chizen, 52, said Adobe's board had been aware that he planned to retire at a relatively young age.
"It's a lifetime decision. It's not one of those where I woke up one morning and said, 'Gee I want to take a break,'" Chizen said.
"What I need them to understand is that it was my personal decision and I wanted to do it at a time when it was best for Adobe," Chizen said.
Chizen is stepping down following major upgrades of Adobe's main design software, but ahead of the planned launch next spring of new technology it hopes will form the backbone of a Web-based computing system that could eventually challenge its chief rival, Microsoft.
"It is the biggest opportunity and it will be Adobe's biggest challenge," Chizen told analysts on a conference call. "Overcoming that challenge, I won't say it will be easy. But certainly it will be doable."
Narayen, 44, held several management positions at Apple before founding a digital photo-sharing software company in 1996. He also served as head of desktop and collaboration products at Silicon Graphics. – Reuters
2016-40/3982/en_head.json.gz/16737
Tuesday, January 15, 2013
Rhinos in crisis – poaching and illegal trade reach highest levels in over 20 years

On the run: rhinos are being poached in unprecedented numbers to meet the demand for their horn in Asia, particularly Viet Nam © Edward Parker / WWF-Canon

Gland, Switzerland, 15th January 2013—Escalating levels of poaching and illegal trade in rhino horns are seriously undermining rhino conservation efforts, putting the survival of these species at risk—according to a report by IUCN (International Union for Conservation of Nature) and TRAFFIC.

The report examines the conservation status and trade in African and Asian rhino species. “The findings of the report are alarming,” says Tom Milliken, a rhino expert from TRAFFIC. “Today, rhino poaching and illegal horn trade are at their highest levels in over 20 years, threatening to reverse years of conservation effort, particularly in Africa. There is no doubt that rhino species are facing a serious crisis.”

According to the report, by the beginning of 2011 there were 20,165 White Rhinoceros Ceratotherium simum and 4,880 Black Rhinoceros Diceros bicornis in Africa. However, at least 1,997 rhinos were poached between 2006 and September 2012 and over 4,000 rhino horns have been illegally exported from Africa since 2009, with an estimated 92% of these coming from rhinos specifically killed to obtain their horn.

South Africa, home to 83% of Africa’s rhinos and 73% of all wild rhinos worldwide, is the principal source of rhino horns in illegal trade. A record 668 rhinos were poached there in 2012, according to official government figures released in January 2013. Illegal trade in rhino horns involves highly organised, mobile and well-financed criminal groups, mainly composed of Asian nationals based in Africa. These networks have recruited pseudo-hunters including Vietnamese citizens, Thai prostitutes and proxy hunters from the Czech Republic and Poland to obtain rhino horns in South Africa on the pretence of trophy hunts for illegal commercial trade purposes. Pseudo-hunting has significantly reduced as a result of a decision to prevent nationals of Viet Nam from obtaining hunting licenses and changes to South African law in April 2012. However, there remains a continued need to ensure that only bona fide hunters are granted permits, according to the report.

“Rhinos are killed for their horns, which are seen as highly desirable status symbols in parts of Asia, notably Viet Nam, but also increasingly in China,” says Bibhab Kumar Talukdar, Chair of IUCN Species Survival Commission’s (SSC) Asian Rhino Specialist Group. “Horns are also increasingly used for non-traditional purposes such as hangover cure and body detoxifyer, especially in Viet Nam.”

In Asia, although conservation action in Nepal and India has resulted in increased numbers of the Greater One-horned Rhinoceros Rhinoceros unicornis, the situation in Indonesia and Malaysia remains serious for the world’s two rarest rhino species—the Sumatran Rhinoceros Dicerorhinus sumatrensis and the Javan Rhinoceros Rhinoceros sondaicus. The Javan Rhinoceros, with only around 35 to 45 surviving individuals, is confined to a single park in Indonesia after the last animal of its Indochinese subspecies, Rhinoceros sondaicus annamiticus, was found dead, its horn removed, in Viet Nam in 2010. The report calls for enhanced protection and biological management of the remaining Sumatran and Javan Rhinoceros to prevent their extinction.

Thefts of rhino horns from museums and zoos have increased worldwide, creating the need for improved law enforcement, monitoring and enhanced information management with regards to rhino numbers, sales and translocations, the report finds.

“Trade in rhino horns is a global problem that needs to be addressed by the international community by putting pressure on those countries that are driving illegal trade in rhino horn and those with inadequate wildlife legislation, such as Mozambique,” says Richard Emslie, from IUCN SSC African Rhino Specialist Group. “At the same time, increased poaching is negatively affecting rhino conservation incentives and budgets, threatening future rhino population growth.”

The report was compiled by the IUCN SSC African and Asian Rhino Specialist Groups and TRAFFIC, the wildlife trade monitoring network. It was mandated by the Convention on International Trade in Endangered Species (CITES) and aims to inform the rhino horn debate at the 16th meeting of the Conference of the Parties to CITES, taking place in March 2013 in Bangkok, Thailand.

Read the full report: http://www.cites.org/eng/cop/16/doc/E-CoP16-54-02.pdf

Editor’s notes: In August 2012, TRAFFIC published a comprehensive report into the rhino horn trade between South Africa and Viet Nam: The South Africa—Viet Nam Rhino Horn Trade Nexus: A deadly combination of institutional lapses, corrupt wildlife industry professionals and Asian crime syndicates (PDF, 4 MB)

For more information or to set up interviews, please contact:
Ewa Magiera, IUCN Media Relations, m +41 79 856 76 26, [email protected]
Richard Thomas, Global Communications Co-ordinator, TRAFFIC, m +44 752 6646216, [email protected]

About TRAFFIC
TRAFFIC, the wildlife trade monitoring network, works to ensure that trade in wild plants and animals is not a threat to the conservation of nature. TRAFFIC is a strategic alliance of IUCN and WWF.

About IUCN
IUCN, International Union for Conservation of Nature, helps the world find pragmatic solutions to our most pressing environment and development challenges. IUCN supports scientific research, manages field projects all over the world, and brings governments, NGOs, the UN and companies together to develop policy, laws and best practice. IUCN is the world’s oldest and largest global environmental organization, with more than 1,000 government and NGO members and almost 11,000 volunteer experts in some 160 countries. IUCN’s work is supported by over 1,000 staff in 60 offices and hundreds of partners in public, NGO and private sectors around the world.

About the Species Survival Commission
The Species Survival Commission (SSC) is the largest of IUCN’s six volunteer commissions with a global membership of around 8,000 experts. SSC advises IUCN and its members on the wide range of technical and scientific aspects of species conservation, and is dedicated to securing a future for biodiversity. SSC has significant input into the international agreements dealing with biodiversity conservation.
2016-40/3982/en_head.json.gz/16752
Trimble Navigation Limited
935 Stewart Drive
1.408.481.8000 phone
Trimble Enters into Definitive Agreement to Acquire TopoSys to Extend its Geospatial Solutions Business

SUNNYVALE, Calif., Sep. 26, 2008 — Trimble (NASDAQ: TRMB) today announced that it has entered into a definitive agreement to acquire TopoSys GmbH of Biberach an der Riss, Germany in an all-cash transaction. TopoSys is a leading provider of aerial data collection systems comprised of LiDAR and metric cameras. Closing of the transaction, anticipated to occur in the fourth quarter, is subject to certain closing conditions. Financial terms were not disclosed.

TopoSys aerial data collection systems are used by service companies collecting geospatial data by LiDAR and photogrammetry as well as state authorities and municipalities involved in supplying geospatial information. Typical applications include mapping for coastline protection, floodplain control, city modeling, opencast pit mining, and corridor mapping.

Over the last two years, TopoSys has evolved from an aerial services business to an aerial data collection systems developer and integrator. The acquisition of TopoSys extends Trimble's portfolio of engineering scale mapping and asset management solutions. TopoSys will also provide a European footprint to support Trimble's aerial mapping customers, who face short flight seasons, with regional support.

"TopoSys complements the aerial imaging solutions offered by our subsidiaries Applanix and INPHO. As an integrator of the Applanix POSTrack direct georeferencing and flight management system and RolleiMetric aerial industrial camera, TopoSys has developed high productivity aerial data collection systems for engineering scale mapping and asset management," said Ken Spratlin, director for strategy and business development for Trimble. "As part of the Trimble Connected Site solutions, these systems will provide high accuracy as-builts, enabling advanced process and workflow integration from the design phase through construction to the subsequent maintenance phase—delivering significant improvements in productivity."

"With Trimble's focus on the geospatial solutions business, joining the company accelerates TopoSys' evolution to an aerial data collection systems developer and integrator," said Svein Vatslid, managing director of TopoSys. "We are pleased to become a part of Trimble."

Svein Vatslid and the staff of TopoSys will join Trimble and will be reported as part of the Engineering and Construction segment.

About TopoSys GmbH
TopoSys, founded in 1995, commercialized the airborne fiber optic laser scanner together with a line camera to obtain information of the topography. Today, TopoSys has customers in service companies as well as state authorities and municipalities who use its aerial data collection solutions and services for coastline protection, floodplain control, city modeling, opencast pit mining, and corridor mapping. For more information, visit: www.toposys.com

About Trimble
Trimble applies technology to make field and mobile workers in businesses and government significantly more productive. Solutions are focused on applications requiring position or location—including surveying, construction, agriculture, fleet and asset management, public safety and mapping. In addition to utilizing positioning technologies, such as GPS, lasers and optics, Trimble solutions may include software content specific to the needs of the user. Wireless technologies are utilized to deliver the solution to the user and to ensure a tight coupling of the field and the back office.

Founded in 1978 and headquartered in Sunnyvale, Calif., Trimble has a worldwide presence with more than 3,600 employees in over 18 countries.

Certain statements made in this press release are forward looking statements within the meaning of Section 27A of the Securities Act of 1933 and Section 21E of the Securities Exchange Act of 1934, and are made pursuant to the safe harbor provisions of the Securities Litigation Reform Act of 1995. These statements involve risks and uncertainties, and actual events and results may differ materially from those described in this news release. Factors that could cause or contribute to such differences include, but are not limited to, continued customer use and future market adoption of TopoSys solutions and, more generally, aerial data collection and processing solutions, Trimble's ability to integrate its portfolio of aerial imaging and data collection solutions or to provide compelling solutions that deliver significant improvements in productivity to users, and changing economic trends and competitive pressures in the industry. Additional risks and uncertainties include: the risks inherent in integrating acquisitions; the ability to realize synergies through the integration of TopoSys and its products; unanticipated expenditures, charges or assumed liabilities that may result from the acquisition; and retaining key personnel. More information about potential factors which could affect Trimble's business and financial results is set forth in reports filed with the SEC, including Trimble's quarterly reports on Form 10-Q and its annual report on Form 10-K. All forward looking statements are based on information available to Trimble as of the date hereof, and Trimble assumes no obligation to update such statements.

Investor Relations Contact: Willa McManmon of Trimble: 408-481-7838
Media Contact: LeaAnn McNabb of Trimble: 408-481-7808
2016-40/3982/en_head.json.gz/16811 | Science & Technology Simon Wiesenthal Center Documents Online Hate April 12, 2011 8:00 PM
Peter Fedynsky
The Internet in many ways is a virtual extension of the real world, a place where people gather, engage in commerce and acquire knowledge. It is also a place exploited by stalkers, thieves and terrorists. The Simon Wiesenthal Center in New York and Los Angeles has identified thousands of websites, blogs, and social networking pages that it says direct online hate.

The New Year's Day bombing of a Coptic church in Alexandria, Egypt was preceded a month earlier by an online warning of an attack if Catholics did not intervene in the Christian-Muslim dispute in Egypt.

And a deadly rampage at a U.S. Army base in Texas in November 2009 was followed by online praise for the alleged gunman by American-born Muslim cleric Anwar al-Awaki, presumably from Yemen.

"Yes, Nidal Hassan was one of my students and I am proud of that," said Anwar al-Awaki. "I am proud of Nidal Hassan and this was a heroic and wonderful act."

Rabbi Abraham Cooper is associate dean of the Wiesenthal Center. He says al-Awaki's statement encourages others to kill.

"You had a validation for the hatred," said Rabbi Cooper. "Almost like a cheerleading section for the hatred this man had in Texas, and saying, go ahead - you know what needs to be done."

Rabbi Cooper says radicals use the Internet to promote a climate of hate, counting on others to act upon it.

Rick Eaton is a Wiesenthal Center researcher in Los Angeles. He took VOA on a virtual tour of a few of some 14,000 Internet hate sites spanning every continent, racial prejudice and ethnic animosity. One, for example, glorifies the white race and Adolph Hitler. Others promote terrorism in Indonesia or discuss holy war in Pakistan. Yet another site encourages racism in children. It features a video game with a virtual fighter-bomber that targets poor black Haitians. Eaton says the Internet allows easy dissemination of hate-related material. His face is not shown to protect his identity.

"Now with the Internet, these messages are spread and re-spread virally until a single video that a hate group or terrorist group puts up is then recycled into hundreds of different sites," said Rick Eaton.

Experts say there is no way of tracing material posted by terrorists to a specific computer. Thomas Leighton, co-founder of Akamai Technologies, explains:

"The Internet was developed in an environment where everyone was trusted," said Thomas Leighton. "It was developed in the university and government system where there weren't bad guys."

Rabbi Cooper says hate groups can identify one another with the click of a mouse in the same way that others use the Internet for peaceful purposes.

"For the NGOs, the non-governmental organizations, that are involved in promoting human rights and civil society - we use the same tools," he said. "Who can I go to in Sweden who can help the Roma, or another minority group that's having problems? Who is taking the lead?"

Cooper says police, parents and others must confront digital hate in order to prevent such violent rhetoric from turning into reality.
Peter Fedynsky's video report
2016-40/3982/en_head.json.gz/16822
Scientists create cell based on man-made genetic instructions
By David Brown
Scientists reported Thursday that they have created a cell controlled entirely by man-made genetic instructions -- the latest step toward creating life from scratch. The achievement is a landmark in the emerging field of "synthetic biology," which aims to control the behavior of organisms by manipulating their genes.
Although the ultimate goal of creating artificial organisms is still far off, the experiment points to a future in which microbes could be manufactured with novel functions, such as the ability to digest pollutants or produce fuels. Some ethicists fear that the strategy could also be used to produce biological weapons and other dangerous life forms.
In a paper published online by the journal Science, researchers from the J. Craig Venter Institute described using off-the-shelf chemicals and the DNA sequence of Mycoplasma mycoides's genes to make an artificial copy of the bacterium's genome. The scientists then transplanted that genome into the cell of a different (but closely related) microbe.
The donor genome reprogrammed the recipient cell, which went on to replicate and divide. The result was new colonies of Mycoplasma mycoides.
"We think these are the first synthetic cells that are self-replicating and whose genetic heritage started in the computer. That changes conceptually how I think about life," said J. Craig Venter, 63, who gained fame a decade ago as the co-sequencer of the human genome. His institute has laboratories and offices in Rockville and San Diego.
Other scientists characterize the experiment in less revolutionary terms. They say that only the genome was synthetic; the recipient cell was equipped by nature and billions of years of evolution to make sense of the genes it received and turn them on. Still, they praised Venter's 24-member team for showing that such a transplant was feasible.
"From a technical standpoint, this is clearly a very important advance," said Anthony S. Fauci, director of the National Institute of Allergy and Infectious Diseases at the National Institutes of Health.
"It is a milestone in synthetic biology," said Gregory Stephanopoulos, a professor of chemical and engineering and biotechnology at MIT. "Over the long term, it will have an impact, although over the short term, not so much."
The Venter team stopped short of creating new cells with new functions. Instead, it manufactured a Mycoplasma mycoides genome that was virtually identical to the natural one and used it to make cells that were also nearly indistinguishable from the natural cells.
In that sense, the experiment's success is more symbolic than practical. It is unlikely to have any immediate effect on the biotech world, which for more than two decades has used various methods of recombinant DNA technology to manipulate organisms to manufacture drugs, produce pest-resistant crops and enhance the nutritional value of food.
The development nonetheless engaged the attention of President Obama, who on Thursday asked the Presidential Commission for the Study of Bioethical Issues to "undertake, as its first order of business, a study of the implications of this scientific milestone, as well as other advances that may lie ahead in this field of research."
The early consensus is that Venter's achievement poses no hazards beyond those that exist with current modes of moving or tweaking genes.
2016-40/3982/en_head.json.gz/16966 | Navy delves deeper into undersea robotics
By Henry Kenyon | Dec 14, 2011
By the end of the decade, the Navy plans to deploy squadrons of unmanned underwater robots to survey the ocean. But there are a lot of challenges operating underwater, and the robots will require a great deal of autonomy to carry out search and mapping missions.
That’s the goal of the Adaptive Networks for Threat and Intrusion Detection or Termination (ANTIDOTE) program. Funded by the Office of Naval Research (ONR), ANTIDOTE’s team of scientists from the Massachusetts Institute of Technology and the University of Southern California is developing software-based methods for large teams of robots to perform more sophisticated missions autonomously in dynamic, time-critical environments and with limited communications. A major part of the program focuses on autonomous planning and replanning methods, said Marc Steinberg, ANTIDOTE’s program officer at ONR.
The underlying theory behind ANTIDOTE was for persistent surveillance of dynamic environments, regardless of the vehicle type carrying out the mission, Steinberg said. For example, some successful simulation experiments have been conducted with unmanned air systems. Undersea vehicles have very limited communications compared with systems that operate above ground, Steinberg said. This drives a need for autonomy because the robot subs can’t rely on a human operator.
Additionally, there are unique challenges in navigation, mobility and sensing underwater. For example, the undersea glider robots in ANTIDOTE's experiments use changes in buoyancy for propulsion, rather than an active device such as a propeller. This enables them to have an extended endurance, but it also requires that gliders move up and down in depth in a saw-tooth-like pattern, which has a big impact on how to do autonomous planning to maximize the value of the scientific data being collected.

“The sea experiments were a great way to examine how some promising theoretical results would work in a real-world situation of practical value to scientists,” Steinberg said. Prior theoretical work had looked at how autonomous vehicles can best perform persistent surveillance in a dynamic environment. In the sea tests, the new software was used to generate paths for the underwater gliders to collect oceanographic data. The method takes into account both user priorities and ocean currents in determining these paths.

The experiments, in southern California and in Monterey Bay, Calif., involved a glider using this new capability and a reference glider that followed a more traditional fixed path. Results of the experiment showed that the vehicle using the new method executed two to four times as many sampling profiles in areas of high interest when compared against the unmodified reference glider, while maintaining an overall time constraint for the completion of each circuit of the path, ANTIDOTE researchers said in a statement. Overall, the results validate that the theoretical results can be of value in solving real-world surveillance problems with autonomous systems, Steinberg said.

The ANTIDOTE program is near the end of its third year. After that, Steinberg said that it is up to ONR leadership to decide whether to fund it for an additional two years.
“As a fundamental research program, the main products are new theory and methods," he said. "Some of these are being implemented in software and will be available to other researchers via open architectures for robots.”
Henry Kenyon is a contributing writer for Defense Systems.
2016-40/3982/en_head.json.gz/17082
The Rendlesham Forest UFO
Did a UFO with flashing colored lights harass a USAF base in England in 1980?
by Brian Dunning
Filed under Aliens & UFOs, Conspiracies
Skeptoid Podcast
#135 January 6, 2009
http://skeptoid.com/audio/skeptoid-4135.mp3
An F-101C stationed at RAF Bentwaters (Photo credit: USAF)

The Sci-Fi Channel calls it the most comprehensive cover-up in the history of Britain. It's often called the most important UFO incident of the 20th century. Imagine, alien spacecraft drifting through the woods on the perimeter of a US Air Force Base in England, shining their colored lights around in plain view of pursuing military security personnel, for three nights in a row. And how did the United Kingdom and the United States react to this obvious threat to their nuclear arsenals? They didn't. There's no wonder the Rendlesham Forest UFO Incident is the one that UFOlogists consider the most frightening.
If you watch the Sci-Fi Channel, the History Channel, the Discovery Channel, or any of the other paranormal TV networks, you've probably heard the popular version of events on those three nights. Here are the significant points:
Two old Royal Air Force airfields, RAF Bentwaters and RAF Woodbridge, are situated just two miles apart near the eastern coast of England. Throughout the cold war they were operated by the United States Air Force. On the night of Christmas Day, December 25, 1980, personnel at the base reported bright UFOs streaking through the sky. Later that night, in the wee hours of December 26, security personnel from RAF Woodbridge entered Rendlesham Forest to investigate some strange, pulsating, colored lights moving through the trees, that they thought at first might be a downed aircraft. Local constables were called and also participated in the observation. Base personnel described the craft they pursued as metal and conical, with a bright red light above and a circle of blue lights below, and suspended in a yellow mist. By daylight, they located a clearing where they thought the strangely lit craft had set down, and found three depressions in the ground in a triangular pattern. The constables were called again and photographed and confirmed the landing site.
Two nights later in the wee hours of December 28, they returned to the site, led by Lt. Colonel Charles Halt, second in command at the base. They brought a radiation detector and recorded high levels of radiation at the landing site, again observed the colored, pulsating lights through the trees, and again pursued them through the forest. Other colored lights were seen flying through the sky. Col. Halt recorded the audio of this pursuit on a microcassette. Two weeks later, after debriefing all of his men who participated, he wrote down the specifics of the episode in a signed memo titled "Unexplained Lights", and sent it in to the British Ministry of Defense. Ever since, the airmen involved claim to have been coerced to change their stories and deny that anything happened, and were threatened with comments like "bullets are cheap."
Wow. That story is really something, isn't it? But even more impressive than the story is the documentation, mainly Col. Halt's audio recording and signed memo. You don't rise to be deputy commander of a United States Air Force base with nuclear weapons if you're a nutcase, and when you're accompanied by local police constables and a number of Air Force security personnel who all file written reports, you don't exactly make up ridiculous stories. There's little doubt that Rendlesham Forest probably has the best, most reliable evidence of any popular UFO story.
Ever since I first heard about the Rendlesham Forest incident, I've been as curious as anyone to know what actually happened. So I decided to begin with the null hypothesis — that nothing extraordinary happened — and then examine each piece of evidence that something extraordinary did happen, individually, on its own merit. I wanted to see if we could find a natural explanation for each piece of evidence: You always have to eliminate terrestrial explanations before you can consider the extraterrestrial.
Let's take it chronologically. The first events were the reported UFO sightings at the base on the night of the 25th and the early hours of the 26th. It turns out that people on the base were not the only people to see this. UFO reports flooded in from all over southern England, as it turned out that night was one of the best on record for dramatic meteors. The first were at 5:20pm and again at 7:20pm over southeastern England. Later at 9pm, the upper stage of a Russian rocket that had launched the Cosmos 749 satellite re-entered and broke up. As reported in the Journal of the British Astronomical Association, 250 people called in and reported a sighting as first six fragments came streaking in, which then broke up into more than 20. Finally, at about 2:45am on the morning of the 26th, a meteor described by witnesses as "bright as the moon" flew overhead with an unusually long duration of 3-4 seconds. The experience of the airmen was described in a letter home written by one of them:
At [about 3am], me and five other guys were walking up a dark path about 2 miles from base... Then we saw a bright light go right over us about 50 feet up and just fly over a field. It was silent.
At the same time on base, a security patrolman was dispatched to check the weapons storage area to see if a "falling star" had hit it. It had not. But it does seem clear that all of the UFO reports from the base are perfectly consistent with known meteor activity on that night. So much for the UFO sightings. Next piece of evidence.
Airmen at the east end of RAF Woodbridge went into the forest to investigate a strange, pulsing, colored light that they suspected might be a downed aircraft. We have the signed statements of the three men who went into the forest, SSgt. Penniston and Airmen Cabansag and Burroughs, as well as that of their superior, Lt. Buran. At this point it's important to know the geography of the area. Heading east from the east gate of RAF Woodbridge, there is about one mile of forest, followed by an open farmer's field several acres in size. At the far end of that field is a farmhouse. A little more than 5 miles beyond that sits the Orfordness lighthouse, in a direct line of sight.
Although the three men stayed together, their reports are dramatically different. Penniston and Burroughs reported moving lights of different colors, that they felt came from a mechanical object with a red light on top and blue lights below surrounded by a yellow haze. They even drew pictures of it in their reports, but Penniston's illustration of their best view of it shows it partially obscured by trees and well off in the distance to the east. Burroughs' drawing of the object is based on Penniston's description, as Burroughs himself only reported seeing lights. Cabansag, however, reported that the only light they saw after actually leaving the base was the one that all three men eventually identified as a lighthouse or beacon beyond the farmhouse. Cabansag reported that the yellow haze had simply been the glow from the farmhouse lights. Once they reached the field, they turned around and returned to base without further incident.
A further problem with Burroughs' and Penniston's stories is that they have grown substantially over time, particularly Penniston's. In more recent TV interviews, they've both claimed that they saw the craft fly up out of the trees and fly around. Penniston has also unveiled a notebook which he claims he wrote during their forest chase, which he displayed on a 2003 Sci-Fi Channel documentary. Its times and dates are wrong, and Burroughs has stated that Penniston did not make any notes during the episode and would not have had time to even if he'd wanted. Penniston's story has also expanded to include a 45-minute personal walkaround inspection of the object during which he took a whole roll of photographs (seized by the Air Force, of course), which, from the written statements of all three men, is a clear fabrication.
Only Cabansag's version of events, that there was a single pulsing light later determined to be a distant beacon or lighthouse, describes events that all three men agreed on, and is consistent with the statements of others. For example, A1C Chris Arnold, who placed the call to the police and waited at the end of the access road, gave this description in a 1997 interview:
There was absolutely nothing in the woods. We could see lights in the distance and it appeared unusual as it was a sweeping light, (we did not know about the lighthouse on the coast at the time). We also saw some strange colored lights in the distance but were unable to determine what they were... Contrary to what some people assert, at the time almost none of us knew there was a lighthouse at Orford Ness. Remember, the vast majority of folks involved were young people, 19, 20, 25 years old. Consequently it wasn't something most of the troops were cognizant of. That's one reason the lights appeared interesting or out of the ordinary to some people.
Police constables responding to Arnold's call of "unusual lights in the sky" did arrive on the scene while Penniston, Cabansag, and Burroughs were still in the forest. Here is the report they filed:
Air traffic control West Drayton checked. No knowledge of aircraft. Reports received of aerial phenomena over southern England during the night. Only lights visible this area was from Orford light house. Search made of area - negative.
So much for unusual lights or strange flying craft reported by the airmen in the forest on the first night.
Next morning, some of the men found what they believed to be site of where Penniston's craft must have touched down. It was a clearing with three depressions in the ground, possibly made by landing pads. Again the police were called. The police report stated:
There were three marks in the area which did not follow a set pattern. The impressions made by the marks were of no depth and could have been made by an animal.
Forestry Commission worker Vince Thurkettle, who lived less than a mile away, was also present at the examination of the landing site. Astronomer Ian Ridpath, who has a fantastic website about the event (and check out this YouTube video of his original BBC report here), interviewed Thurkettle about the impressions and the reported burn marks on the surrounding trees:
He recognized them as rabbit diggings, several months old and covered with a layer of fallen pine needles... The "burn marks" on the trees were axe cuts in the bark, made by the foresters themselves as a sign that the trees were ready to be felled.
So much for the landing site.
It was two nights later that Col. Halt decided to take the investigation into his own hands (contrary to the popular telling that says there were events on three nights in a row, there are no reported events on the second night). Halt properly armed himself with a Geiger counter and an audio recorder (Download the complete 17-minute recording here), and took some men to examine the landing site and the strange lights. It's been reported that Halt found radiation levels at the landing site ten times higher than normal background levels:
Col. Halt: "Up to seven tenths? Or seven units, let's call it, on the point five scale."
He used a standard issue AN-PDR 27 Survey Meter, which detects beta and gamma radiation. The highest level reported by Col. Halt on his audio tape, "seven tenths", corresponded to .07 milliroentgens per hour, just at the lowest reading on the bottom range of the meter, the "point five scale". The UK's National Radiological Protection Board (NRPB) told Ian Ridpath that levels between .05 and .1 mR/h were normal background levels; however, this particular meter was designed to measure much higher levels of radiation and so it was "not credible" to establish a level of only ten times normal background. So much for Col. Halt's radiation.
And then they observed the mysterious colored light flashing through the trees:
Col. Halt: "You just saw a light? Where? Slow down. Where?" Unidentified: "Right on this position here. Straight ahead, in between the trees – there it is again. Watch – straight ahead, off my flashlight there, sir. There it is." Col. Halt: "I see it, too. What is it?" Unidentified: "We don’t know, sir." Col. Halt: "It’s a strange, small red light."
Every lighthouse has a published interval at which it flashes. This is how sea captains are able to identify which light they're seeing. The Orfordness lighthouse has an interval of 5 seconds. Now listen to the same exchange again; I've added a beep at exactly five second intervals:
Although several times during the tape Col. Halt calls the light red, he is contradicted by his men who say it's yellow. In photographs of the 1980 light taken before it was replaced, it did indeed look orange. Even the new light, which is mercury vapor discharge and therefore whiter and bluer than the original incandescent, appears distinctly red in photographs and video when viewed from Rendlesham forest.
Col. Halt, having been in the area longer than most of the young servicemen, did know about the lighthouse; but he didn't think this light could be it because it was coming from the east. Col. Halt believed the lighthouse was to the southeast. This is true from RAF Bentwaters, where Halt was from. But the chase through the forest proceeded due east from RAF Woodbridge — two miles south of Bentwaters — and from there, unknown to Col. Halt, Orfordness lighthouse is indeed due east.
Col. Halt: "We've passed the farmer's house and are crossing the next field and now we have multiple sightings of up to five lights with a similar shape and all but they seem to be steady now rather than a pulsating or glow with a red flash."
Five steady lights glowing red. The Orfordness Transmitting Station is just two miles up the coast from the lighthouse, and features five tall radio towers topped with red lights. Col. Halt's thoroughness was commendable, but even he can be mistaken. Without exception, everything he reported on his audiotape and in his written memo has a perfectly rational and unremarkable explanation.
And with that, we're nearly out of evidence to examine. All that remains is the tale that the men were debriefed and ordered never to mention the event, and warned that "bullets are cheap". Well, as we've seen on television, the men all talk quite freely about it, and even Col. Halt says that to this day nobody has ever debriefed him. So this appears to be just another dramatic invention for television, perhaps from one of the men who have expanded their stories over the years.
When you examine each piece of evidence separately on its own merit, you avoid the trap of pattern matching and finding correlations where none exist. The meteors had nothing to do with the lighthouse or the rabbit diggings, but when you hear all three stories told together, it's easy to conclude (as did the airmen) that the light overhead became an alien spacecraft in the forest. Always remember: Separate pieces of poor evidence don't aggregate together into a single piece of good evidence. You can stack cowpies as high as you want, but they won't turn into a bar of gold.
By Brian Dunning
Dunning, B. "The Rendlesham Forest UFO." Skeptoid Podcast. Skeptoid Media, 6 Jan 2009. Web. 1 Oct 2016. <http://skeptoid.com/episodes/4135>
References & Further Reading
Army. Radiac Set AN/PDR-27. Washington, DC: United States Army, 1952. 12.
BBC. "Rendlesham - UFO Hoax." BBC Inside Out. BBC, 30 Jun. 2003. Web. 16 Jan. 2010. <http://www.bbc.co.uk/insideout/east/series3/rendlesham_ufos.shtml>
Kelly, Lynne. The Skeptics Guide to the Paranormal. New York: Thunder's Mouth Press, 2004. Pages 198-201.
Radford, B. "UFO Cover-Up? Britain's Roswell Event Missing from Released Files." Life's Little Mysteries. Tech Media Network, 4 Mar. 2011. Web. 6 Jul. 2012. <http://www.lifeslittlemysteries.com/1172-missing-ufo-files.html>
Ridpath, Ian. "The Rendlesham Forest UFO Case." IanRidpath.com. Ian Ridpath, 1 Jun. 2009. Web. 16 Jan. 2010. <http://www.ianridpath.com/ufo/rendlesham.htm>
Rutkowski, C. A World of UFO's. Toronto: Canada Council for the Arts, 2008. 27-32.
2016-40/3982/en_head.json.gz/17100 | Cloud Computing | Feature
What the Cloud Can Do for You
By Charlene O'Hanlon, Dian Schaffhauser | 01/04/12
How can cloud-based services improve school operations? Let us count the ways.
1) Robust Resiliency and Data Recovery
First, let’s start with the resiliency and data recovery attainable through the use of redundant systems and off-site backup. A couple of years ago the only servers residing in the data center for the Saginaw Intermediate School District (SISD) in Michigan were those belonging to the regional service center. Now with a private cloud, the data center is running 80 virtual servers, only 32 of which belong to the district. The remaining servers are accessed by other districts served by the SISD that found advantages in relinquishing physical control of their boxes. The SISD has a generator backup, which the districts didn’t. It also has a mirror site located at one of its larger client districts, which provides off-site redundancy. "That has really caught on like wildfire in a two-year span," says Jeff Johnson, the SISD’s director of technical services.
The migration began when the district put in its first storage area network and began using virtualization to consolidate its servers for better management of services. That led to the ability for the SISD to sell disk storage space to its client districts. Eventually, that shifted to hosting of server operations too.
"By our hosting, they’re not losing anything," Johnson notes. "They’re not losing control, computing power, or any of those things. But they’re gaining all the benefits of our data center redundancy, our generator, and our backup systems. And we’re sharing costs on all of that now instead of everybody trying to do it on their own."
2) Anywhere Computing
Second, there's "anywhere computing." Users simply open a browser window and start working. That's how the Minnesota Online High School (MNOHS), which has between 250 and 300 students, operates. According to systems engineer Sarah Carsello, the sole IT person on staff, "We're completely, 100-percent mobile. You can set up camp in an airport. If you wanted to go on the road and stop off at a rest area--anywhere there's internet--you can work."
3) No More Equipment Baggage
Third, cloud computing offers a heady dose of freedom regarding equipment and vendors. "We’re an extremely small school. We don’t have the bandwidth or the desire to have our own servers," explains MNOHS Executive Director Ned Zimmerman-Bence. "Everything we do is hosted off-site. Our [learning management system] is hosted with Blackboard. Our student information system is hosted with Infinite Campus. Our SharePoint system is hosted out."
That provides definite operational advantages. "Because we’re not investing in our own hardware, we’re not locking ourselves into a specific operating system or having to constantly upgrade and maintain and patch our equipment," Carsello adds. In the traditional approach, moving from one vendor to another might require writing off software or hardware that’s been purchased or time invested in configuring the system. By choosing cloud-based services, the school can cut its losses more quickly and be selective about its service providers. "If we don’t like them, we can say, ‘Okay, we’re done with you; we’re moving to the next one.’"
4) Effortless Analytics and Monitoring
Fourth, Carsello notes that services delivered by the web from a third-party company can act as a monitor to deliver analytics that are vital to administrators--or simply to find out whether a student is being honest or not. Blackboard Learn, the school’s LMS, which is hosted by the vendor, delivers crucial metrics to ensure that students are doing their work and that they’re "attending" school (albeit online). "If they’re not, they can’t be a student anymore," Carsello states.
Likewise, if a student fusses that he or she can’t finish an assignment because the math software wasn’t accessible, Carsello can log into a utility to see if the student has been pinging the school’s DNS server or not. "We have a couple of different ways to see if they’re telling the truth and really experiencing a technical issue," she says.
5) Simplified Scalability
Fifth, cloud computing offers a measure of simplified scalability to respond to changes in computing demand. The vision is that, as a school expands or shrinks its user base, so too can it crank up or wind down its subscription-based licensing. For example, the year-old Coleman Tech Charter High School in San Diego expects to grow from 50 students in its first year to 480 by the 2014-2015 school year. That growth can easily be handled by the cloud-based systems in use, says Assistant Principal Neil McCurdy.
In fact, he’s so confident that cloud-based services can do the heavy lifting, the school has no intentions of hiring an IT professional. McCurdy, a Ph.D. in computer science, will handle the bulk of IT needs; and what he can’t cover, a student IT team will take care of. That includes helping other students figure out why they can’t access the internet, salvaging data from computers before they’re sent off for repair by swapping out hard drives, and other straightforward forms of troubleshooting.
Where an IT pro might come in handy, McCurdy is creating ways to automate the work. For example, to provision services for new users, a staff person can enter student or staff information into a Google spreadsheet that’s shared by everyone who needs access to it, and that activity "automatically creates the user in the system and gives them their Google Docs account," he says.
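As a rough illustration of the kind of spreadsheet-driven provisioning McCurdy describes, the sketch below shows one way such automation could be wired up in Python. It is not the school's actual script; it assumes a Google Workspace domain, a service account with admin delegation, the gspread and google-api-python-client libraries, and invented spreadsheet and column names.

```python
# Hypothetical sketch: read new-student rows from a shared Google spreadsheet
# and create matching Google accounts. Spreadsheet name, column names, domain,
# and credential file are placeholders, not the school's real configuration.
import gspread
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/admin.directory.user"]
DOMAIN = "example-school.org"

# Service-account credentials, delegated to an admin user of the domain.
creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES
).with_subject("admin@" + DOMAIN)
directory = build("admin", "directory_v1", credentials=creds)

# The shared spreadsheet that staff fill in; one row per new student.
client = gspread.service_account(filename="service-account.json")
rows = client.open("New Students").sheet1.get_all_records()

for row in rows:
    first, last = row["First Name"], row["Last Name"]
    email = f"{first}.{last}@{DOMAIN}".lower()
    body = {
        "primaryEmail": email,
        "name": {"givenName": first, "familyName": last},
        "password": row["Temporary Password"],
        "changePasswordAtNextLogin": True,
    }
    # Creating the account gives the student the domain's Docs/Drive access.
    directory.users().insert(body=body).execute()
    print("Provisioned", email)
```

In practice a script like this would run on a schedule or on a form-submit trigger, and would first check whether an account already exists before creating it.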
6) Freeing Up IT’s Time
Of course, not all of the benefits of cloud computing accrue simply to the organization that’s running it. They also add up for those individuals supporting the IT operations. As MNOHS’s Carsello points out, with cloud computing, "I can sleep at night. I don’t have to have my cell phone on my nightstand to go off to alert me that a server went down or a service is unavailable. Now, because we have everything hosted, I can go on vacation. I can have a holiday and not have to worry about anything breaking. Other people are responsible for it."
“Body Farm” In Texas Studies Human Decomposition, Is Pretty Metal
Any one of these photos would make a great album cover for your Danish black metal band. A recent forensic anthropology study at Texas State University, using the remains of those who have donated their bodies to science, is turning up important new information on how vultures eat us when we die.
Gavon Laessig
David J. Phillip / AP
The skeletal remains of Patty Robinson are shown at Texas State University’s “body farm,” officially the Forensic Anthropology Research Facility, Thursday, Feb. 9, 2012, in San Marcos, Texas. Robinson donated her body for research at the school. What they’re finding at the research facility debunks some of what they and other experts believed about estimating time of death for a person whose remains are found outdoors and exposed to the environment.
Kate Spradley, an assistant professor at Texas State University, looks over the skeletal remains of Patty Robinson. For more than five weeks, Robinson’s body lay undisturbed in a secluded field. Then a frenzied flock of vultures descended on the corpse and reduced it to a skeleton within hours.
Experienced investigators would normally have interpreted the absence of flesh and the condition of the bones as evidence that the woman had been dead for six months, possibly even a year or more. This recent vulture study has turned that assumption on its head.
“If you say someone did it and you say it was at least a year, could it have been two weeks instead?” said Michelle Hamilton, an assistant professor at the school’s forensic anthropology research facility. “It has larger implications than what we thought initially.”
Scientists set up a motion-sensing camera in the study that captured the vultures jumping up and down on the woman’s body, breaking some of her ribs, which investigators could also misinterpret as trauma suffered during a beating.
In the past, researchers had to rely on pig carcasses for this kind of study. "Now that we have this facility and a group of people willing to donate themselves to science like this, we can actually kind of do what needs to be done, because pigs and humans aren't equal," Hamilton said.
Patty Robinson was an Austin woman who died of breast cancer in 2009 at age 72. She donated her remains to research, pictured here in the five-acre fenced area.
Her son, James, said the Texas State research seemed like a worthy project.
She’d be delighted “if she could come back and see what she’s been doing,” he said. “All of us are pretty passionate about knowing the truth.”
"Body Farm" In Texas Studies Human Decomposition, Is Pretty Metal
https://www.buzzfeed.com/gavon/body-farm-in-tex...
Any one of these photos would make a great album cover for your Danish black metal band. A recent...
Tagged:body, bone, buzzard, corpse, csi, dead, death, decomposition, farm, forensic, skull, state, texas, university, vulture Facebook Conversations | 科技 |
Public Release: 2-Nov-2011
Link between air pollution and cyclone intensity in Arabian Sea
Disruption of wind shear enables stronger storms
National Science Foundation
IMAGE: Scientists are working to better understand atmospheric brown clouds and their effects.
Credit: Nicolle Rager Fuller, National Science Foundation
Pollution is making Arabian Sea cyclones more intense, according to a study in this week's issue of the journal Nature.
Traditionally, prevailing wind shear patterns prohibit cyclones in the Arabian Sea from becoming major storms. The Nature paper suggests that weakening winds have enabled the formation of stronger cyclones in recent years -- including storms in 2007 and 2010 that were the first recorded storms to enter the Gulf of Oman.
Researchers note that weakening wind patterns during the last 30 years correspond with a buildup of aerosols in the atmosphere over India, which deflect sunlight away from the surface, creating dimming at ground level. This dimming may be responsible for more intense cyclones.
The aerosol buildup creates formations known as atmospheric brown clouds (ABCs) in which smog from diesel emissions, soot and other by-products of biomass burning accumulate and become widespread to a degree significant enough to affect regional climate. A three-kilometer (1.9-mile)-thick brown cloud has been linked to altered rainfall patterns in South Asia, for example. Because of the large-scale dimming by ABCs, they have a mitigating effect on the warming of the ocean in the region that's also associated with greenhouse gas-driven climate changes.
"We're showing that pollution from human activities as simple as burning wood or driving a vehicle with a diesel engine can change these massive atmospheric phenomena in a significant way," said the paper's lead author, Amato Evan of the University of Virginia. "This underscores the importance of getting a handle on emissions in the region."
Historically, the onset of the monsoon season in summer months has produced strong winds in the lower and upper atmosphere that travel in opposite directions, also known as vertical wind shear, which makes formation of cyclones virtually impossible in July and August. Thus despite warm sea surface temperatures, the Arabian Sea averages two or three cyclones per year that tend to form outside the monsoon season, when the wind shear is diminished.
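To make the shear condition described in this release concrete: vertical wind shear is just the magnitude of the vector difference between the upper-level and lower-level winds. The few lines below are a toy illustration with made-up numbers, not values from the study.

```python
import math

def vertical_wind_shear(u_low, v_low, u_high, v_high):
    """Magnitude (m/s) of the vector difference between upper- and lower-level winds."""
    return math.hypot(u_high - u_low, v_high - v_low)

# Illustrative numbers only: opposing low- and upper-level flow gives the large
# shear that normally tears monsoon-season storms apart; nearly aligned flow
# gives the weak shear that lets a cyclone organize.
print(vertical_wind_shear(-8.0, 0.0, 12.0, 2.0))   # ~20 m/s: hostile to cyclones
print(vertical_wind_shear(-3.0, 0.0, 1.0, 0.5))    # ~4 m/s: storms can strengthen
```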
But the scientists found a trend of increasingly strong cyclones in the months immediately preceding monsoon season. A 1998 cyclone that made landfall in Gujarat, India, killed nearly 2,900 people. Cyclone Gonu made a rare landfall in Iran in 2007 and caused more than $4 billion in damage. Cyclone Phet in 2010 struck the coastlines of Pakistan and Oman and caused nearly $2 billion in damage. Gonu produced category 5-strength winds in excess of 250 kilometers per hour (156 miles per hour). Phet was a category 4 storm.
"This study is a striking example of how human actions, on a large enough scale--in this case, massive regional air pollution caused by inefficient fuel combustion--can result in unintended consequences," said Anjuli Bamzai, program director in the National Science Foundation's Division of Atmospheric and Geospace Sciences, which funded the research. "These consequences include highly destructive summer cyclones that were rare or non-existent in this monsoon region 30 or so years ago."
The scientists used findings from direct observations and model studies of ABCs made by Scripps Institution of Oceanography climate and atmospheric scientist Veerabhadran Ramanathan, a paper co-author. Among the findings is that brown clouds inhibited summertime warming of the surface, which caused sea surface temperatures in the northern Arabian Sea to more closely match cooler temperatures closer to the equator. The team modeled the effects of brown clouds on atmospheric and oceanic circulation patterns. They found that ABCs changed the circulation of the atmosphere and reduced the climatological vertical wind shear.
"This study adds a major dimension to a long list of negative effects that brown clouds have, including rainfall reduction, Himalayan glacial melting, significant crop damages and deaths of a million or more people annually," said Ramanathan. "The one silver lining is that the atmospheric concentrations of these pollutants can be reduced drastically and quickly using available technologies."
Those technologies include, says Ramanathan, diesel filters for trucks and buses; two-stroke engines running on LPG fuel; energy-efficient and less polluting cookstoves; and less polluting brick-kilns.
The other co-authors of the study are James P. Kossin of the National Climatic Data Center and the NOAA Cooperative Institute for Meteorological Satellite Studies, and Chul "Eddy" Chung of the Gwangju Institute of Science and Technology in South Korea. The NOAA Climate Program Office provided additional support for the study.
Cheryl Dybas
[email protected]
@NSF
http://www.nsf.gov
Time Out For the Macintosh Boogie
@TMOBryan · +Bryan Chaffin Jan 19th, 2009 5:30 PM EST | News
Duane's iMac Decorated Bass
Duane Straub, Mac IT expert, member of the Macworld All Star Band, and Director of the Root Boy Slim Memorial Fan Club, has posted a song called the Macintosh Boogie. It being a holiday today (Martin Luther King Day for those finding this in later days), it seems a good time to post a link to it.
Mr. Straub wrote the tune way back in 2000 following a time when he tried to convince the Reverend Billy C. Wirtz (a boogie-woogie performer) that writing a song about Macs was a surefire path to bringing him new fans. The Reverend didn't buy it, and so Mr. Straub decided that if someone were going to write a boogie-woogie song about Macs, he'd have to do it himself. On the Web site he created for the song, Mr. Straub wrote, "I can't get anybody on the boat with me - but, hey, I'm undeterred - what Mac person has NOT experienced times when they just can't get people on the boat?!?"
Thus the Macintosh Boogie was born, or almost born. He wrote the tune in 2000, but it took until 2003 before he was able to record it -- on a Mac with GarageBand, of course. From there, it was a mere five years for him to release the tune on the Internet for the edification of Mac and boogie fans everywhere.
We should note that the song also represents a snapshot in time when it comes to the Mac community. Microsoft slams, the Mac Marines (remember them?), PowerBooks, and the lack of Macs in Enterprise are some of the themes Mr. Straub tapped for the song.
Jason Boehm
Program Coordination Office - HQ
[email protected]
(301) 975-8678
Dr. Boehm joined NIST in September 2006 as a policy analyst in the Program Office in the Office of the Director at the National Institute of Standards and Technology. As a policy analyst Dr. Boehm was responsible for providing objective analysis and evaluation to the Director of NIST on a portfolio of issues related to the biological sciences, homeland security, and programs that enhance innovation and competitiveness, in support of NIST strategic planning and budget development.
Dr. Boehm came to NIST from the Office of Science and Technology Policy (OSTP), Executive Office of the President, where he was responsible for consultation, analysis, and policy development regarding science and technology related to multiple issues of homeland and national security including the development of medical and non-medical countermeasures against WMD, domestic nuclear defense, engineered threats and emerging infectious diseases, and biological and chemical agent decontamination, nuclear defense and detection, international collaborations on homeland security-related S&T, and a number of other issues. Dr. Boehm originally joined OSTP as a AAAS/NTI Fellow in Global Security, an award that provided him the opportunity to work anywhere within the U.S. government on issues related to biological terrorism.
Prior to joining the federal government Dr. Boehm was involved in cancer research at Cornell University, where he led a team of researchers studying the role of the cellular protein tissue transglutaminase in cell survival and tumorigenesis. Dr. Boehm received his Ph.D. in 2000 from the University of Nebraska Medical Center, Eppley Institute for Cancer Research, where he studied the role of receptor tyrosine kinase signaling in cell survival.
Dramatic decline found in Siberian tigers
The last remaining population of Siberian tigers has likely declined significantly due to the rising tide of poaching and habitat loss, according to a new report
This is a Siberian tiger photographed in the Russian Far East. A new report released today shows a dramatic decline in Russian tigers due to poaching and habitat loss.
Credit: Dale Miquelle/Wildlife Conservation Society
The Wildlife Conservation Society (WCS) has announced a report revealing that the last remaining population of Siberian tigers has likely declined significantly due to the rising tide of poaching and habitat loss.
WCS says the report will help inform Russian officials of what needs to be done to protect remaining populations of the world's biggest cat.
The report was released by the Siberian Tiger Monitoring Program, which is coordinated by WCS in association with Russian governmental and non-governmental organizations. It revealed that a recent tiger survey over a representative part of the tiger's range showed a 40 percent decline in numbers from a 12-year average.
Annual tiger surveys are conducted at 16 monitoring sites scattered across tiger range to act as an early warning system to detect changes in the tiger population. The monitoring area, which covers 9,000 square miles (23,555 square kilometers), represents 15-18 percent of the existing tiger habitat in Russia. Only 56 tigers were counted at these monitoring sites. Deep snows this past winter may have forced tigers to reduce the amount they traveled, making them less detectable, but the report notes a 4-year trend of decreasing numbers of tigers.
The total number of Siberian tigers across their entire range was estimated at approximately 500 individuals in 2005, having recovered from less than 30 animals in the late 1940s.
"The sobering results are a wake-up call that current conservation efforts are not going far enough to protect Siberian tigers," said Dr. Dale Miquelle, of the Wildlife Conservation Society's Russian Far East Program. "The good news is that we believe this trend can be reversed if immediate action is taken."
"Working with our Russian partners we are hopeful and confident that we can save the Siberian tiger," said Dr. John G. Robinson, WCS Executive Vice President for Conservation and Science. "The Siberian tiger is a living symbol for the people of Russia."
The authors of the report say the decline is due primarily to increased poaching of both tigers and their prey species in the region, coupled with a series of reforms in Russia, which reduced the number of enforcement personnel in key tiger areas.
Russian scientists and non-government organizations are recommending changes in law enforcement regulations, improvements in habitat protection, and a strengthening of the protected areas network to reverse the downward trend.
"While the results are indeed bad news in the short term, we believe the overall picture for Siberian tigers remains positive," said Colin Poole, director of Asia Programs for the Wildlife Conservation Society. "There is an enormous amount of good will for saving Siberian tigers. We just need to translate this into action."
WCS's conservation work in this region has been generously supported by: 21st Century Tiger, E. Lisk Wyckoff, Jr. and the Homeland Foundation; Save The Tiger Fund -- a partnership of the National Fish and Wildlife Foundation and the ExxonMobil Foundation; US Fish and Wildife Foundation; Robertson Foundation; Panthera; and the Liz Claiborne and Art Ortenberg Foundation.
Materials provided by Wildlife Conservation Society. Note: Content may be edited for style and length.
Wildlife Conservation Society. "Dramatic decline found in Siberian tigers." ScienceDaily. ScienceDaily, 24 November 2009. <www.sciencedaily.com/releases/2009/11/091124121429.htm>.
Optical fibers made from common materials
Researchers are taking common materials to uncommon places by transforming easily obtainable and affordable materials into fiber.
Clemson University researchers say sapphire is better for fiber optics than silica used now.
Credit: John Ballato, Clemson University
Clemson researchers are taking common materials to uncommon places by transforming easily obtainable and affordable materials into fiber. Their findings are published in Nature Photonics.
"We have used a highly purified version of beach sand (silica) for fiber for the last 40 years," said John Ballato, director of the Center for Optical Materials Science and Engineering Technologies at Clemson University. "As a matter of fact, the 2009 Nobel Prize in Physics was awarded for the development of silica optical fibers. However, while silica has done remarkably well over time, it is now being pushed to its limits for faster and cheaper data and new functionality."
It has gotten to the point where there is so much light packed in fiber cable that the silica material essentially can't handle the intensity and has actually begun interacting and rebelling.
"At high power, the light causes the atoms of the material to vibrate more violently and those vibrations convert some of the light energy into sound energy which restricts the ability of the fiber to carry more power," said Ballato. "This, in turn, lessens the amount of light that can travel through the fiber, which limits the amount of information that can be sent for telecommunications uses and power for high-energy laser applications,"
The demand for stronger and more durable fiber material is greater than ever and will only increase with technological advancement. Clemson researchers are focusing on providing a material solution for fiber optics, especially one that can be sold commercially. Their goal is to take a robust, affordable, and easily accessible material that can take the brunt of greater intensity and convert that material into a fiber.
Ballato and his team found that sapphire possesses extraordinary properties that make it exceptionally valuable for high power lasers in which the light intensity interacts with sound waves in the glass and leads to diminished power-handling capabilities.
"Sapphire is new and different in this sense because we're able to use a low-cost and widely used commodity as a fiber," said Ballato. "Sapphire is scalable, acceptable and is a material that people don't think about when it comes to fiber optics. The problem is that sapphire's crystalline structure is not amenable to making into optical fiber using commercially accepted methods."
Ballato actually developed the sapphire fiber to withstand greater intensity and be more useful for high-energy applications than typical commercial fibers.
"Ballato's recent results with sapphire fibers represent a paradigm-shifting development in the field of fiber optics," said Siddarth Ramachandran, associate professor in the electrical and computer engineering at Boston University and an expert in the field. "Materials long considered to be used only in the realm of free-space optics can now be exploited in fiber geometries, which enable long interaction lengths and novel nonlinear optical effects."
"This research is paving the way for everyday commodities to be imagined for technological uses such as fiber optics," Ballato said. "We're performing additional studies with sapphire and other materials that have similar effects for fiber."
Materials provided by Clemson University. Note: Content may be edited for style and length.
P. Dragic, T. Hawkins, P. Foy, S. Morris, J. Ballato. Sapphire-derived all-glass optical fibres. Nature Photonics, 2012; DOI: 10.1038/nphoton.2012.182
Clemson University. "Optical fibers made from common materials." ScienceDaily. ScienceDaily, 13 August 2012. <www.sciencedaily.com/releases/2012/08/120813103359.htm>.
GPS That Never Fails
A breakthrough in vision processing provides a highly accurate way to fill gaps in Global Positioning System availability.
by Tom Mashberg
Drive down a Manhattan street, and your car’s navigation system will blink in and out of service. That’s because the Global Positioning System (GPS) satellite signals used by car navigation systems and other technologies get blocked by buildings. GPS also doesn’t work well indoors, inside tunnels and subway systems, or in caves–a problem for everyone from emergency workers to soldiers. But in a recent advance that has not yet been published, researchers at Sarnoff, in Princeton, NJ, say their prototype technology–which uses advanced processing of stereo video images to fill in GPS gaps–can maintain location accuracy to within one meter after a half-kilometer of moving through so-called GPS-denied environments.
That kind of resolution is a major advance in the field, giving GPS-like accuracy over distances relevant to intermittent service gaps that might be encountered in urban combat or downtown driving. “This is a general research problem in computer vision, but nobody has gotten the kind of accuracy we are getting,” says Rakesh Kumar, a computer scientist at Sarnoff. The work was based partly on earlier research done at Sarnoff by David Nister, a computer scientist now at the University of Kentucky.Motilal Agrawal, a computer scientist at SRI International, in Menlo Park, CA, which is also developing GPS-denied location technologies, agrees, saying the advance essentially represents a five-fold leap in accuracy. “We haven’t seen that reported error rate before,” Agrawal says. “That’s pretty damned good. For us a meter of error is typical over 100 meters–and to get that over 500 meters is remarkable and quite good.”The approach uses four small cameras, which might eventually be affixed on a soldier’s helmet or on the bumper of a car. Two cameras face forward, and two face backward. When GPS signals fade out, the technology computes location in 3-D space by making calculations from the objects that pass through its 2-D field of view as the camera moves. This is a multi-step task. The technology first infers distance traveled by computing how a series of fixed objects “move” in relation to the camera image. Then it adds up these small movements to compute the total distance. But since adding up many small motions can introduce errors over time–a problem called “drift”–the software identifies landmarks, and finds the same landmarks in subsequent frames to correct this drift. This part of the technology is called “visual odometry.” Finally, the technology discerns which objects are moving and filters them out to avoid throwing off the calculations. It works even in challenging, cluttered environments, says Kumar.“The essential method is like how people navigate,” says Kumar. “When people close their eyes while walking, they will swerve to the left or right. You use your eyesight to know whether you are going straight or turning. Then you use eyesight to recognize landmarks.”While the general idea has been pursued for years, Sarnoff has achieved the one-meter accuracy milestone only in the past three months–an advance that will be published soon, says Kumar. It’s an important advance, says Frank Dellaert, a computer scientist at Georgia Tech. “This is significant,” he says. “The reason is that adding these velocities over time accumulates error, and getting this type of accuracy over such a distance means that the ‘visual odometry’ component of their system is very high quality.” Kumar says the technology also allows users–whether soldiers, robots, or, eventually, drivers–to build accurate maps of where they have been, and also to communicate with one another to build a common picture of their relative locations. Kurt Konolige, Agrawal’s research partner at SRI, of which Sarnoff is a subsidiary, says one goal is to reduce the computational horsepower needed to do such intensive processing of video images–something Kumar’s group is working on. 
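The dead-reckoning bookkeeping described above (adding up small frame-to-frame motions, then re-anchoring on a recognized landmark to cancel drift) can be sketched in a few lines of toy code. This is an illustration under stated assumptions, not Sarnoff's implementation: the per-frame motion estimates and landmark fixes are passed in as plain numbers, standing in for the stereo matching and landmark recognition steps the real system performs.

```python
import math

def compose(pose, delta):
    """Apply a small body-frame motion (dx, dy, dtheta) to a global (x, y, heading) pose."""
    x, y, th = pose
    dx, dy, dth = delta
    return (x + dx * math.cos(th) - dy * math.sin(th),
            y + dx * math.sin(th) + dy * math.cos(th),
            th + dth)

def integrate(motions, landmark_fixes=None):
    """Accumulate per-frame motion estimates; re-anchor whenever a landmark fix is available."""
    pose = (0.0, 0.0, 0.0)
    for i, delta in enumerate(motions):
        pose = compose(pose, delta)
        if landmark_fixes and i in landmark_fixes:
            fx, fy = landmark_fixes[i]      # known position of a recognized landmark
            pose = (fx, fy, pose[2])        # crude drift correction: snap to it
    return pose

# 1,000 half-meter steps with a tiny heading bias: the accumulated position error
# at the end is smaller when a mid-route landmark fix is applied.
steps = [(0.5, 0.0, 0.0005)] * 1000
print(integrate(steps))
print(integrate(steps, landmark_fixes={499: (250.0, 0.0)}))
```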
But if the size and cost could be made low enough, he says, “you could also imagine small devices that people could wear as they moved around a city, say, or inside a large building, that would keep track of their position and guide them to locations.”The technology, funded by the Office of Naval Research (ONR), is being tested by military units for use in urban combat. Dylan Schmorrow, the ONR program manager, says, “Sarnoff’s work is unique and important because their technology adds a relatively low cost method of making visual landmarks with ordinary cameras to allow other sensors to work more accurately.” Kumar says that while the first priority is to deliver mature versions of the technology to Sarnoff’s military sponsors, the next step will be to try to produce a version that can work in the automotive industry. He says the technology has not yet been presented to car companies, but “we plan to do that.” He adds that the biggest ultimate commercial application would be to beef up car navigation systems. Tom Mashberg | 科技 |
Oil from Wood
Startup Kior has developed a process for creating “biocrude” directly from biomass.
by Erika Jonietz
Dutch biofuels startup Bioecon and Khosla Ventures have launched a joint venture called Kior, which will commercialize Bioecon’s process for converting agricultural waste directly into “biocrude,” a mixture of small hydrocarbon molecules that can be processed into fuels such as gasoline or diesel in existing oil refineries. The process, Kior claims, boasts numerous advantages over other methods of producing biofuels: it could prove relatively cheap, relies on a nontoxic catalyst, taps into the present fuel-refining and transportation infrastructure, and produces clean-burning fuels that can be used in existing engines.
Fossil free: Biofuels startup Kior says that its new thermochemical process can produce “biocrude,” a hydrocarbon mixture that closely resembles crude oil, from almost any agricultural by-product, including the stems and stalks of corn plants (shown here).
Biofuels are widely seen as a key stepping-stone on the path from fossil fuels to renewable energy sources, particularly for transportation. Their use could also reduce emissions of carbon dioxide and other greenhouse gases. But ethanol, the most widely produced biofuel, contains little energy compared with gasoline or diesel. And a great deal of energy goes into its production: growing the grain from which it is fermented, distilling it, and transporting it. Many biofuels boosters have pinned their hopes on finding ways to produce ethanol from cellulose, the tough polymer that makes up much of plant stems and wood. In practice, though, cellulose must be broken down into simple sugars before it can be fermented into ethanol or converted into synthetic gas and turned into fuels. Despite three decades of research, these remain difficult, expensive, and energy-intensive processes that are not yet commercially viable. Additionally, recent research shows that ethanol, which is highly volatile, may actually exacerbate smog problems when it evaporates directly into the air instead of burning in vehicle engines. The way to make cellulosic biofuels viable, says Bioecon’s founder, Paul O’Connor, is to use catalysts to convert biomass into a hydrocarbon biocrude that can be processed into gasoline and diesel in existing petroleum refineries. After decades developing catalysts for the petroleum industry, O’Connor started Bioecon in early 2006 to develop methods for converting biomass directly into biofuels. His first success is a catalytic process that can convert cellulosic biomass into short-chain hydrocarbons about six to thirteen carbon atoms long. Khosla Ventures agreed to provide an undisclosed amount of series A funding to spinoff Kior in order to commercialize the process. Vinod Khosla, founder of the venture fund, believes that converting biomass into liquid transportation fuels is key to decreasing greenhouse-gas emissions and compensating for dwindling petroleum reserves. Khosla is funding a number of biofuels startups with competing technologies and says that Kior’s approach is unique. “They have some very clever proprietary catalytic approaches that are pretty compelling,” he says. “They can produce relatively cheap crude oil–that’s attractive.” The most effective method of converting biomass into fuel is to subject it to high temperatures and high pressure to produce synthetic gas, or syngas. In the presence of a catalyst, the syngas reacts to produce fuels such as ethanol or methanol (used as an additive in biodiesel). But this is a costly process, and catalysts able to withstand the high temperature of the syngas are expensive and frequently toxic.Attempts to produce fuel by directly exposing agricultural cellulose to a catalyst have had little success because most of the cellulose is trapped inside plant stems and stalks. O’Connor says that while the Bioecon researchers are developing new catalysts, their “biomass cracking” process is the real breakthrough. Using proprietary methods, they have been able to insert a catalyst inside the structure of the biomass, improving the contact between the materials and increasing the efficiency of the process. While O’Connor won’t go into details, he says that the most basic version of the technique might involve impregnating the biomass with a solution containing the catalyst; the catalyst would then be recrystallized. “What we’re doing now is improving the method to make it easier and cheaper,” O’Connor says.
Such a method would eliminate the need for the superhigh temperatures and toxic catalysts used in other thermochemical methods for cellulosic-biofuel production. While O’Connor says that he is still improving Kior’s catalyst, his first versions are different kinds of modified clays, which are both cheap and environmentally friendly. The product is high quality as well, containing less acid, oxygen, and water. These characteristics make it suitable for burning as heating oil or for use in petroleum refineries, which can use existing processes and equipment to convert it into the longer hydrocarbon chains of gasoline and diesel fuel. Bioecon has produced lab-scale quantities of its biocrude, a few grams at a time, from materials such as wood shavings, sugarcane waste, and various grasses. While the input material affects the yield somewhat, O’Connor says that the output is “all very similar, so we do not have a real preference.” This means that the process can work around the world, with whatever biomass is locally available, almost year-round. Kior is already in talks with at least two oil companies to establish partnerships to further develop the technology. It is starting a pilot plant with one company that should produce around 20 kilograms of biocrude a day within six to twelve months, says Kior CEO Rob van der Meij. If all goes well, the process could scale up to production of hundreds of kilos per day by 2009, and refined versions of Kior’s biocrude might be blended into gasoline or diesel by 2010. In addition to being renewable, these fuels would have lower sulfur and nitrogen content, which should decrease smog in cities such as Los Angeles and Houston. Because of its ability to slide into the existing petroleum refining and delivery infrastructure, the technology has a huge cost advantage, says O’Connor. It could also be adopted much more rapidly, according to Khosla. “If you can do a solution that’s compatible with the oil companies and their current refineries, it becomes much easier for them to get comfortable with it,” he says. “Getting them into the game would be a big addition.” Steve Deutch, a senior research scientist at the National Renewable Energy Laboratory, says that the little information Kior has released about its process is plausible enough, but that until the details are available, the company’s claims are “not really possible to evaluate.” The main challenge for Kior, or anyone working on cellulosic fuels, Deutch says, is to develop a process simple enough to bring close to the sources of biomass–farms. “Collecting biomass and getting enough of it in one place to make a difference is a problem in the biomass world,” Deutch says. “Trucking costs can become exorbitant. You want to preprocess it at the farm and then ship a high-density, high-energy intermediate to processing plants.”
Bob Allan, NREL
Erika Jonietz | 科技 |
New Ars feature: Media Reviews
We're launching a monthly series of book and film reviews here at Ars... and …
We've got some exciting changes planned for Ars in 2008, and we're pleased to announce one of them now: Ars Media Reviews. Over the years, we've occasionally reviewed books and movies, but in 2008, we'll start doing it in a systematic way—and we'd like your help. Once each month, we'll cover a title of interest to the tech community in the patented Ars style: in-depth, with plenty of analysis and context. We'll go beyond the standard review format by including an interview with the author/creator, too, along with reactions from our readers. From there, we'll turn it over to the community to hash out the issues raised by the piece in the discussion forum. We'll be announcing each month's title in advance, a process that should lead to plenty of intelligent discussion. Our first book will be The Very Best of Technology Writing 2007. It's a fitting title to launch our review series since it lends itself perfectly to discussing the practice and process of writing about technology, something that obviously concerns us (as writers) and you (as readers). I'll be talking with book editor Steven Levy, who is also a senior editor at Newsweek, for the review. Here's what we need from you to make this project even better: 1) Suggestions for book and film titles. We want to cover what's interesting to you; while the Ars editorial staff has already started drawing up our list for this year, we'd love community input. Leave ideas in the comments or email me directly. Titles need be worth extended discussion, so collections of tips (for instance) would probably be out of bounds. Documentaries on people who play competitive Donkey Kong? Fair game. 2) Reactions to the book. We'll be running the first review in mid-February. If you read the book before then, feel free to drop me a one-paragraph comment, question, or concern about any issue raised by the book; we'll include the best ones in the review itself, and any good questions I'll make sure to ask of Levy. Thanks for reading. Nate Anderson | 科技 |
Bill Totten's Weblog
Scary new best friend for politicians
Peak Oil in Australiaby Paddy ManningSydney Morning Herald (November 27 2010)HERE'S cold comfort. It would be impossible, according to the Swedish energy expert Kjell Aleklett, for us to emit enough greenhouse gas to warm the planet by six degrees: we don't have enough oil, coal or gas to burn."All the emissions scenarios that have been put forward over the last ten years are wrong", says Aleklett, professor of physics at the University of Uppsala and the world president of the Association for the Study of Peak Oil.The UN's Intergovernmental Panel on Climate Change business as usual forecast to 2100, which would result in six degrees of warming, assumes worldwide production of coal could rise ten times higher than today."That can never happen", says Aleklett, who is on an Australian speaking tour this month and was recently heard on the ABC's Science Show.Aleklett says coal production will peak about 2030, and China is peaking about now."Ninety per cent of all coal reserves in the world can be found in six countries: the US, India, China, Russia, South Africa and, of course, Australia."The whole carbon dioxide emissions problem is only six countries", he says. "Those are the drug dealers when it comes to selling coal. If these six countries would stop selling coal there'd be no problem at all."Aleklett has been working with a Newcastle University team studying peak fossil fuel production, led by Geoffrey Evans and culminating in a doctoral thesis by the engineer Steve Mohr, summarised on the Oil Drum website (and previewed here last year). Taking into account supply and demand, Mohr's "best guess" puts peak oil production in 2011-12, peak coal production by 2019 and peak gas production between 2028 and 2047.Aleklett says the Newcastle team is doing "a great job ... we get the same numbers". In the March issue of Energy Policy , he said oil production peaked in 2008. "We know that 2009 was lower, and it looks as if 2010 will be lower again", he says.The concept of a peak - maximum production - does not mean a date after which the world soon runs out of fossil fuels. It's about flow rates. "We are not running out", Aleklett says. "But we have a limit to supply. What we're running out of is the possibility of increased usage..Aleklett, a long-time critic of the International Energy Agency's forecasts, is looking increasingly on-the-money as demand figures have been wound back in the agency's 2010 World Energy Outlook, released this month.It stopped short of calling a peak in oil production but did lower its consumption forecasts.Last year it stressed the importance of oil for economic growth and concluded that 106 million barrels a day would be required by 2030, twenty million higher than today.In 2010 the IEA predicts only 99 million barrels a day by 2035 and avoids any discussion of economic growth. "We can interpret this as meaning desired economic growth is not possible", Aleklett says. He means it.Aleklett believes high oil prices - they peaked at $US147 in 2008 - helped trigger the global financial crisis, when oil-dependent home owners in the outer suburbs of America's cities began defaulting on their loans.Peak oil will limit economic growth: the IEA now sees oil consumption in Organisation for Economic Co-operation and Development countries falling by fifteen per cent by 2035. 
OECD nations, including Australia, will have to revise down their future consumption estimates.The Australian Bureau of Agricultural and Resource Economics has yet to bring its forecasts into line with those of the IEA. But Australia's own oil production is declining rapidly; by the end of the decade we could be reliant on imports for eighty per cent of our consumption, says the former Shell Australia and Australian Coal Association executive Ian Dunlop. Where will that oil come from? Can we outbid the likes of China and India?"The government's assumption there will always be oil available on the market is complete nonsense", he says. "Unless we're willing to pay a fortune".BHP Billiton's latest quarterly crude oil production figures showed drops across the board, propped up only by the new Pyrenees field in Western Australia.ABARE's Energy Resource Assessment last year listed just two other new Australian oil projects, including the Thai operator PTTEP's Montara/Skua field - the site of last year's disastrous spill.The Australian spokesman for the Association for the Study of Peak Oil, Phil Hart, says ABARE is in "economic fairyland", believing demand will always lead to supply. The same problem underpins the Intergovernmental Panel on Climate Change scenarios, which are modelled by economists who "assume business as usual is possible".Ultimately it is irrelevant whether we have enough fossil fuels to hit six degrees of warming. Global temperatures are already too high at 0.8 degrees above pre-industrial levels, and climate change is already dangerous.Recent Potsdam Institute analysis concluded 75 per cent of the world's fossil fuel reserves must be left in the ground if we are to keep warming to two degrees. "There's more than enough coal, oil and gas to get us into trouble", says Hart. We need to stop burning fossil fuels now (and, the implication is, avoid wasting billions on carbon capture and storage, which uses extra energy and will only accelerate resource depletion).The ramifications are profound: we need to electrify transport, and shift from private to public transport. Aviation may be a sunset industry. No more toll roads with wildly optimistic demand projections. No more hideously expensive airports.Does the federal government get it? Aleklett says politicians should welcome the concept of peak oil, which is physical reality: "Peak oil should be politicians' best friend. It is something they cannot fix."_____The same article is also published in Melbourne's newpaper The Age (same publisher as Sydney Morning Herald), but with a different title - Emissions scenarios are based on flawed assumptions, says energy expert.http://aleklett.wordpress.com/2010/11/27/scary-new-best-friend-for-politicians-peak-oil-in-australia/Bill Totten http://www.ashisuto.co.jp/english/
posted by Bill Totten at 10:01 PM 0 Comments:
UK, India form bioenergy research collaboration
November 21, 2011
Research teams in the U.K. and India will now work together to overcome three areas responsible for slowing the growth of biobased energy. Using a $10 million grant provided by the UK Biotechnology and Biological Sciences Research Council (BBSRC) and the Indian Government Department of Biotechnology (DBT), the BBSRC says U.K. and Indian researchers will work to address the following bioenergy hurdles:
- The identification, characterization and improvement of new enzymes for processing plants like algae.
- Microbe and systems development based on synthetic biology that can produce advanced biofuels from woody biomass or algae.
- Genomic alteration to improve algae strains, making those strains more suitable for biofuel production.
David Willetts, Minister of State for Universities and Science, called the collaboration “vital,” pointing out that using the expertise of both countries will help further develop bioenergy-based alternatives to fossil fuel use in both countries. The collaboration between the U.K. and India is the result of an October bioenergy workshop held in New Delhi, and as part of the collaboration any research effort funded by the joint-venture must have a researcher from both countries.
Although plants have the ability to provide alternatives to fossil fuels in the forms of transportation energy, biobased chemicals and plastics, Douglas Kell, BBRSC chief executive, says, “Doing this (creating bioenergy from plants) sustainably, ethically and economically will require the application of scientific expertise from around the world,” adding that “India and the U.K. are countries with strong and complementary research communities and it is exciting to be able to harness the skills of researchers from both nations to address these important global challenges.”
In October, the BBSRC opened a research center at the University of Nottingham that will allow researchers to study biofuels as well as the food and drink industry. The facility will be one of six BBSRC-led bioenergy centers that will work on biofuels from both industrial and agricultural waste materials.
Tag: SOHO
As promised: Jupiter and moons seen by SOHO
By Phil Plait | May 17, 2012 1:56 pm
A little while back, I wrote about Jupiter appearing in an image from NASA's SOHO Sun-observing satellite. I promised that it would soon appear in a SOHO camera that had higher magnification, and we'd be able to see its moons.
I am not one to break promises:
Awesome. It helps to set the resolution to 720p to see the moons when they’re pointed out.
And just you wait: in early June, Venus will appear in the LASCO C3 and C2 cameras, on its way for a date transiting the Sun for the last time in over a century. I’ll have more about that event in a few days… I promise!
Tip o’ the occulting bar to SungrazerComets on Twitter.
– Jupiter, acting all superior
– Lovejoy lives!
– The Sun fries a comet and we got to watch
– The Galilean Revolution, 400 years later
Jupiter, acting all superior
By Phil Plait | May 6, 2012 7:00 am
This is a cool picture:
What you’re seeing is from the NASA/ESA satellite Solar and Heliospheric Observatory, or SOHO. It stares at the Sun all the time, monitoring its activity. This image, from May 3, 2012 is from the LASCO C3, one of the cameras on board. It has a little metal paddle (called an occulter) to block the ferocious light of the Sun; that’s the black bar and circle. The white outline is the position of the Sun and its size in the image.
You can see an emerging coronal mass ejection on the left: that’s the bulb-shaped thingy. It’s actually an incredibly violent expulsion of a billion tons of subatomic particles hurled away at high speed due to the explosive discharge of the Sun’s magnetic field… but that’s not why I posted this picture.
You can also see streamers coming from the Sun; those are places where particles flow freely into space from the Sun. Basically, the magnetic field of the Sun trails into space in those locations, allowing the wind to escape. But that’s not why I’m showing you this picture, either.
Look on the left. See that weird dot with the horizontal line through it? That’s Jupiter! The line is not real; it’s where the camera got overexposed by the planet (digital detectors — like your phone camera — convert photons of light into electrons, and if a source is too bright, the electrons overflow the pixels like water from a bucket. The way the camera works, the electrons flow along the horizontal grid of pixels, creating these lines. This is called "blooming").
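The overflow described above is easy to caricature in code. The sketch below is only a cartoon of blooming, with an assumed full-well value and the excess charge pushed along one row; real CCD readout and anti-blooming hardware are more involved.

```python
import numpy as np

def bloom(image, full_well=65_535.0):
    """Cartoon of blooming: charge above a pixel's capacity spills into the next
    pixel along the row until it is used up or reaches the edge of the frame."""
    img = image.astype(float).copy()
    rows, cols = img.shape
    for r in range(rows):
        for c in range(cols):
            excess = img[r, c] - full_well
            if excess > 0:
                img[r, c] = full_well
                if c + 1 < cols:
                    img[r, c + 1] += excess
    return img

frame = np.zeros((5, 9))
frame[2, 4] = 500_000.0              # one hugely overexposed "Jupiter" pixel
print(bloom(frame)[2])               # saturation smears along the row
```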
Jupiter has been gracing our sky for months, but has been getting further west every night, closing the apparent distance between it and the Sun. It’s on the opposite side of the Sun from us, at a distance of almost 900 million kilometers (550 million miles). When two objects get close in the sky, it’s called a conjunction. When it’s a planet on the far side of the Sun, it’s called superior conjunction. Just so’s you know.
Anyway, I just think this is neat. Jupiter is roughly one-billionth as bright as the Sun, yet there it is in the picture! And even though SOHO is designed to look at the Sun, Jupiter is so bright it’s overexposed. Imagine if the spacecraft moved a bit and the Sun were to peek out from behind the occulter… which can happen. SOHO goes into "safe mode" when that happens, shutting down systems that might get damaged. Every astronomical satellite has contingency plans like that, since it’s hard to send a repair service to most of ’em. Generally it’s fixable by sending software commands to the spacecraft once the underlying problem has been ascertained.
If you want, SOHO has images online that are updated constantly. Go see what the Sun is doing now! Over the next few days Jupiter will get closer to the Sun, then pass very close to or even behind the disk. LASCO 2, another camera on SOHO that has a smaller field of view but a bit more resolution, should show the moons too when Jupiter moves into its field. I’ll post again when that happens. That’ll be even neater.
Image credit: NASA/ESA/SOHO
The Sun is 1,392,684 +/- 65 km across!
By Phil Plait | March 21, 2012 7:00 am
You would think that, of all the astronomical measurements we could make, one of the best known would be the diameter of the Sun.
You’d be wrong. It’s actually really hard to measure! For one thing, we sit at the bottom of an ocean of air, gas that is constantly moving around, mucking up precise measurements. For another, our telescopes themselves can be difficult to calibrate to the needed accuracy to get a really solid measurement of the size of the Sun.
Nature, however, provides us with a way to measure our nearest star. Naturally! And it involves Mercury, the smallest and, critically, the closest planet to the Sun.
In 2003, and again in 2006, Mercury passed directly across the face of the Sun as seen from Earth (the picture above is a view of the 2006 transit as seen by SOHO; a very short animation was made from this as well). Mercury’s orbit is tilted a little bit with respect to Earth’s, so these transits don’t happen terribly often, occurring only every few years. But because we know the orbit of Mercury so well, and our own distance from the Sun, by precisely timing how long it takes the diminutive world to cross the Sun, we can get a very accurate measurement of the Sun’s diameter.
A team of scientists did exactly this, using SOHO, which is a solar observing and solar-orbiting satellite. Because it’s in space, it doesn’t suffer from the problems of peering through a murky, dancing atmosphere. They were able to measure the timing of Mercury’s passage of the Sun to an accuracy of 3 seconds in 2003 and 1 second in 2006. They had to take into account a large number of effects (the Sun’s limb is darker than the center, which affects timing; they had to accurately measure the position of Mercury; they had to account for problems internal to SOHO like focus and the way it changes across the detector; and, of course, correct for the fact that Mercury cut a chord across the Sun and didn’t go straight across the diameter — but that only took knowledge of Mercury’s orbit and some trig) but when they did, they got the most accurate measure of the Sun’s diameter ever made: 1,392,684 +/- 65 km, or 865,374 +/- 40 miles.
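The geometry behind that measurement reduces to a right triangle: half the length of the chord Mercury traces, together with the chord's offset from the centre of the disc, gives the solar radius. The numbers below are purely illustrative stand-ins (the published analysis also folds in limb darkening, focus, and Mercury's exact path), but they show the arithmetic, including why a 65 km uncertainty works out to the accuracy quoted next.

```python
import math

def diameter_from_chord(chord_km, offset_km):
    """Solar diameter from a transit chord's length and its distance from the disc centre."""
    radius = math.hypot(chord_km / 2.0, offset_km)
    return 2.0 * radius

# Hypothetical transit geometry: a 1.30-million-km chord passing 250,000 km
# from the centre of the disc.
print(f"{diameter_from_chord(1.30e6, 2.5e5):,.0f} km")

# The quoted 65 km uncertainty as a fraction of the measured diameter:
print(f"{65 / 1_392_684:.4%} of the diameter")   # about 0.005%, i.e. roughly 99.995% accuracy
```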
That uncertainty of 65 km is quite a bit better than what can be done from the ground, amazingly. It may sound like a lot, but it actually represents an accuracy of 99.995%! The Sun is big. Really, really big.
… and they’re not done. The authors are going to observe the Transit of Venus coming up in June, hoping it’ll improve their measurements. I’ll be very curious to see how that goes; Venus has an atmosphere which I would think would confound the observations. They may have ways around that though.
Either way, I think this is completely fascinating. Even thrilling! The Sun is the brightest thing in the sky, the center of our solar system, the basis of light and heat and life on Earth, and the best-studied star in the Universe.
And here we are, just now figuring out how big it is. Sometimes the simplest things can be the hardest, I suppose.
Image credit: NASA/ESA/SOHO
– Giant sunspots are giant
– The Sun’s angry red spot
– For your viewing pleasure: Active Region 1302
– The boiling, erupting Sun
The Sun ate another comet
By Phil Plait | March 16, 2012 7:00 am It’s tough to be a comet.
You spend most of the time — billions of years, really — out in deep space where it’s cold and dark. Of course, since you’re mostly made of ice, that’s not so bad. After all, the Sun is hot, and if you venture too close…
Well, you know what happens then. And such was the fate of Comet SWAN, discovered just a few days ago as it plunged headlong into the seething fires of the Sun. And I have video!
That was made from images taken by NASA’s SOHO satellite. In fact, the comet is named SWAN because it was first seen in the SOHO SWAN camera, designed to look for ultraviolet light coming from hydrogen. Here’s the thing: no comet has ever been seen before in that camera, including the phenomenally bright comet Lovejoy from a few months ago. But Lovejoy got incredibly bright overall, while this new comet never did brighten much. Comet SWAN must have undergone some sort of outburst to make it so bright and then fade again; that’s happened before.
Here’s another shot of it from SOHO:
[Click to enhalleyenate.]
Comets like these are called Kreutz family Sun grazers, a collective group of comets on similar orbits that take them very close to the Sun’s surface. Some survive, like Lovejoy did, and some… don’t.
The Sungrazing Comets site has lots more info on this weird comet and its untimely death. You can also follow SungrazerComets on Twitter for current news on these doomed chunks of ice.
Image credit: NASA/SOHO. Music in the video was "Heavy Interlude" by Kevin MacLeod, used under Creative Commons license from incompetech.com.
– Amazing video of comet on a solar death dive
– The comet and the Coronal Mass Ejection
– Amateur astronomer discovers sungrazing comet
– One more Lovejoy time lapse… maybe the last
The Sun aims a storm right at Earth: expect aurorae tonight!
By Phil Plait | January 24, 2012 6:00 am
Around 04:00 UTC on Monday morning, January 23, 2012, the Sun let loose a pretty big flare and coronal mass ejection. Although there have been bigger events in recent months, this one happened to line up in such a way that the blast of subatomic particles unleashed headed straight for Earth. It's causing what may be the biggest space weather event in the past several years for Earth: people at high latitudes can expect lots of bright and beautiful aurorae.
I’ll explain what all that is in a second, but first here’s a video of what this looked like from NASA’s SOHO satellite.
Wow! Make sure you set it to high def.
So what happened here? The sunspot cluster called Active Region 11402 happened.
Sunspots are regions where the magnetic field lines of the Sun get tangled up. A vast amount of energy is stored in these lines, and if they get squeezed too much, they can release that energy all at once. When this happens, we call it a solar flare, and it can be mind-numbing: yesterday’s flare exploded with the energy of hundreds of millions of nuclear bombs!
In the image above, the sunspots are caught in mid-flare, seen in the far ultraviolet by NASA's Solar Dynamics Observatory (it's colored green to make it easier to see what's what). We think of sunspots as being dark (see the image of AR 11402 below), but that's only in visible light, the kind we see. In more energetic ultraviolet light, they are brilliantly bright due to their magnetic activity.
A huge blast of subatomic particles was accelerated by the explosion. The first wave arrived within a few hours of the light itself… meaning they were traveling at a significant fraction of the speed of light!
But shortly after the flare there was a coronal mass ejection: a larger scale but somewhat less intense event. This also launches particles into space, and these are aimed right at us. The bulk of the particles are traveling at slower speeds — a mere 2200 km/sec, or 5 million miles per hour — and are expected to hit us at 14:00 UTC Tuesday morning or so. That's basically now as I write this! Those particles interact with Earth's magnetic field in a complicated process that sends them sleeting down into our atmosphere. We're in no real danger from this, but the particles can excite the electrons in atoms high in the air, and when the electrons give up that energy the atoms glow. That's what causes the aurorae — the northern and southern lights.
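For a rough sense of the timing, here is a back-of-the-envelope sketch (my own illustration, not from the original post): it converts the quoted CME speed into a straight-line Sun-to-Earth travel time. Real CMEs decelerate as they plow through the solar wind, so the actual arrival comes later than this constant-speed estimate.

# Constant-speed Sun-to-Earth travel time at the quoted CME speed.
SUN_EARTH_DISTANCE_KM = 1.496e8   # about 1 astronomical unit
cme_speed_km_s = 2200             # speed quoted in the post

travel_time_hours = SUN_EARTH_DISTANCE_KM / cme_speed_km_s / 3600
print(f"Constant-speed travel time: {travel_time_hours:.0f} hours")   # about 19 hours

That under-a-day lower bound is why forecasters get only a short warning between seeing the eruption and the particles arriving.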
If you live in high latitudes you might be able to see quite the display when it’s dark — people in eastern Europe and Asia are favored for this, since this happens after sunset there. But the storm is big enough and will probably last long enough that everyone should check after dark: look north if you live in the northern hemisphere and south if you’re south of the Equator. There’s no way in advance to know just how big this will be; it might fizzle, or it might be possible to see it farther away from the poles than usual. Can’t hurt to look! Also, Universe Today has been collecting pictures of aurorae from the solar blast earlier this week. No doubt they’ll have more from this one as well.
Although big, this flare was classified by NASA as being about M9 class — powerful, but not as energetic as an X class flare. One of those popped off last September, and shortly after that a smaller M flare erupted, which also triggered a gorgeous plasma fountain called a filament on the Sun’s surface.
As I said, we're in no real danger here on Earth, and Universe Today has a good article describing why the astronauts are probably not in danger on the space station, either. Even if this were a larger storm, the astronauts can take shelter in more well-protected parts of the station, too. Bigger storms can hurt us even on Earth by inducing huge currents in power lines which can overload the grid. That does happen — it happened in Quebec in March of 1989 — and it may very well happen again as the Sun gets more active over the next few years. [UPDATE: a ground current surge from today's event was reported in Norway.]
But we should be OK from this one. If you can, get outside and look for the aurorae! I’ve never seen a good one, and I’m still hoping this solar cycle will let me see my first.
Image credit: NASA/SOHO; NASA/SDO
– Awesome X2-class solar flare caught by SDO
– Gorgeous flowing plasma fountain erupts from the Sun
– NASA’s guide to solar flares
The Sun fries a comet and we got to watch
By Phil Plait | January 19, 2012 4:11 pm

In July of last year, I wrote about a comet that passed extremely close to the Sun. Astronomers have now had a chance to pore over that data, and were able to determine some very cool stuff.
First, here’s the video of the comet’s fiery demise (watch it in HD to make it easier to spot the comet):
See it? It’s faint, but there. Actually, there are a lot of observations from multiple observatories and detectors, which allowed astronomers to find out quite a bit about this doomed chunk of ice and rock.
For one thing, it was screaming along at about 650 kilometers per second (400 miles/second) as it flamed out. To give you an idea of how flippin’ fast that is, it would’ve crossed the entire United States in about eight seconds.
It also passed an incredible 100,000 km (62,000 miles) above the Sun’s surface. Have you ever stood outside on a hot day, and thought the Sun would cook you? Now imagine the Sun filling half the sky. That’s what that comet saw. No wonder it disintegrated.
As it approached the Sun, it was watched by NASA’s Solar Dynamics Observatory. In its final 20 minutes or so, the comet broke up into a dozen pieces ranging from 10 – 50 meters in size (and no doubt countless smaller ones too small to detect), with a tail of vaporized material streaming behind it that went for thousands of kilometers. For that size, it would’ve had a mass of hundreds of thousands of tons — about what a loaded oil tanker weighs on Earth!
We’ve learned a lot about how comets break up and disintegrate by observing this event, but it’s raised further questions: like, why did we see this at all? Comets are faint, and to be able to see it this way against the bright Sun is odd. It was definitely one of the brightest comets seen, but it’s interesting to me that it appears to glow in the ultraviolet, as it did in the above video. That means, at that wavelength, it was brighter than the Sun! It wasn’t like a meteor, burning up as it slammed through material, so some other process must have affected it. I suspect that the Sun’s strong magnetic field may have had something to do with it; in the far ultraviolet magnetism is a strong player. Gas under the influence of intense magnetic fields can store a lot of energy, which is why sunspots — themselves the product of magnetic squeezing — look bright in UV.
Perhaps as the comet broke up, the particles inside got excited by the magnetic fields of the Sun and glowed. I’m no expert, and I’m spitballing here. The thing is, no one is exactly sure. But that doesn’t mean we won’t find out. Nothing makes a scientist’s noggin itch as much as a mystery like this, something apparently misbehaving.
One of the single most important words in science is "yet". We don’t know yet. But we will. Someone’ll figure this out, and we’ll have one more victory in our quest to better understand the Universe.
Science! I love this stuff.
Credits: NASA/SDO; SOHO (ESA & NASA)

Related posts:
– NASA’S SDO captures final moments of a comet streaking across the Sun
– Ten Things You Don’t Know About Comets
A celestial visitor, seen from space
By Phil Plait | December 22, 2011 11:26 am

I know I post a lot of pictures I describe as amazing, lovely, breath-taking, jaw-dropping… but that's only because it's always true. In this case, though, I think those adjectives fall way, way short in describing the seriously paralyzing beauty of this photograph: Comet Lovejoy, as seen by an astronaut on board the International Space Station:
[Click to encomanate — and yes, you need to.]
Oh. My.
This stunning photo was taken by astronaut Dan Burbank as the ISS passed over Australia at 17:40 GMT on December 21, 2011 [update: more pix here]. It was early morning over Australia at the time, and you can see the dark limb of the Earth, the thin green line of airglow (atoms in the upper atmosphere slowly releasing the energy they accumulated over the day), some southern hemisphere stars… and of course, the incredible, ethereal, other-worldly beauty of Comet Lovejoy, its tails sweeping majestically into the sky.
Wait, what? "Tails", plural? Yup. Hang on a sec. I’ll get to that.
First, the comet was discovered by amateur astronomer Terry Lovejoy in November. It turned out to be a sungrazer, a comet whose orbit plunges it deep into the inner solar system and very close to the Sun’s surface. It screamed past our star last week, on December 15/16, and, amazingly, survived the encounter. Some sungrazers do and some don’t, but Lovejoy is bigger than usual for such a comet, and that may have helped it remain intact as it passed less than 200,000 km over the Sun’s inferno-like surface.
Now the comet is moving back out, away from the Sun and back to the frozen depths of deep space. But the Sun’s heat, even from its greater distance now, is not to be denied. Comets are composed of rock and ice — the ice being what we normally think of as liquid or gas, like ammonia, carbon dioxide, and even good ol’ water. The heat from the Sun turns that ice directly into a gas (in a process called sublimation), which expands around the solid nucleus of the comet, forming what’s called the coma. Pressure from sunlight as well as the solar wind blows this material away from the comet head, resulting in the lovely tail, which can sweep back for millions of kilometers.
Lovejoy lives!
By Phil Plait | December 16, 2011 7:00 am

Comet Lovejoy was only discovered in late November, but it's had quite a ride. It was quickly determined to be a Sun-grazer, the kind of comet that plunges down very close to the Sun in its orbit. The date of this solar close encounter: yesterday!
That’s a shot of it using SOHO, a solar observatory orbiting the Sun. The Sun itself is blocked by a mask, and the white circle represents its outline. The comet is obvious enough! The line through the top of it is not real; that’s called blooming and it happens sometimes when a bright object is seen by a digital detector. The electrons in the chip overflow the pixels and leak into adjacent ones. The comet got very bright as it neared the Sun, almost as bright as Venus! This picture, taken on December 15th at 22:36 UT, was shortly before closest approach: a mere 180,000 km (110,000 miles) from the Sun’s searing surface.
Amazingly, after the comet screamed past the Sun, and to the surprise of many, it survived. A lot of comets don’t make it through such an event, but this one did. Here’s a video of the comet reappearing from behind the Sun, as seen by SDO; watch closely or you’ll miss it!
Nifty. But on the way down it had several interesting things happen to it.
The comet and the Coronal Mass Ejection
By Phil Plait | October 4, 2011 9:30 am

On October 1, a bright comet screamed into the Sun, and apparently disintegrated. This happens pretty often, actually, but in this case, just minutes later, the Sun blew out a pretty hefty coronal mass ejection, a huge explosion of magnetic energy that can release billions of tons of material.
Some people have speculated that these two things are related (including times when this has happened in the past). Are they? We have videos of the event from three different satellites, giving us three angles on what happened, providing clues on what really occurred.
To shed some light on this — haha — I made a short video explaining this, including the footage of the comet collision and CME as seen by the three satellites:
[It helps to set the video resolution to 720p to see the details in the satellite views.]
So my guess is that while it's possible, it's not probable. CMEs happen all the time, so I'd expect a few to happen around the same time as comets flying past the Sun just by coincidence. We don't have any physical reason to think they're related, and when they are examined more closely, the CMEs usually don't come from a spot near the Sun where the comet traveled. Still, it's worth looking into, at least to build up a statistical case one way or the other.
The folks at SOHO — the Solar Heliospheric Observatory — have a post up with more info. Also, if you want to see the three satellite videos on their own, here is the SOHO video
the STEREO A video, and the STEREO B video.
Very special thanks to SungrazerComets on Twitter for making the three original satellite animations. That’s a good stream to follow if you want the latest on comets making death dives onto our star. [UPDATE: @SungrazerComets just posted an excellent and thorough article about this topic, too!]
Image credits: NASA, SOHO, STEREO
– The Sun blasts out a flare and a huge filament
– Solar storm tracked all the way from the Sun to Earth
– STEREO sees an ethereal solar blast
Summer solstice 2011
By Phil Plait | June 21, 2011 6:30 am

Today, June 21, 2011, at 17:16 UTC (1:16 p.m. Eastern US time), the Sun will reach its peak in its northward travels this year. This moment is the summer solstice — I describe this in detail in an earlier post. Technically, that article is for the winter solstice, but the idea's the same. Just replace "winter" with "summer" and "December" with "June" and "south" with "north". That should be clear enough. It might be easier just to multiply the entire article by -1. Or stand on your head.
Since for the majority of people on the planet this day marks the start (or more commonly the midpoint) of summer, enjoy the gallery below that shows our nearest star doing what it does best: giving us light, giving us beauty, and sometimes, blowing its top.
Use the thumbnails and arrows to browse, and click on the images to go through to blog posts with more details and descriptions.
Half of hidden heritability found (for height, at least)
This is a quite interesting paper, as it shows, by sampling a large number of individuals, that the heritability of height is not missing after all. The large sample allowed the authors to discover statistically significant associations between height and more SNPs than before.

This bears great promise, as it may hint that genome-wide association studies, which have come under substantial criticism lately, may be failing not because of an inherent flaw, but rather because they are not sampling enough individuals.

The discovered SNPs account for 45% of the heritability of height. Where is the rest? The authors argue for two additional sources:

First, SNPs in current microarray chips sample the genome incompletely. Locations in between genotyped SNPs are in incomplete linkage disequilibrium with them. So, there is undetected polymorphism, in the gaps between the hundreds of thousands of SNPs in current chips, that may explain a portion of the missing heritability.

Second, SNPs have different minor allele frequencies. For example, at one SNP the minor allele may have a frequency of 10%, while at another it is 30%. This is important, because it is more difficult to arrive at a statistically significant result in the former case. Consider a SNP with a minor allele frequency of 2%. Then, if you sample 1,000 individuals, only about 40 of them are expected to carry the minor allele (nearly all as heterozygotes). You cannot estimate the average height of minor-allele carriers from about 40 people as securely as you can for a common allele carried by several hundred. Thus, if the SNP influences height in a small way, you will not be able to detect it.

A further complication, which I've written about before, is that some variation in the human genome is family-related, or at least occurs in fewer individuals than the allele frequency cutoff. If 99.9% of people have C at a given location and 0.1% of people have T, this variant is unlikely to be included in a microarray chip, because it is too rare to matter economically: you would only get a handful of individuals (if you're lucky) in a sample of 1,000 for such a variant. However, rarity does not mean that the variant is functionally unimportant, and the rare allele may play a substantial role in the height of the people who possess it.

The publication of this paper is a cause for optimism, as it shows that progress can be made by brute force: fuller genome coverage and more individuals. We'll have to wait and see whether or not the same approach will work for other complex traits, such as IQ or schizophrenia, that have been hitherto difficult to crack. Obviously, the cost of sampling more individuals will become an issue in future studies, but the cost per individual is expected to drop. So, I'm guessing that more discoveries are in store for us in the next few years.

UPDATE (Jun 28): Not the main point of the paper, but also included in the supplementary material (pdf) are some nice PCA results.

In the European-only PCA we see the familiar north-south gradient (anchored by Tuscans TSI and Netherlands NET on either side), and the orthogonal deviation of the Finns. Swedes (SWE) occupy the northern European end of the spectrum like the Dutch, but are spread towards Finns, reflecting low-level Finnish admixture in that population. Conversely, Finns are variable along the same axis, reflecting variable levels of admixture. Australians (AUS) and UK, on the other hand, are on the northern European edge of the main European gradient, with a number of individuals spread toward the Tuscan side.

The PCA with all populations is also quite interesting. East Eurasians (Chinese and Japanese) form a tight pole at the bottom right. Gujarati Indians (GIH) form a different pole, spread towards Europeans, reflecting variable levels of West Eurasian admixture in that population, probably corresponding to the ANI element recently discovered in Indian populations. Mexicans (MEX) are spread towards East Asians, reflecting their Amerindian admixture, but notice how they are not positioned exactly on the European-East Asian axis, probably reflecting the third, minority, Sub-Saharan element in their ancestry, as well as the fact that Amerindians are not perfectly represented by East Asians. Finns are tilted towards East Asians, as expected, reflecting the fact that their genetic specificity vis-a-vis Northern Europeans is due to low-level East Eurasian ancestry.

An interesting aspect of the first two PCs is the fact that the Maasai (MKK) and Luhya (LWK) from Kenya are not separated from Caucasoids, and neither are Yoruba from Nigeria (YRI). This is a good reminder of the fact that identity in the first two principal components may mask difference revealed in higher order components. This difference (at least for Maasai) is seen in the next two PCs.

Nature Genetics doi:10.1038/ng.608

Common SNPs explain a large proportion of the heritability for human height

Jian Yang et al.

Abstract

SNPs discovered by genome-wide association studies (GWASs) account for only a small fraction of the genetic variation of complex traits in human populations. Where is the remaining heritability? We estimated the proportion of variance for human height explained by 294,831 SNPs genotyped on 3,925 unrelated individuals using a linear model analysis, and validated the estimation method with simulations based on the observed genotype data. We show that 45% of variance can be explained by considering all SNPs simultaneously. Thus, most of the heritability is not missing but has not previously been detected because the individual effects are too small to pass stringent significance tests. We provide evidence that the remaining heritability is due to incomplete linkage disequilibrium between causal variants and genotyped SNPs, exacerbated by causal variants having lower minor allele frequency than the SNPs explored to date.

Link
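To make the minor-allele-frequency argument in the post concrete, here is a minimal simulation sketch (my own illustration, not code from Yang et al.; the sample size, effect size, and height standard deviation are made-up round numbers). It compares how precisely the per-allele effect on height can be estimated for a rare (2%) versus a common (30%) variant with the same true effect and the same sample size.

import numpy as np

rng = np.random.default_rng(0)

def effect_estimate_spread(maf, n=1000, beta=0.5, sd=7.0, reps=2000):
    # Spread (SD) of the estimated per-allele effect across simulated GWAS samples.
    estimates = []
    for _ in range(reps):
        genotypes = rng.binomial(2, maf, size=n)           # 0/1/2 copies of the minor allele
        height = beta * genotypes + rng.normal(0, sd, n)   # true effect: 0.5 cm per allele
        slope = np.polyfit(genotypes, height, 1)[0]        # simple regression estimate
        estimates.append(slope)
    return np.std(estimates)

for maf in (0.02, 0.30):
    print(f"MAF {maf:.2f}: spread of effect estimates ~ {effect_estimate_spread(maf):.2f} cm")

With these made-up numbers the rare variant's effect estimate comes out roughly three times noisier, which is the intuition behind why small effects at low-frequency SNPs fail to reach genome-wide significance unless the sample is much larger.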
Australians (AUS) and UK, on the other hand, are on the northern European edge of the main European gradient, with a number of individuals spread toward the Tuscan side.The PCA with all populations is also quite interesting. East Eurasians (Chinese and Japanese) form a tight pole at the bottom right. Gujarati Indians (GIH) form a different pole, spread towards Europeans, reflecting variable levels of West Eurasian admixture in that population, probably corresponding to the ANI element recently discovered in Indian populations. Mexicans (MEX) are spread towards East Asians, reflecting their Amerindian admixture, but notice how they are not positioned exactly on the European-East Asian axis, probably reflecting the third, minority, Sub-Saharan element in their ancestry, as well as the fact that Amerindians are not perfectly represented by East Asians. Finns are tilted towards East Asians, as expected, reflecting the fact that their genetic specificity vis a vis Northern Europeans is due to low-level East Eurasian ancestry.An interesting aspect of the first two PCs is the fact that the Maasai (MKK) and Luhya (LUW) from Kenya are not separated from Caucasoids, and neither are Yoruba from Nigeria (YRI). This is a good reminder of the fact that identity in the first two principal components may mask difference revealed in higher order components. This difference (at least for Maasai) is seen in the next two PCs.Nature Genetics doi:10.1038/ng.608Common SNPs explain a large proportion of the heritability for human heightJian Yang et al.AbstractSNPs discovered by genome-wide association studies (GWASs) account for only a small fraction of the genetic variation of complex traits in human populations. Where is the remaining heritability? We estimated the proportion of variance for human height explained by 294,831 SNPs genotyped on 3,925 unrelated individuals using a linear model analysis, and validated the estimation method with simulations based on the observed genotype data. We show that 45% of variance can be explained by considering all SNPs simultaneously. Thus, most of the heritability is not missing but has not previously been detected because the individual effects are too small to pass stringent significance tests. We provide evidence that the remaining heritability is due to incomplete linkage disequilibrium between causal variants and genotyped SNPs, exacerbated by causal variants having lower minor allele frequency than the SNPs explored to date.Link
Heritability,
Hi, Dienekes. Interesting; both the paper (for what I could read) and your analysis. I imagine however that it will be most difficult to do GWAS when many or most traits' genetics are distributed probably in so many small subsets of people, right? It would seem (on first sight at least) that many traits are not defined by single instances of widespread genes but by a wide variety of roughly synonymous but different genes, right?...On a separate matter and rather a minor issue, I'd like if someone could clarify my perplexity at supp. fig. 2a (the supplementary material is freely available, it seems). The legend "explains" that the graphs are "principal component analysis (PCA) of ancestry", where the Australian sample (and only them) was not used to generate the PC space but introduced on it after that. And that for fig. 2a: "The major trend, Principal Component (eigenvector, PC) 1, tends to separate African from non-African population while PC2 separate East Asian from the others".That would be what one could expect (based on all other data I know) but the actual plot shows that PC1 separates East Asians from the rest and PC2 Indians (GIH) from the rest, with Europeans and African clumping together in the same corner. I'm totally amiss if this is an error or what and would love if someone could explain.
There are three different African populations, so I tend to think that the figure shows what it shows, i.e., Gujarati Indians on the top. Africans are indeed cleanly separated from non-Africans on PC1; however, it is just that there is only a small visible gap between MKK/YRI/LWK and Europeans.
"... it is just that the there is a small visible gap between MKK/YRI/LWK and Europeans".Precisely: in practical terms Europeans and Africans cluster together. This is totally contradictory with what we could expect (based on all other available data) and with the legend. I suspect it must be an error, with the wrong graphs being plotted instead of the correct ones. Otherwise it doesn't make any sense.
Unfortunate that they didn't sample people of the Dinaric Alps.
The legend "explains" that the graphs are "principal component analysis (PCA) of ancestry", where the Australian sample (and only them) was not used to generate the PC space but introduced on it after that.Because a minority of Australians tested had been excluded from the PCA plots due to their obvious non-European ancestry, so using the rest of Australians (however bulk of them) in the generation of the PCs would give an unnaturalness (however little) to the PCA plots, as that would affect the non-Australian populations included in the PCA plots.
Annie Mouse
Good point Maju. It looks to me like the Eigenvector 2 scale for the Gujarati is different from the rest. I expected it to track along the line towards the Asians. Plus the range on the Eigenvector 2 scale is much too large for the rest of the populations, scrunching up the data vertically.
pp987
I realized the reason for the confusion about the PC plots. Well, the PC plots should be viewed as a 4-sided pyramid (tetrahedron), with 4 vertices. Try to visualize that. That's why Africans and Europeans look really close, but they're only close bidimensionally. If viewed in 3D, one would notice they are just 2 of the 4 vertices of an imaginary tetrahedron. For easy visualization: http://en.wikipedia.org/wiki/Tetrahedron
Cuah123
Rangel-Villalobos H et al 2008, as there's a difference between Western and Eastern Mexicans, with a % of African ancestry between both.
@pp987: You just made that up (or in any case it's clearly wrong). The PC space is a bidimensional euclidean space. In a few cases, where the PC3 axis is included in the form of color or a projection of a 3D space, then you'd be partly right (it'd actually be an octahedron or cube, depending on what you emphasize) but it's not the case here. It must be an error where a different plot has been printed instead of the one that should be there. It has no other explanation.

That said, I wouldn't mind knowing what that graph actually refers to (maybe PC3-PC4?) because it does indicate a Euro-African affinity at some level that is not so surprising to me. But it cannot be at the PC1-2 level, because we all have seen by now dozens of such global PC1-2 plots and they do not produce these results.
Given the strong association of diet and stature, I'd be suspicious in an international sample of assuming that correlations between SNPs and stature are really genetic causes. Ancestry-specific traits that happen to coincide with diet may look experimentally like a source of heritability of height.
Tuesday, June 29, 2010 10:12:00 pm
I see no reason to think that the PC plot is wrong. People forget that the axes of PCA are not set in stone but are a function of the individuals included. Sure, globally, the greater genetic distinction is between Sub-Saharan Africans and Eurasians, but you are not guaranteed to get that every time you run PCA on a dataset.

To give an intuitive example, imagine having to explain variation with a single variable (i.e. PC1) in a room with 100 Japanese, 100 Englishmen, and 10 Nigerians. Nigerians are about 50% more distant from Englishmen than Japanese are, but there are 10 times more Japanese in the room than Nigerians. So, if you used PC1 to capture the Sub-Saharan/Eurasian distinction you would not capture most of the variation that could be captured with a single variable.

Going back to the figure, I see at least two reasons why Europeans and Africans are close in the first two PCs. First of all, this plot, unlike previous ones many people are familiar with, is heavy on the East Africans. East Africans occupy an intermediate position between Yoruba and Europeans, and are thus as close to them (in a rough sense) as Mongoloids are. Second, the plot includes both Gujarati Indians, and Mexicans. Thus, you already have two populations that don't map well in a 3-pole scheme (Africans, East Asians, Europeans) as the former are Caucasoid-South Asian and the latter are European-Amerindian. In short there is no a priori reason to think that the plot is flawed. It could be, of course, but there's no rule set in stone that PCA will always produce a 3-pole African/European/East Asian in its first two components.
Wednesday, June 30, 2010 12:16:00 pm
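To illustrate the sample-composition point in the preceding comment concretely, here is a small simulated sketch (an editorial illustration, not part of the original thread; the group labels, shift sizes, and sample counts are arbitrary stand-ins). Two synthetic contrasts are built in, with the "African/non-African" one made larger; PC1 nevertheless flips to the other contrast once the African-like group is heavily under-sampled.

import numpy as np

rng = np.random.default_rng(1)

def simulate(n_eur, n_eas, n_afr, n_snps=500):
    # Toy genotype-like data: three groups separated along two independent directions.
    data = rng.normal(0, 1, (n_eur + n_eas + n_afr, n_snps))
    eas_shift = rng.normal(0, 1, n_snps) * 1.0   # Europe vs. East Asia contrast
    afr_shift = rng.normal(0, 1, n_snps) * 1.5   # Africa vs. non-Africa contrast (larger)
    data[n_eur:n_eur + n_eas] += eas_shift
    data[n_eur + n_eas:] += afr_shift
    return data

def pc1_group_means(data, n_eur, n_eas):
    centered = data - data.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    pc1 = centered @ vt[0]
    return {"EUR": round(float(pc1[:n_eur].mean()), 1),
            "EAS": round(float(pc1[n_eur:n_eur + n_eas].mean()), 1),
            "AFR": round(float(pc1[n_eur + n_eas:].mean()), 1)}

# Balanced sampling: PC1 tracks the larger African/non-African contrast.
print(pc1_group_means(simulate(100, 100, 100), 100, 100))
# African-like group under-sampled: PC1 flips to the Europe/East Asia contrast,
# and the African-like group ends up plotted near the European-like one.
print(pc1_group_means(simulate(100, 100, 10), 100, 100))

Which contrast ends up on PC1 depends on how much total variance it explains in the particular sample, not on which populations are "really" most distinct — which is essentially the point being argued in the comment above.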
Not sure if what you say, Dienekes, is correct or not. In theory it could make some sense but in practice the plot is clearly contradicted by the legend:"The major trend, Principal Component (eigenvector, PC) 1, tends to separate African from non-African population" (no, that's not what it does: it separates East Asians from the rest) "... while PC2 separate East Asian from the others" (no, that's PC1, PC 2 in that plot separates Indians from the rest). It happens the same with the PC3-4 graph: no correlation with the description at the legend. It's a clear error, sorry for raising the issue.
Wednesday, June 30, 2010 3:44:00 pm
I did say that the plot COULD be wrong. What the mixup is about remains to be seen (whether they put the wrong plot, or were sloppy with their description).

Another good example of what I am referring to is from the recent study on Qatar where the first principal component is the Asian one, while the Sub-Saharan one is on the second component. In general, even though the greatest contrast within the human species is between Sub-Saharan Africans and Eurasians, it is not always guaranteed that this contrast will be mapped on the first PC.
I totally agree that PCA, like other statistical measures, is indeed subject to distortions because of sampling bias and/or subtle factors. The case of Qatar is probably caused by the fact that Qataris, like other peninsular Arabs, have some meaningful ultra-Saharan ancestry, what makes them closer overall to Africans than to East Asians. In this case African ancestry of Qatari is the "subtle factor" and the emphasis on Qataris the "sample bias", intentional in that study. But, even then, the Global PC graph clearly shows the triangular structure that we are used to. And South Asians are represented there too, as is obvious from the scatter of the "Asian" sample between West Eurasia and East Asia. The Qatar-World PC graph is normal, reasonable... it poses no problems and fits well with what we know from other data.But this case is completely different: while the legend says normal things, the graph does not correlate neither with the legend nor the expectations.
"ultra-Saharan"

You mean sub-Saharan (as ultra=beyond)?
Thursday, July 01, 2010 12:44:00 am
Regarding the statement in the supplement: "the actual plot shows that PC1 separates East Asians from the rest and PC2 Indians (GIH) from the rest, with Europeans and African clumping together in the same corner."

I believe they meant to say: "the actual PC1 plot shows that the eigenvector 1 dimension separates East Asians from the rest and the eigenvector 2 dimension separates Indians (GIH) from the rest, with Europeans and African clumping together in the same corner."
"You mean sub-Saharan (as ultra=beyond)?"I mean ultra-Saharan as "beyond the Sahara", yes. I do not mean sub-Saharan as "under the Sahara" because no people lives there (and it's a racist term IMO). Other possible terms are trans-Saharan (like in "Transalpine Gaul" but can be confused with "trans-Saharan routes", meaning "across" instead of "beyond") and one I use often: "Tropical Africa" (but technically excludes the southernmost tip of Africa, which is subtropical in fact).
Thursday, July 01, 2010 12:58:00 pm
"I do not mean sub-Saharan as "under the Sahara" because no people lives there (and it's a racist term IMO)."

If you mean by the common and academically well established term "sub-Saharan Africa", like me and all other people I know, regions of Africa that lie further south than Sahara, of course people live there. And it isn't remotely racist (it is purely based on cartography, not anything else), also academics use it frequently. And lastly, unlike your alternative, it is direction-neutral.
Thursday, July 01, 2010 7:41:00 pm
I wouldn't like to hijack the thread on this matter so you may want to continue this branch of the discussion at Leherensuge: 'Super-Saharan Africa' article, which is a short article I wrote on the matter in 2008. I'm sure that Dienekes and other readers will thank that we divert the branch debate to somewhere else. In any case: "sub" means "under" or "inferior" and is generally used in a negative sense, like subhuman, subnormal, submissive, subordinate, etc. It is not an "academic" term but one originated in the European mass media in relation to immigration when using the word Black (the traditional term for good or bad) was perceived by some to be "racially charged", so some "genius" journalist some day got a map and, voilá!, invented a new word out of the convention of placing the south at the bottom of maps. Somehow (European media sloppiness as far as I can recall) it became mainstream in the late 80s or rather early 90s but not without some raising our eyebrows and our protests. Etymologically and geographically it's an incorrect term and IMO has a clear attitude of disrespect. I'd rather use Black Africa, sincerely.
Maju.

"Etymologically and geographically it's an incorrect term and IMO has a clear attitude of disrespect."

subarctic
subterranean
subsidiary
subconscious
subduction

Are these also terms that must be excised from the English Language?

Africans themselves use the term, as in Sub-Saharan Publishers. There can be only one up and one down and for a while, North has been up. It's a little late to change it. I don't hear the Australians complaining about being from the "land down under." Or is that also a disrespectful term? Seriously, the Africans have bigger fish to fry.
Marnie: I have already said that I don't want to continue the discussion on the term "sub-saharan" here because it's not the main subject of discussion and I'm sure that Dienekes is going to get pissed off. Suffice to say that I don't use it for the reasons explained. Anyhow I strongly suspect it's a deformation of "Sud-Saharien/-ano", a valid Romance term (like Sudamérica).
Friday, July 02, 2010 11:40:00 pm
Maju, as you don't want to further this semantic debate, I will finish it by saying that I don't share your super, hyper, over or ultra (I don't know which one is the least racist for you) sensitivity.
Saturday, July 03, 2010 2:33:00 am
e360 digest: Energy
Could California's Gridlock Generate Electricity for the Grid?

California is testing whether its heavy traffic can produce not just emissions and air pollution, but electricity. [Photo: Traffic on Interstate 80 near Berkeley, Calif.]
The state’s Energy Commission says it will spend $2 million to examine the potential of using piezoelectric crystals embedded under asphalt as a way to send the energy created by moving cars to the grid. The crystals generate energy when compressed by the weight of moving cars, but tests of the technology at larger scales have failed or been canceled in Tokyo, Italy, and Israel, according to the Associated Press. California, therefore, “needs to figure out whether it can produce high returns without costing too much,” the AP writes. If successful, the technology could help the state reach its goal to generate 50 percent of its electricity from renewable sources by 2030. California is expected to hit a 25 percent renewables target by the end of this year.
China Leads in Wind Installation, But Continues to Prioritize Coal in the Grid
China built two wind turbines every hour in 2015, double that of the U.S., according to the International Energy Agency. The country is installing enough wind to meet all of its new energy demand, more than 30,000 megawatts last year. Despite this promising development, however, the IEA told BBC News that China is giving coal-fired power plants priority access to the grid over wind, hampering the country’s pledge to get an increasing share of its electricity from renewable energy sources. “The rather rosy statement on wind energy hides the issue that 2015 and the first half of 2016 also saw record new installations of coal,” an IEA spokesman said. “China has now a clear over-supply. In the province of Gansu, 39% of wind energy had to be curtailed (turned off).”
Clinton vs. Trump: A Sharp Divide Over Energy and the Environment
Environmental and energy issues have received relatively little attention from the two major-party candidates in the 2016 U.S. presidential campaign. But when Donald Trump and Hillary Clinton have spoken out on these issues, the differences — like just about everything else about this campaign — have been stark. In a chart, Yale Environment 360 compares what Clinton and Trump have said on topics ranging from climate change to coal. See the graphic.
Costa Rica Runs on Renewable Energy For More Than Two Months Straight
Costa Rica has generated 100 percent of its electricity from renewable energy 150 days so far this year, including all of the past two months, according to the Costa Rican Institute of Electricity, the nation’s main power provider. The country’s main source of renewable energy is hydropower, which accounted for 80 percent of Costa Rica’s electricity generation in August, according to Mashable. Another massive hydroelectric power plant, the Reventazón dam, is scheduled to come online in September, further boosting the nation’s hydroelectric production. Geothermal, powered by Costa Rica’s many volcanoes, generated another 12.6 percent of electricity. Wind and solar make up roughly 7 percent of generation. Experts say Costa Rica is on track to meet, if not beat, last year’s record 299 days of 100 percent renewable energy.
Scientists Find New Way To Convert Carbon Dioxide into Energy

Scientists have discovered a way to convert greenhouse gas emissions into a fuel in a single step using a light-driven bacterium, according to new research published in the Proceedings of the National Academy of Sciences. A team of U.S. scientists, led by biochemists at Utah State University, used a modified version of the phototrophic bacterium Rhodopseudomonas palustris as a catalyst to break apart carbon dioxide and turn it into hydrogen and methane, the latter of which can be burned to generate electricity. "It's a baby step, but it's also a big step," said Utah State biochemist Lance Seefeldt, a co-author of the study. "Imagine the far-reaching benefits of large-scale capture of environmentally damaging byproducts from burning fossil fuels and converting them to alternative fuels using light, which is abundant and clean."
For China’s Massive Data Centers, A Push to Cut Energy and Water Use
China's 1.37 billion people, many of them fully connected to the Internet, use an enormous amount of energy as they email, search the Web, or stream video. [Photo: Solar panels atop a green data center in Hangzhou.]
Indeed, the Chinese government estimates that the country’s data centers alone consume more electricity than all of Hungary and Greece combined. But as Chinese technology and internet businesses look to burnish their environmental credentials and lower costs of operation, many are working to run their massive computing facilities more sustainably. Globally, tech giants such as Microsoft, Google, and Amazon are making rapid progress in this field, as they boost energy efficiency at data centers and seek to completely power their operations using renewable energy.
July Electric Car Sales in China Rose by 188 Percent Over Last Year
Chinese consumers bought 34,000 new electric cars in July, a 188 percent jump over the same period last year, according to CleanTechnica, an energy and technology news organization. The monthly total puts China on track to sell 400,000 electrical vehicles in 2016, accounting for 1.5 percent of the total auto sales market — larger than annual EV sales in Europe, or the U.S., Canada, and Mexico combined. By the end of the year, China is projected to have 700,000 electric cars on its streets; the vast majority of EV sales, 96 percent, are for Chinese-made cars, including from manufacturers BYD Auto, Zhidou, and SAIC Motor. Tesla accounts for just 2 percent of EV sales in the country, and Porsche just 1 percent.
Ending U.S. Oil Subsidies Would Have Minimal Impact, Study Says
Eliminating the U.S. government's $4 billion in annual petroleum industry subsidies would have only a minor impact on American oil and natural gas production and consumption, but would strengthen the country's influence in pushing for global action to slow climate change, according to a report by the Council on Foreign Relations. The report said that ending the three major federal petroleum subsidies would cut domestic production by 5 percent by 2030, which would increase international oil prices by just one percent. U.S. natural gas prices could go up by as much as 10 percent, and natural gas consumption and production would likely fall about 4 percent, the study found. Petroleum industry subsidies are a political flashpoint, with many Democrats arguing for their elimination and Republicans saying they are vital to U.S. energy security. But the study's author concluded that "U.S. energy security would neither increase nor decrease substantially" if the subsidies are ended.
Ukraine Looking to Turn Chernobyl Into a Massive Solar Farm
Chernobyl could soon start producing energy again — this time as a massive solar farm. [Photo: The ghost town Pripyat.] Thirty years after the meltdown of the nuclear power plant,
Ukraine is looking for investors for a 1-gigawatt solar farm in the 1,000-square-mile exclusion zone, where radiation levels remain too high for farming or forestry, reported Bloomberg. The project would cost $1.1 billion and transform Chernobyl into one of world’s largest solar installations. Government officials say that two U.S. investment firms and four Canadian energy companies have expressed interest in the project. The European Bank for Reconstruction & Development is also considering financing the solar farm. “The Chernobyl site has really good potential for renewable energy,” Ukraine’s environment minister Ostap Semerak said. “We already have high-voltage transmission lines that were previously used for the nuclear stations, the land is very cheap, and we have many people trained to work at power plants.”
Global Economy Has Reduced Its Energy Intensity By One-Third Since 1990

The global economy is becoming less energy intensive, using fewer fossil fuels to power productivity and economic growth, according to new data from the U.S. Department of Energy. [Photo: Rooftop solar panels]
Global energy intensity — a measure of energy consumption per unit of gross domestic product (GDP) — has decreased nearly one-third since 1990, the agency said. The U.S., for example, burned 5,900 British thermal units per dollar of GDP in 2015, compared to 6,600 BTUs in 2010. China burned 7,200 BTUs per dollar in 2015 versus 8,300 BTUs in 2010. The Department of Energy says the decrease is the result of the growth in low-carbon energy sources, such as wind and solar, and improved energy efficiency. “This is excellent news,” Penn State University climatologist Michael Mann told Climate Central. “The dramatic drop we are seeing in global energy intensity is a direct indication that energy efficiency measures are having a very direct impact on global carbon emissions.”
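As a quick illustration of the metric itself (an editorial sketch built only from the figures quoted above, not from the Department of Energy data files), energy intensity is simply energy consumed divided by GDP, so the quoted declines can be checked directly:

# Energy intensity is measured here in BTU per dollar of GDP.
us_intensity = {"2010": 6600, "2015": 5900}       # figures quoted above
china_intensity = {"2010": 8300, "2015": 7200}

def percent_decline(series):
    return 100 * (series["2010"] - series["2015"]) / series["2010"]

print(f"U.S. decline, 2010-2015:   {percent_decline(us_intensity):.1f}%")     # about 10.6%
print(f"China decline, 2010-2015:  {percent_decline(china_intensity):.1f}%")  # about 13.3%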
Six Years After BP Spill, Remaining Oil More Toxic Than Ever To Fish
Six years after the Deepwater Horizon drilling rig spilled nearly three million barrels of crude oil into the Gulf of Mexico, scientists have found that ultraviolet light is transforming the remaining oil into a more toxic substance that hinders the development of heart, eye, and brain function in fish. [Photo: Juvenile mahi-mahi.] The research, led by scientists at the University of California, Riverside and the University of Miami, exposed embryos and larvae of mahi-mahi from the Gulf of Mexico to what they called weathered (exposed to years of sunlight) and un-weathered oil (taken from the drilling site) from the Deepwater Horizon spill in 2010. Compared to fish exposed to un-weathered oil, the fish exposed to the weathered oil experienced impaired eye and neurological function, reduced heart rates, and a buildup of excess fluid in the heart.
Tax Credits Double Projections of Solar Growth in One U.S. Market
A new market report estimates that U.S. rooftop solar in the Mid-Atlantic region will likely increase exponentially over the next five years thanks to extended federal tax credits. The Business Energy Investment Tax Credit that Congress unexpectedly renewed last December gives homeowners and developers 30 percent back on solar panel installations and other renewable energy investments through 2019. The program could help solar installation growth in the Mid-Atlantic reach over 9,000 megawatts by 2021, doubling previous projections, according to the report, which was conducted by market research firm CreditSights. Such a jump would alleviate the need for U.S. power companies to subsidize electric needs with nuclear, natural gas, or coal during peak energy consumption periods. "If rooftop solar grows more than 30 percent, there's no reason we couldn't see electricity demand growth go negative in the coming years," Greg Jones, a New York-based analyst with CreditSights, told Bloomberg News.
29 Jun 2016:
U.S. Solar Energy Market Experiencing an Unprecedented Boom
Thanks to a renewal of federal tax credits and a continuing steep drop in the price of photovoltaic panels, U.S. solar energy production is surging to record highs. New market reports show that the U.S. solar industry is expected to install 14.5 gigawatts of solar power in 2016, nearly double the record 7.5 gigawatts installed last year. (Less than 1 gigawatt of solar power was installed in 2010.) Revenues from solar installations increased 21 percent from 2014 to 2015, surpassing $22 billion. In terms of megawatts of electricity produced, new solar installations are expected in 2016 to surpass all other new sources, including natural gas-fired power plants. The extension of a 30-percent federal tax credit and a sharp drop in prices — the wholesale price of solar panels has fallen from $4 per watt in 2008 to $0.65 per watt today — are contributing to the boom. U.S.-based Solar World is building a giant solar panel factory in Buffalo, New York that is expected to employ 3,500 people.
U.S., Canada, and Mexico to Set 50 Percent Renewable Power Goal by 2025
The United States, Canada, and Mexico will pledge on Wednesday to generate 50 percent of their electricity from non-fossil fuel sources by 2025, according to U.S. officials. The three nations are expected to set the ambitious goal at a North American Leaders Summit in Ottawa. The commitment includes not just renewable sources of power such as energy and wind, but also hydropower, nuclear power, carbon capture and storage at coal-fired power plants, and gains in energy efficiency. Under that definition, the three nations now produce 37 percent of their electricity from renewable sources. Canada is leading the way in non-fossil fuel power generation, with 59 percent of its electricity coming from hydropower and 16 percent from nuclear plants. Continent-wide cooperation on clean energy issues has improved since the election last year of Justin Trudeau as Canada’s Prime Minister.
Abandoned Coal Mines Emit As Much CO2 as a Small Power Plant
Thousands of abandoned coal mines dot the U.S. landscape, vestiges of old fossil fuel boomtowns and industrial hubs. [Photo: An abandoned coal mine in Ashland, Penn.]
But despite no longer producing coal, these sites are still contributing to climate change by leaking carbon dioxide into the atmosphere, according to a recent study by scientists at West Virginia University. The total amount of CO2 released annually by 140 abandoned sites in Pennsylvania is equal to that “of a small coal-fired power plant,” says the study, published in Environmental Earth Sciences. CO2 is created when sulfuric acid generated during the mining process interacts with carbonate rocks. It is then carried to the surface in runoff water. “Although considerable research has been conducted regarding the environmental legacy of abandoned mine lands, their role in carbon cycling is poorly [understood],” wrote the scientists. The findings “suggest that these waters may be important to carbon cycling on a regional scale.”
2015 Deadliest Year for Environmentalists on Record, Finds Report
Last year was the deadliest year on record for environmentalists, according to a new report from Global Witness, a nonprofit that tracks environmental and human rights abuses worldwide. [Photo: Indigenous people protest a dam in the Amazon.]
One hundred and eighty-five people were killed trying to stop development of land, forests, and rivers in 16 countries in 2015 — equal to more than three people per week. The tally represents a 59 percent increase over 2014, and is double the number of journalists killed in the same period, according to the report. Environmentalists were most at risk in Brazil, the Philippines, and Colombia, which had 50, 33, and 26 killings last year, respectively. "This report sheds light on the acute vulnerability of indigenous people, whose weak land rights and geographic isolation make them particularly exposed to land grabbing for natural resource exploitation," the Global Witness authors wrote. "In 2015, almost 40% of victims were indigenous."
Clean Energy Could Cost Up To 59 Percent Less by 2025, Report Finds
The cost of solar energy could drop by as much as 59 percent by 2025, from 13 cents to 6 cents per kilowatt hour, according to a new report from the International Renewable Energy Agency. [Photo: Rooftop solar panels in Hannover, Germany.]
Offshore wind could become 35 percent cheaper, and onshore wind 26 percent cheaper, by 2025. The cost of building renewable energy facilities is also likely to fall, by as much as 57 percent by the middle of next decade, the report found. “Historically, cost has been cited as one of the primary barriers to switching from fossil-based energy sources to renewable energy sources, but the narrative has now changed,” Adnan Z. Amin, director-general of IRENA, said in a statement. “To continue driving the energy transition, we must now shift policy focus to support areas that will result in even greater cost declines and thus maximize the tremendous economic opportunity at hand.”
CO2 Crosses 400 ppm For Last Time "Within Our Lifetimes," Study Warns

Atmospheric concentrations of CO2 will stay permanently above 400 parts per million (ppm) this year due to El Nino—and will likely not drop below that number again "within our lifetimes," according to a study published this week in the journal Nature. [Figure: CO2 measurements from 1958 to today.]
The milestone represents a symbolic threshold that scientists and environmentalists had long sought to avoid. Greenhouse gases have jumped 48 percent from the pre-industrial era, and 29 percent in just the past 60 years, from 315 ppm to 407 ppm today. CO2 concentrations tend to ebb and flow with the seasons, dipping as vegetation grows in summer and increasing during winter. But in the study published in Nature, scientists at the U.K.'s Met Office and University of California, San Diego warned that because of the recent El Nino, CO2 concentrations wouldn't fall below 400 ppm this year, or any year into the distant future.
Researchers Find a Way to Turn CO2 Into Rock at Iceland Power Plant
Scientists have discovered a new way to successfully capture carbon dioxide and transform it into rock deep underground. [Photo: Section of rock made from mixing CO2 and water.] The experiment, published in this week's Science,
was conducted at the Hellisheidi power plant in Iceland, the world's largest geothermal facility. When the plant — which helps power Iceland's capital, Reykjavik — pumps up volcanically heated water to turbines, gases like carbon dioxide and hydrogen sulfide often come up as well. A team of U.S. and European researchers, led by Columbia University's Lamont-Doherty Earth Observatory, captured the CO2, mixed it with the used volcanic water, and re-injected it into basalt rocks up to a half-mile underground. More than 95 percent of the mixture naturally solidified into carbonate minerals in less than two years. Previous estimates predicted that the process could take hundreds, if not thousands, of years.
More Solar Energy Jobs Exist In U.S. Than in Oil and Gas Sector

Solar energy now supports more jobs in the U.S. than either the oil and gas industry or coal mining, according to a new report
from the International Renewable Energy Agency (IRENA).
Solar jobs grew at a rate 12 times faster than general U.S. job market growth. Worldwide, employment in green energy grew 5 percent in 2015, to 8.1 million jobs, IRENA reported. The 58 percent drop in oil prices since 2014 caused many fossil fuel companies to lay off workers — more than 350,000 people worldwide since the slump began. The IRENA report says clean energy jobs could triple to 24 million by 2030 if nations follow through on the climate pledges they made in Paris last year. "This increase is being driven by declining renewable energy technology costs and enabling policy frameworks," said Adnan Amin, director-general of IRENA.
Could This Straddling Bus Help Solve China’s Air Pollution Problem?
With an estimated 20 million new drivers on the road each year, China has long struggled to control its CO2 emissions, air pollution, and traffic problems.
But a Beijing-based transit company is planning to test a new straddling bus this summer that could provide some relief, according to Chinese news agency Xinhua. The bus, which can carry up to 1,400 passengers, hovers above the road, letting smaller vehicles pass underneath. Because it operates on existing roadways, the system is much cheaper to build than underground subways, while carrying the same number of people. The idea of a straddling bus has been around since 1969, but has remained a far-fetched concept until recent years. A model of the system, designed by Transit Explore Bus, was unveiled at the International High-Tech Expo in Beijing this month. The company plans to build and test an actual straddling bus in Changzhou this summer.
World Could Warm 8 Degrees Celsius If All Fossil Fuel Reserves Burned
As nations meet in Bonn, Germany this week to hash out how to achieve the 2-degree Celsius goal they set in Paris, new research is providing policymakers a glimpse of what would happen if the world does nothing to curb climate change.
What if nations chose instead to burn through all of their remaining fossil fuel reserves, equal to 5 trillion tons of carbon emissions? According to the new study published in the journal Nature Climate Change, the world would warm an average 8 degrees Celsius (14.4 degrees F), or up to 17 degrees Celsius (30 degrees F) in the Arctic. The research was conducted by a team of climate scientists at the University of Victoria and Simon Fraser University in British Columbia who wanted to understand the worst-case scenario. "Such climate changes, if realized, would have extremely profound impacts on ecosystems, human health, agriculture, economies, and other sectors," the researchers write.
Interview: CO2 'Air Capture' Could Be Key to Slowing Global Warming
For two decades, Klaus Lackner has pioneered efforts to combat climate change by pulling carbon dioxide from the atmosphere.
Now, after years of watching the global community fail to bring greenhouse gas emissions under control, Lackner — director of the Center for Negative Carbon Emissions at Arizona State University — is delivering a blunt message: The best hope to avoid major disruptions from global warming is to launch a massive program of CO2 "air capture" that will begin to reverse the buildup of billions of tons of carbon in our atmosphere. "We need to have the ability to walk this backwards," says Lackner. "I'm saying this is a war, and we need to use all the weapons at our disposal. You don't want to get into this fight with one hand tied behind your back."
Read the interview.
Norwegian Company Building The World’s Largest Floating Wind Farm
Scotland is about to get the world's largest floating wind farm, with five 6-megawatt turbines bobbing 15 miles offshore in the North Sea. The project—which is being developed by Statoil, a Norwegian energy company—marks a shift in offshore wind technology. Most ocean-based turbines to date have been rooted to the sea floor by concrete and steel foundations, similar to oil and gas rigs. But this design limits the projects to shallower waters. Statoil's floating turbines, known as Hywind, consist of a steel cylinder filled with ballast water and stones, and are tethered to the sea floor by a series of cables. The company, which was granted a lease to build the wind farm this week, will install the Scotland project in water more than 300 feet deep. The plan is for the wind farm to start generating electricity by the end of 2017.
Fumes from Farms Are Top Source of Fine-Particle Pollution
Farms are the number one source of fine-particulate air pollution in the U.S., Europe, Russia, and China, according to new research published in the journal Geophysical Research Letters. Gases from fertilizers and livestock waste cling to emissions from cars, power plants, and factories to create solid particles less than 1/30th the width of human hair. Particles this size have been shown to penetrate deep into lungs, and cause an estimated 3.3 million deaths each year from illnesses like heart and pulmonary disease. Global climate action, however, could reduce this type of air pollution in the coming decades, says the new study, done by three Columbia University scientists. Cutbacks in energy consumption would mean that fumes from farms would have fewer emissions to which they could bond. This reduction in particulates would happen even if fertilizer use increases, the research says.
Despite Push for Renewables, Fossil Fuels Likely to Dominate in 2040
World leaders pledged last year in Paris to cut CO2 emissions and limit global warming to 2 degrees Celsius. Despite these promises, U.S. analysts said Wednesday that fossil fuels — including coal — will still likely be the world's primary source of energy in 2040. The findings are part of the U.S. Energy Information Administration's annual International Energy Outlook report. Electricity from wind, solar, and hydropower will grow 2.9 percent annually, the report concluded, and by 2040, renewables, coal, and natural gas will each generate one-third of the world's electricity. But diesel and gasoline will still power the majority of vehicles, with electric cars making up only 1 percent of the market, the report said. The report also found that carbon emissions from energy consumption in the developing world could grow 51 percent from 2012 to 2040 as countries like India and China modernize their economies, particularly by using coal.
Bringing Energy Upgrades To the Nation’s Inner Cities
America's low-income urban areas are filled with aging buildings that are notoriously energy-inefficient. It's a problem that Donnel Baird sees as an opportunity. Baird is CEO and cofounder of BlocPower, a startup that markets and finances energy-upgrade projects in financially underserved areas. Founded in 2013 with venture capital seed money, BlocPower bundles small energy-improvement projects together — from barber shops to churches — and sells them to potential investors. In an interview with Yale Environment 360, Baird describes how BlocPower's projects not only create jobs and reduce carbon emissions, but also raise awareness of global warming in inner-city communities. "It is not possible for the climate change movement to win anything significant without the participation of people of color," says Baird. Read the interview.
Alberta Wildfire Could Unlock Vast Reserves of CO2 from Permafrost
A massive wildfire raging in the heart of Canada's tar sands region has forced 88,000 people from their homes, scorched more than 1,600 buildings, and caused several fossil fuel companies to reduce operations and shut down pipelines. The fire — fueled by above-average temperatures and dry conditions linked to climate change — burned through more than 330 square miles of land in Alberta in just a few days. Now, scientists are warning the fire, and the many others like it that Canada has experienced in recent years, could unlock vast reserves of CO2 stored in the region's underlying permafrost. Fire destroys the protective layer of vegetation that keeps permafrost frozen, and warm conditions spur microbial activity, generating CO2 and methane emissions. "This is carbon that the ecosystem has not seen for thousands of years and now it's being released into the atmosphere," Merritt Turetsky, an ecosystem ecologist at the University of Guelph in Ontario, told the New Scientist.
22 Apr 2016:
Brazilian Officials Put a Hold on Mega-Dam Project in the Amazon
A proposed 8,000-megawatt hydroelectric dam in the Amazon was put on hold this week by Brazil’s environmental agency out of concerns over its impact on a local indigenous tribe. The São Luiz do Tapajós project — which would be Brazil’s second-largest dam and a cornerstone of government efforts to expand hydroelectric power — would require developers to flood an area the size of New York City and home to thousands of Munduruku people. The environmental agency, Ibama, said they were suspending the project’s licensing because of “the infeasibility of the project from the prospective of indigenous issues.” Brent Millikan, the Amazon program director for International Rivers, told Reuters, "The areas that would have been flooded include sites of important religious and cultural significance. The local communities have a huge amount of knowledge about the resources where they are — if they were forced off the land and into cities they would become unskilled workers."
Entries Invited for Third Annual Yale Environment 360 Video Contest
The third annual Yale Environment 360 Video Contest is now accepting entries. The contest honors the year's best environmental videos. Submissions must focus on an environmental issue or theme, have not been widely viewed online, and be a maximum of 15 minutes in length. Videos that are funded by an organization or company and are primarily about that organization or company are not eligible. The first-place winner will receive $2,000, and two runners-up will each receive $500. The winning entries will be posted on Yale Environment 360. The contest judges will be Yale Environment 360 editor Roger Cohn, New Yorker writer and e360 contributor Elizabeth Kolbert, and documentary filmmaker Thomas Lennon. Deadline for entries is June 10, 2016. Read More.
Rising gas prices add more volatility to DoD budget
By Jason Miller | @jmillerWFED March 9, 2012 10:09 am
The spike in gas prices is wreaking havoc, once again, on the Defense Department's budget.
DoD planned on spending about $88 for a barrel of oil in 2012, but as of Wednesday, the commodity traded at $107 a barrel. For every dollar above $88, it costs the Pentagon $31 million. Robert Hale, the DoD comptroller, said if the price of oil stays this high, DoD will have to dig even deeper to find the money.
"We do what's called a mid-year review. We'll look first at any operating accounts that are under executing. But frequently the sources come from the investment accounts," Hale said Thursday during the 2012 Pentagon Conference sponsored by Credit Suisse in Arlington, Va. "We look at unobligated balances, hopefully at systems where we don't do too much damage to the plan. But you don't want to do this if you can avoid it."
Hale said DoD spends about $17 billion a year on gas, and if gas continues to be 25 percent higher than it budgeted, the military will have a serious budget problem.
Moving to alternative fuels among highest priorities
The Pentagon placed moving to alternative fuels among its highest priorities. Last June, DoD sent its first operational energy strategy to Congress detailing how it will use more non-petroleum-based fuels. DoD accounts for 70 percent of all the energy purchased and used by the government. Among the goals DoD has laid out for the services is to cut fuel consumption by 50 percent at bases by 2013.
Each of the services is taking on the challenge to figure out how to reduce their fuel consumption. The Navy is among the leaders. Secretary Ray Mabus said the service is doing several things, including expanding its test of a hybrid engine for ships. Mabus said the USS Makin Island already has shown an electric/gas engine could work on an amphibious ship: the electric engine for speeds of 12 knots or below and gas for speeds above 12 knots. Now the service wants to test the engine more on a destroyer. Mabus said the Navy also will conduct an exercise in July where the fleet and planes will be run only on alternative energy, nuclear or biofuels.
The Marine Corps also has found success by using solar energy to cut fuel consumption at forward operating bases by 25 percent in Afghanistan. Units are also experimenting at Quantico in Virginia and 29 Palms in California with alternative fuels so the warfighter doesn't have to depend on convoys for re-supply as often. These convoys are among the most dangerous missions the military undergoes.
"When the uprising in Libya happened, the price of a barrel of oil went way up and I had only one place to go to pay for it, operational accounts, which meant less training, less time patrolling and less time meeting our mission," Mabus said.
The Navy is experimenting with solar, wind, geothermal, hydrothermal and even developing a microgrid for electricity just in case there is ever a problem with the commercial grid. Additionally, the Navy is partnering with the Energy and Agriculture departments to spend $500 million on research and development of alternative fuels.
Budget 9 percent less than planned
The move to alternative fuels still is years away, so in the short term Hale and other senior leaders said DoD would double down to find savings, not only to pay for the increased costs for energy, but because they have to. Ashton Carter, the Defense deputy secretary, said the military's budget is 9 percent less than they had planned for it to be. Carter said services have been looking and would continue to look in "every nook and cranny" for savings.
DoD has committed to saving or avoiding spending $259 billion over the next five years and $489 billion over the next 10 years. The military said a part of that $259 billion — about $60 billion — would come from efficiency savings around technology, acquisition and other back office administrative services.
Hale said each of the services has goals to reduce spending. Hale and Beth McGrath, the deputy chief management officer, oversee the process and are doing periodic reviews of military service and agency progress.
"Last year we had some things that were just plans or commitments, same thing this year," Hale said. "We've taken last year and made them specific, and we will do the same thing again. We recognize we've got to do it and I think the services are fully on board."
But he said DoD would have a harder time finding areas to cut spending as the services addressed many of the "low-hanging fruit" in the first round.
"The civilian personnel cap has been a difficult one, especially a year ago because we weren't coming down very much in end strength," Hale said, describing the challenges of further efficiencies. "Maybe it will be a little easier now that we are taking units out. We've tried to make some reductions in our contractor workforce without demonizing contractors; we have to have them. It's hard to measure them. We don't have very good data systems."
One area where he doesn't foresee further reductions beyond what was proposed in the fiscal 2013 budget request is DoD civilian and military personnel.
Squeezing more out of acquisition
Carter said acquisition continues to be one of the most obvious places to find savings, but not necessarily by cutting contracts. Carter said DoD and industry need to change their processes.
"We are looking for better value in services. There is no question about that," Carter said. "That has a lot of different dimensions, and contract type is just one. That has to do with how we do requirements, how disciplined we are, how good our people are in conceiving requirements. It has to do with recompetes and frequency thereof and quality thereof and lots of other things. We need to improve our tradecraft in the acquisition of services. We are doing better. You can see that in some of our statistics that we track. We are counting on doing better in the future. When I say we have some of that taken into account in our budget plans, we really do."
He added DoD will look for savings by using more small and medium sized businesses, trying to get rid of cumbersome bureaucracy, emphasizing exports and trying to benefit from globalization.
"Poorly performing programs will not survive in this environment," Carter said. "That makes our Better Buying initiative more important and we are continuing to press forward. We understand in a broadest kind of analogy our industry will need to make structure adjustments in view of the circumstances we jointly find ourselves. Our philosophy is this, in the main we will rely on normal market forces to make the most efficient adjustments in the defense industrial base."
Also, DoD wants agencies and vendors to sharpen their pencils when it comes to figuring out cost.
"We've got this standardized budgeting process now called 'will cost,' and we need to look at what could be the cost and what will it be based on past history," said David Van Buren, the Air Force's service acquisition executive. "But the charge Dr. Carter gave us is to do better than that and give the money back to either the taxpayer or to the service chiefs and secretaries for use in more high priority programs. The whole cost activity is working quite well in the building."
He said five of the service's major programs have or are undergoing this, including the KCX-Tanker program. In fact, Van Buren said the tanker is meeting cost, schedule and performance goals and has not required one change order in the first year.
Plans are in place for massive solar project
By Allie Krug
Updated: July 3, 2014, 1:52 pm
ST. ANSGAR, Iowa – A north Iowa town will be the future home to one of the largest solar projects in the state.
The Heartland Power Cooperative in St. Ansgar is working on bringing more than 1,000 solar panels into the area.
Jon Leerar, the CEO and general manager of the company, tells us that the panels will span 4 ½ acres of land and bring solar energy to more than 5,000 of their customers.
It’s an idea that has been talked about for a while now, and Leerar tells us it’s something the co-op members have been asking about getting involved in.
Because the technology for solar energy has been around for a couple of years, Leerar says it is a perfect time to invest.
"Solar itself has become lower in price, and it has come down in terms of generating and distributing to our members. It's really become a renewable energy that is a lot more cost-effective."
They’re hoping to have construction of the solar panels complete by late fall and that things will be up and running sometime in 2015.
Drought Tests the Rio Grande
By Laura Paskus
Apr 5, 2013
Dr. Clifford Dahm along the Middle Rio Grande
Laura Paskus
Editor's Note: This piece originally aired in April, 2013 on KUNM. The muddy waters of the Rio Grande are still flowing through Albuquerque. But New Mexico is in the grip of long-term drought and there's little water left in upstream reservoirs. That means this summer will probably be like last year—when 52 miles of the Rio Grande dried up south of Albuquerque. Laura Paskus headed out to take a look with one of the world's leading experts on desert rivers and sent us this audio postcard.
-----
We're walking along the edge of the Rio Grande about 15 miles south of Albuquerque. "We probably don't want to go walk out there then, unless you want to get up to your knees in mud." That's Clifford Dahm, a biology professor at the University of New Mexico. This stretch of the river was dry last summer and fall.
During irrigation season, much of the Rio Grande is channeled into ditches, then onto fields and yards. Irrigation season has already begun for the year, but for now, there's still water in the riverbed—and also welling up from beneath the surface. "I suspect that if we walked right out into this area, where the water is starting to show up," says Dahm, "we would find that it would liquefy under us."
The river and the groundwater beneath are connected. They're two parts of one system. Dahm points to a "backwater" – a puddle the length of a truck. "Here's an area that clearly the groundwater is seeping in, couple different locations, starting to produce a little bit of flow. It's very slow as you can see." It's fed by a trickle of water coming from beneath the ground. The puddle isn't muddy like the river. Instead, the water is clear. Like a bathtub ring, green algae circles the edge of the puddle and provides food for bugs and fish.
The groundwater, river water, algae, fish, birds: everything is connected. Including the trees towering above the bank. "Most of the large cottonwoods that you see along the Rio Grande come from some major floods that occurred in 1921 and in 1941," says Dahm. "So that when people have attempted to date these trees, they find that many of them are 70 or 90 years old."
Those floods last century sent water churning down the river and across the floodplain. They ran probably 20,000 cubic feet per second. That couldn't happen today. Today, a big flood might run a quarter of that or about 6,000 cubic feet per second. That's because today the river's water is moved downstream from one reservoir to the next. And, we're in a drought.
"The current drought that we've been in is pretty remarkable," says Dahm, "both in terms of the lack of water we've had for the past two years and for the temperatures that have been associated."
These days, flows like the ones that spawned the cottonwoods are more like wishful thinking: This spring the Rio Grande through Albuquerque's been running around 400 - 500 cubic feet per second. Dahm says this meager flow is serious. "Our water supplies have been very much stressed, and larger and larger parts of the river are now without water—and they're without water for longer periods of time."
People still talk about the 1950s drought. But it's just as dry today as it was back then. Dahm says the river ecosystem is adaptive. But just how adaptive is a question that lingers.
NASA Spacecraft Heads for Polar Region on Mars
A Delta II rocket lit up the early morning sky over Cape Canaveral Air Force Station in Florida as it carried the Phoenix spacecraft on the first leg of its journey to Mars. The powerful three-stage rocket with nine solid rocket motors lifted off at 5:26 a.m. EDT. Image Credit: NASA
CAPE CANAVERAL, Fla. - NASA's Phoenix Mars Mission blasted off Saturday, aiming for a May 25, 2008, arrival at the Red Planet and a close-up examination of the surface of the northern polar region.
Perched atop a Delta II rocket, the spacecraft left Cape Canaveral Air Force Base at 5:26 a.m. Eastern Time into the predawn sky above Florida's Atlantic coast.
"Today's launch is the first step in the long journey to the surface of Mars. We certainly are excited about launching, but we still are concerned about our actual landing, the most difficult step of this mission," said Phoenix Principal Investigator Peter Smith of the University of Arizona's Lunar and Planetary Laboratory, Tucson.
The spacecraft established communications with its ground team via the Goldstone, Calif., antenna station of NASA's Deep Space Network at 7:02 a.m. Eastern Time, after separating from the third stage of the launch vehicle. "The launch team did a spectacular job getting us on the way," said Barry Goldstein, Phoenix project manager at NASA's Jet Propulsion Laboratory, Pasadena, Calif. "Our trajectory is still being evaluated in detail; however we are well within expected limits for a successful journey to the red planet. We are all thrilled!"
Phoenix will be the first mission to touch water-ice on Mars. Its robotic arm will dig to an icy layer believed to lie just beneath the surface. The mission will study the history of the water in the ice, monitor weather of the polar region, and investigate whether the subsurface environment in the far-northern plains of Mars has ever been favorable for sustaining microbial life. "Water is central to every type of study we will conduct on Mars," Smith said.
The Phoenix Mars Mission is the first of NASA's competitively proposed and selected Mars Scout missions, supplementing the agency's core Mars Exploration Program, whose theme is "follow the water." The University of Arizona was selected to lead the mission in August 2003 and is the first public university to lead a Mars exploration mission.
Phoenix uses the main body of a lander originally made for a 2001 mission that was cancelled before launch. "During the past year we have run Phoenix through a rigorous testing regimen," said Ed Sedivy, Phoenix spacecraft program manager for Lockheed Martin Space Systems, Denver, which built the spacecraft. "The testing approach runs the spacecraft and integrated instruments through actual mission sequences, allowing us to assess the entire system through the life of the mission while here on Earth." Samples of soil and ice collected by the lander's robotic arm will be analyzed by instruments mounted on the deck. One key instrument will check for water and carbon-containing compounds by heating soil samples in tiny ovens and examining the vapors that are given off. Another will test soil samples by adding water and analyzing the dissolution products. Cameras and microscopes will provide information on scales spanning 10 powers of 10, from features that could fit by the hundreds into a period at the end of a sentence to an aerial view taken during descent. A weather station will provide information about atmospheric processes in the arctic region.
The Phoenix mission is led by Smith, with project management at JPL and development partnership at Lockheed Martin, Denver. The NASA Launch Services Program at Kennedy Space Center and the United Launch Alliance are responsible for the Delta II launch service. International contributions are provided by the Canadian Space Agency, the University of Neuchatel (Switzerland), the University of Copenhagen (Denmark), the Max Planck Institute (Germany) and the Finnish Meteorological Institute. JPL is a division of the California Institute of Technology in Pasadena. Additional information on Phoenix is available online at: http://www.nasa.gov/phoenix. ###
Guy Webster
Jet Propulsion Laboratory, Pasadena, Calif. 818-354-5011
[email protected]
George Diller
Kennedy Space Center, Florida 321-867-2468
[email protected]
Sara Hammond
University of Arizona, Tucson
[email protected]
NEWS RELEASE: 2007-086
Detection of apple juices and cereals which exceed permitted levels of mycotoxins
June 7, 2013
Researchers from the University of Granada (Spain) have analysed the presence of patulin, a type of toxin produced by fungi, in several commercial apple juices. The results show that more than 50% of the samples analysed exceed the maximum limits laid down by law. They have also discovered a sample of rice with more mycotoxins than permitted. For their part, researchers from the University of Valencia have also found these harmful substances in beers, cereals and products made from them, such as gofio flour.
They are not very well known, but mycotoxins top the list of the most widespread natural contaminants in foodstuffs at the global level. They are toxic and carcinogenic substances produced by fungi, which reach the food chain through plants and their fruit. Now new analytical techniques developed in universities such as Granada and Valencia (Spain) show that some foodstuffs exceed permitted levels of these harmful compounds.
Researchers from the University of Granada (UGR) have used their own method of 'microextraction and capillary electrophoresis' to analyse concentrations of a kind of mycotoxins, patulin, in 19 batches of eight brands of commercial apple juice. They differentiated between conventional juice, organic juice and juice designed specifically for children.
"The results show that more than 50% of the samples analysed exceeded the maximum contents laid down by European law," as explained to SINC by Monsalud del Olmo, co-author of the study, which is published this month in the magazine Food Control.
The maximum levels of patulin established by the EU are 50 micrograms per kilogram of product (μg/kg) for fruit juices and nectars, 25 μg/kg for compotes and other solid apple products and 10 μg/kg if those foodstuffs are aimed at breast-fed babies and young children.
However, some samples of conventional apple juices had as much as 114.4 μg/kg, and one batch labelled as baby food had 162.2 μg/kg, more than 15 times the legal limit.
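As a rough illustration of how this kind of compliance check works, the snippet below compares the reported patulin concentrations with the EU limits cited above. The sample values and product categories are taken from this article; the code itself is only a sketch and is not the analytical method the researchers used.

# Compare reported patulin levels (micrograms per kg) against the EU limits
# cited in this article. Illustrative sketch only, not the researchers' method.
EU_LIMITS = {
    "juice": 50.0,        # fruit juices and nectars
    "solid_apple": 25.0,  # compotes and other solid apple products
    "baby_food": 10.0,    # foods for infants and young children
}

samples = [
    {"name": "conventional apple juice", "category": "juice", "patulin": 114.4},
    {"name": "juice labelled as baby food", "category": "baby_food", "patulin": 162.2},
]

for s in samples:
    limit = EU_LIMITS[s["category"]]
    ratio = s["patulin"] / limit
    verdict = "exceeds the limit" if s["patulin"] > limit else "within the limit"
    print(f"{s['name']}: {s['patulin']} vs {limit} ug/kg -> {verdict} ({ratio:.1f}x)")
# The baby-food sample comes out at about 16 times the 10 ug/kg limit,
# matching the "more than 15 times" figure quoted above.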
Patulin is produced by several species of fungi of the genera Penicillium, Aspergillus and Byssochlamys, which are found naturally in fruit, mainly apples. It is transferred to juices during processing because of its solubility in water and stability. The neurotoxic, immunotoxic and mutagenic effects of this substance have been confirmed in animal models. "Even then, it is not one of the most dangerous mycotoxins for health and it is included in group 3 within the categories laid down by the International Agency for Research on Cancer (IARC)," Monsalud del Olmo pointed out.
This WHO agency classifies mycotoxins and other compounds in four groups according to their carcinogenic potential for humans: 1 (carcinogenic), 2 (probably or possibly carcinogenic), 3 (not classifiable as carcinogenic, although it has not been proven that it is not) and 4 (probably not carcinogenic).
Some mycotoxins, such as aflatoxins, are in group 1 and can be found in nuts, such as peanuts and pistachios, and in cereals. UGR scientists have also detected concentrations of this compound above the permitted levels in a sample of rice, and they have already informed the relevant authorities of this.
Other toxins from fungi, such as fumonisins and ochratoxins, are also included in group 2. They are found in maize, other cereals and even beer, as researchers from the University of Valencia (UV) have proven.
Mycotoxins in beer
A team from that university has used a new technique – called HPLC-LTQ-Orbitrap – to detect the presence of fumonisins and ochratoxins in samples of beer in Germany, Belgium, the Czech Republic, Italy, Ireland, Poland and Spain. The study is also published in 'Food Control'.
"They are minute quantities, although we cannot determine whether they are important because beer is one of the drinks which is not directly included in European law on mycotoxins," said Josep Rubert, UV researcher and co-author of the study.
"What this study does show is that merely controlling the raw material – barley, in this case – is not enough," added Rubert, "and that these toxins are present throughout the technological process, where it has been proven that mycotoxins that are legislated for can become hidden by joining with glucose, so this needs to be taken into account for future research".
The same Valencian team has also analysed 1,250 samples of cereal-based products from Spain, France and Germany to see whether there are differences between organic and conventional foodstuffs in the case of fumonisins.
One of the most striking findings is that samples of gofio flour, commonly used in the Canaries, had concentrations of this mycotoxin in quantities greater than 1000 μg/kg, the limit established by European law. A couple of years ago, those researchers also identified a consignment of wheat flour with concentrations of ochratoxin above the permitted level.
When the limits laid down by the EU are exceeded, scientists inform the relevant authorities, especially the European Food Safety Authority (EFSA). Then the contaminated batch must be withdrawn.
The results of the study of cereal-based foodstuffs show that almost 11% of the organic products examined contain fumonisins, whereas in conventional products this percentage is reduced to around 3.5%. This data has been published in the magazine Food and Chemical Toxicology.
"The explanation could be that organic foodstuffs do not contain fungicides or other pesticides, so fungi may have a more favourable environment and increase their toxins. However, in any case, there are other important factors such as climatic conditions – heat and humidity benefit these microorganisms – and storage conditions which also influence the production of mycotoxins," said Rubert, who recognises that analysis must be done on a case-by-case basis.
In fact, in the study of apple juices, the opposite happened, and the organic products had fewer mycotoxins than the conventional ones. What the researchers do agree on is the need to keep defining the toxicity of each of these harmful substances, studying their effects on health and developing more and more exact methods of analysis.
Victor-Ortega, M. et al. Evaluation of dispersive liquid-liquid microextraction for the determination of patulin in apple juices using micellar electrokinetic capillary chromatography, Food Control 31: 353-358, 2013.
Rubert, J. et al. Mass spectrometry strategies for mycotoxins analysis in European beers, Food Control 30 (1): 122–128, 2013.
Rubert, J. et al. Occurrence of fumonisins in organic and conventional cereal-based products commercialized in France, Germany and Spain, Food and Chemical Toxicology, 2013.
Plataforma SINC
Researchers: 150-year-old technology could provide ‘clean’ coal solution
Written by Kari Lydersen | 06/16/2016
NOTE TO READERS: This story has been updated to remove information provided by a source who had misrepresented his credentials to Midwest Energy News and other organizations.
As coal advocates seek to keep their industry viable amid tighter restrictions on carbon emissions, some say a new spin on a 150-year-old technology might hold the solution.
Doug Gagnon evaluates new technologies for the American Coal Council and also for a company that licenses air pollution control equipment and then provides it to customers. He thinks that a company called Carbon Conversion International (CCI) could have a promising solution, especially after recent tests of a pilot project at an independent power plant in northwest Pennsylvania.
CCI’s technology passes emissions through a non-thermal plasma field to extract carbon dioxide, carbon monoxide, sulfur dioxide and nitrogen oxide. The emissions could come from coal-burning power plants, diesel engines or other fossil fuel-burning operations. What is left of the effluent (or emissions) are harmless elemental gases that can be run back through the engine or generator, boosting its efficiency.
Meanwhile the carbon is concentrated into its nearly elemental form, known as carbon black, which is sold on the market where it is used for tires, rubber, plastics, printing inks and other applications.
Company officials describe the technology as a simple solution that has long been studied and considered possible in theory, but until recently took so much energy that it was not profitable. Typically, extracting CO2 or other pollutants would significantly reduce the overall output of the power plant. Generally this "parasitic load" – or energy use for pollution controls as a percent of total energy generation – would be roughly 30 to 50 percent.
Test results from pilot projects have shown CCI’s technology can operate with a parasitic load around just 4-5 percent.
Gagnon said that he is impressed by CCI’s test results and sees the technology as a promising way to reduce carbon emissions from coal plants and also from natural gas plants.
“I have reviewed the test data and the calculations are indeed correct,” said Gagnon, noting that he still plans to review more data himself. “When I look at the people who actually did the [independent testing] work, it is clear that they are highly respected, experienced and very reputable.”
Gagnon noted that amine scrubbers typically used to remove carbon from emissions and other types of pollution control equipment might have a 37 percent parasitic load.
“So, if a 100 MW plant needs to treat 10 percent of its flue gas to bring its carbon dioxide down into compliance, then without even considering debt service, maintenance or manpower costs, the 37 percent parasitic load for the portion of the flue gas being treated will add 3.8 percent to the power costs,” Gagnon explained. Under the new Clean Power Plan, carbon limits could mean a 15 percent increase in the price of power using amine scrubbers, Gagnon said.
“The prospect of a carbon dioxide scrubber that consumes 37 percent of the energy load will put a lot of coal plants out of business,” he said. By contrast, “the parasitic load for CCI’s technology is about 10 percent of what the other technology would be. In the example above, if we just consider parasitic load, this would be a 0.4 percent [cost] increase for 10 percent of the flue gas and approximately a 1.5 percent increase if 40 percent of the flue gas had to be treated. This is really very special technology.”
Non-thermal plasma refers to a gas in which an electrical discharge energizes the electrons to the kinds of energies normally associated with very high temperatures, while the bulk gas molecules themselves remain near room temperature.
A 2005 paper published by the U.S. Environmental Protection Agency describes non-thermal plasma and how it has been used to extract pollutants from air streams. It noted that regulations on mercury pollution and on vehicle emissions had sparked renewed interest in the potential of non-thermal plasma.
“Non-thermal plasma has been around a long time,” the paper noted. “It was observed in a laboratory over a hundred and fifty years ago. It enjoyed original success for many years by making ozone from air and water. It was researched to death, but commercially, it remained mainly a laboratory curiosity.”
The key to CCI’s success was developing a process that is economically viable.
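To make the cost arithmetic in Gagnon's examples above explicit, here is a minimal sketch that models the added generating cost as the fraction of flue gas treated multiplied by the parasitic load of the capture equipment, ignoring debt service, maintenance and manpower just as the quotes do. It is an illustration of the reasoning quoted in this story, not vendor data or a complete cost model.

# Rough cost-impact model implied by the quotes above:
#   added_cost_fraction = (share of flue gas treated) * (parasitic load of the scrubber)
# Debt service, maintenance and manpower are ignored. Illustrative only.
def added_cost(treated_fraction, parasitic_load):
    """Fraction by which generating costs rise for a given capture setup."""
    return treated_fraction * parasitic_load

AMINE_LOAD = 0.37   # conventional amine scrubber, per Gagnon
CCI_LOAD = 0.037    # roughly one-tenth of that, per the pilot figures

for share in (0.10, 0.40):
    amine = added_cost(share, AMINE_LOAD)
    cci = added_cost(share, CCI_LOAD)
    print(f"Treating {share:.0%} of flue gas: amine ~{amine:.1%} cost increase, CCI ~{cci:.1%}")
# Treating 10 percent of the flue gas gives roughly a 3.7 percent increase with an
# amine scrubber versus about 0.4 percent with the lower load; treating 40 percent
# gives roughly 14.8 percent versus about 1.5 percent -- in line with the figures
# quoted in the story, with small differences due to rounding.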
In a pilot at the Pennsylvania plant, testing by a third party showed CCI’s technology reduced carbon dioxide emissions by 91 percent, nitrogen oxide emissions by 86 percent, sulfur dioxide by 88 percent, and carbon monoxide by 75 percent, with a parasitic draw of just 3.7 percent.
A previous third-party test in Pennsylvania had shown positive results in reducing emissions and increasing efficiency in a Caterpillar wheel loader of the type used on construction sites. The U.S. EPA has mandated that such equipment reduce their emissions. That test showed the wheel loader had carbon dioxide emissions reductions of 50-100 percent, nitrogen oxide reductions of 79-83 percent and carbon monoxide reductions of 84-92 percent.
The company says that along with construction equipment and power plants, the technology can be used in many other commercial and industrial applications. The technology was also tested at a water treatment facility in Wilmington, Delaware.
Though the technology would still need to be scaled up for large commercial applications, company officials say indications point to an “attractive return on investment compared with other current technologies.”
The CCI team commented on their achievement: “We are extremely excited at the results we have achieved after years of development. This is truly breakthrough technology that will have a positive economic impact on the power generation industry. As we continue to expose the technology to industry experts, it becomes clear that it will be applicable beyond generation in many other current processes and industries.”
8 thoughts on “Researchers: 150-year-old technology could provide ‘clean’ coal solution” martin gugino on 06/16/2016 at 11:39 am said:
Gak. What exactly happens to the carbon? Does it come out as carbon dust?
Ken Paulman on 06/16/2016 at 1:54 pm said:
It’s in the story – it’s recaptured as carbon black, which is used in a variety of products.
moe mongo on 06/16/2016 at 12:40 pm said:
nu such thing as clean coal, look at ashes, air, mt top removal and you tell how clean it is
Rex Havoc on 06/16/2016 at 2:10 pm said:
No, no ash, no dirty air, you must be looking at 1950s pictures. I live near a 3 unit coal fired power plant that has NO visible ash or smoke, only water vapor. We are known for our amazing clear skies and gorgeous mountains and large quantities of animal life, both small and big game. Stop believing everything you are told you should believe and do your own research.
Daniel Mitchell on 06/16/2016 at 4:51 pm said:
I believe he was referring to the coal ash left after the coal is burned. This is the same stuff that they recently had that accident with in one of the Carolinas.
CleanLung on 06/16/2016 at 4:36 pm said:
As irrelevant as clean whale oil. There’s plenty of sun to power everything.
Richard Cohen on 06/16/2016 at 5:18 pm said:
This process appears to violate the First and Second Laws of Thermodynamics. When you burn coal (carbon), you combine the carbon with oxygen, making carbon dioxide. This process releases a large amount of energy. To split the carbon dioxide apart into carbon and oxygen, as this article suggests this “process” does, requires an even GREATER amount of energy to be put back in. Thus the whole “process” consumes, rather than produces, energy. for this reason, the article is a hoax.
John Bradley on 06/16/2016 at 8:17 pm said:
Hopefully it will put some coal miners back to work. I have mined the past 8yrs and im now going back to college to be a RN. Many of the men I have worked with are having to leave WV to feed there familys. The school system here had to lay off tons of teachers no money coming in to pay them. Our county trash dump shut down due to funding now many are dumping there trash in our beautiful mountains, real win for the enviromentalists. WV has fed your powerplants for generations now you want to turn your back on the men going in to the depths to feed there familys and mine the product that fires the power plants and makes our steel.
Page last updated at 16:03 GMT, Saturday, 6 December 2008
Sci-fi 'creator' Ackerman dies
Forrest Ackerman's love affair with sci-fi began when he was a small boy
Forrest Ackerman, a writer and editor credited with discovering the author Ray Bradbury and coining the term "sci-fi", has died, aged 92.
Ackerman died of heart failure at his home in Los Angeles, said a spokesman.
Ackerman's achievements included founding the sci-fi pulp magazine Famous Monsters of Filmland.
But he is probably best known for finding Bradbury, author of The Martian Chronicles, when looking for people to join a sci-fi club he was starting up.
Ackerman was also the owner of a huge private collection of science-fiction movie and literary memorabilia.
"He became the Pied Piper, the spiritual leader, of everything science fiction, fantasy and horror," said Kevin Burns, trustee of Ackerman's estate.
After finding the then teenage Bradbury, Ackerman went on to give him the money to start his own science-fiction magazine Futuria Fantasia.
'Never catch on'
He also paid for Bradbury to go to New York for a writers' meeting that the author said helped launch his career.
"I hadn't published yet, and I met a lot of these people who encouraged me and helped me get my career started, and that was all because of Forry Ackerman," Bradbury told the Associated Press news agency in 2005.
As a literary agent, Ackerman represented Bradbury, Isaac Asimov and numerous other science-fiction writers.
He said the term "sci-fi" came to him in 1954 when he was listening to a car radio and heard an announcer mention the word "hi-fi."
"My dear wife said, 'Forget it, Forry, it will never catch on,"' he said.
He began using the term in his magazine Famous Monsters of Filmland, which he helped create in 1958 and edited for 25 years.
Ackerman also appeared in many films including Queen of Blood, Dracula Vs Frankenstein and Amazon Women on the Moon, to name but a few.
Ackerman once said he fell in love with science fiction when he was nine years old and saw a magazine called Amazing Stories, which he kept for the rest of his life.
Last Updated: Thursday, 3 August 2006, 20:50 GMT 21:50 UK
Strange 'twin' new worlds found
The planemo twins: Two peculiar planet-like worlds
A pair of strange new worlds that blur the boundaries between planets and stars have been discovered beyond our Solar System.
A few dozen such objects have been identified in recent years but this is the first set of "twins".
Dubbed "planemos", they circle each other rather than orbiting a star.
Their existence challenges current theories about the formation of planets and stars, astronomers report in the journal Science.
"This is a truly remarkable pair of twins - each having only about 1% the mass of our Sun," said Ray Jayawardhana of the University of Toronto, co-author of the Science paper.
"Its mere existence is a surprise, and its origin and fate a bit of a mystery."
'Double planet'
The pair belongs to what some astronomers believe is a new class of planet-like objects floating through space; so-called planetary mass objects, or "planemos", which are not bound to stars.
They appear to have been forged from a contracting gas cloud, in a similar way to stars, but are much too cool to be true stars.
And while they have similar masses to many of the giant planets discovered beyond our Solar System (the larger weighs in at about 14 times the mass of Jupiter and the other at about seven times Jupiter's mass), they are not thought to be true planets either.
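A quick unit conversion shows how those Jupiter-mass figures square with the earlier description of the pair as having only about 1% the mass of our Sun. The sketch below assumes the standard ratio of roughly 1,048 Jupiter masses to one solar mass; it is a back-of-the-envelope check, not part of the published analysis.

# Back-of-the-envelope check: express the two objects' masses as a fraction of
# the Sun's mass, assuming ~1048 Jupiter masses per solar mass. Illustrative only.
JUPITER_MASSES_PER_SUN = 1048.0  # standard approximate ratio

def solar_fraction(jupiter_masses):
    return jupiter_masses / JUPITER_MASSES_PER_SUN

for label, m_jup in (("larger companion", 14.0), ("smaller companion", 7.0)):
    print(f"{label}: {m_jup:.0f} Jupiter masses ~ {solar_fraction(m_jup):.1%} of the Sun's mass")
# About 1.3% and 0.7% of a solar mass -- consistent with "about 1% the mass of
# our Sun," and well below the roughly 7-8% of a solar mass usually cited as the
# threshold for sustained hydrogen fusion in a true star.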
"We are resisting the temptation to call it a 'double planet' because this pair probably didn't form the way that planets in our Solar System did," said co-researcher Valentin Ivanov of the European Southern Observatory (ESO) in Santiago, Chile.
'Amazing diversity'
The two objects have similar spectra and colours, suggesting that they formed at the same time about a million years ago.
They are separated by about six times the distance between the Sun and Pluto, and can be found in the Ophiuchus star-forming region some 400 light years away. They go under the official name Oph 162225-240515, or Oph 1622 for short.
"Recent discoveries have revealed an amazing diversity of worlds out there," said Dr Jayawardhana. "Still, the Oph 1622 pair stands out as one of the most intriguing, if not peculiar."
His colleague, Dr Ivanov, said they were curious to find out whether such pairs are common or rare. "The answer could shed light on how free-floating planetary-mass objects form," he added. Oph 1622 was discovered using the ESO's New Technology Telescope at La Silla, Chile. Follow-up studies were conducted with the ESO's Very Large Telescope.
Dinosaurs Ten Million Years Older Than Thought
The oldest known dinosaur relative, Asilisaurus kongwe, appears with an early sail-backed dinosaur in an artist's rendering.
Illustration courtesy Marlene Hill Donnelly, Field Museum
A new dinosaur relative found in Tanzania is the oldest known creature of its kind—a discovery that pushes back the origin of dinosaurs by at least ten million years, paleontologists say.
Dubbed Asilisaurus kongwe, the Labrador retriever-size creature was a silesaur, a member of the group that is the closest relative of true dinosaurs. The newfound animal lived 243 million years ago, during the middle Triassic period.
Since silesaurs and true dinosaurs diverged from a common ancestor, the two groups should have existed during the same time frame. But the oldest known true dinosaurs date back to just 230 million years ago.
Finding a silesaur that's ten million years older makes "a big difference," said study co-author Christian Sidor, curator of vertebrate paleontology at the Burke Museum of Natural History and Culture in Seattle.
Asilisaurus's age suggests that some early forms of dinosaurs must have also been plodding around in the middle Triassic.
"When people think about dinosaurs, we think about the extreme forms, the ones that have gone off in their ... own weird directions," such as Stegosaurus and Tyrannosaurus rex, said Thomas R. Holtz, Jr., a vertebrate paleontologist at the University of Maryland, who was not involved in the research.
"But they all came from a common ancestor, and fellows like Asilisaurus help us understand what that original dino state looked like."
Early Dinosaur Relative an Omnivore?
Parts of at least 12 Asilisaurus skeletons were found in 2007 in southern Tanzania's Ruhuhu Valley. With no intact specimen to study, Sidor and his team had to piece together a composite skeleton.
What emerged looked nothing like what paleontologists had imagined.
Instead of resembling the "typical hatchet-headed, blade-toothed meat-eaters," Asilisaurus was a light, slender-limbed animal with peg-like teeth and a small beak-like structure on its jaw, the University of Maryland's Holtz said.
The shape of the beak suggests Asilisaurus tore into tissue, which means the animal might have eaten plants—or both plants and meat, the researchers say.
Mysteriously, the long-tailed animal ran on all fours—even though the vast majority of early dinosaurs were two-legged carnivores, according to the study, to be published tomorrow in the journal Nature.
"It's making the picture a little bit murkier, because we have a possible herbivore and quadruped very close to the dinosaur lineage," said study co-author Sidor, who received funding for the research from the National Geographic Society's Committee for Research and Exploration. (The National Geographic Society owns National Geographic News.)
Dinosaurs' World in Transition
In a way, Asilisaurus's discovery "is an elegant fulfillment of a prediction," said Christopher Brochu, a vertebrate paleontologist at the University of Iowa who also wasn't part of the study group.
That's because paleontologists know dinosaurs belong to the archosaurs. Within this group, dinosaur ancestors are divided into two main branches: a line that includes silesaurs and shares skeletal features with modern birds, and a line that has more in common with crocodilians.
Since a crocodilian dinosaur-ancestor from the middle Triassic had already been found, unearthing an animal like Asilisaurus was just a matter of time.
But the new research goes beyond that, Brochu said. "It's part of a larger, growing realization that the earliest archosaurs were far more diverse than we ever thought."
In addition to finding Asilisaurus, study leader Sidor and colleagues have already collected several other archosaur species from the Ruhuhu Valley—many more than in other archosaur hot spots.
It's unknown why the lush, wet valleys of prehistoric Tanzania produced so many strains of dinosaur relatives, paleontologists say.
Whatever the reason, "what we're seeing here is a picture of a world in transition," Brochu said. "It's really a fascinating time in the history of life."
1.7 Billion Contactless Tickets Set to Ship in 2018 as Countries Address Interoperability Issues and Competitive Barriers, Says ABI Research
ABI Research has forecast that a combination of memory and microcontroller smart cards alongside disposable ticketing solutions will reach shipments totaling 1.7 billion units in 2018. The primary drivers include the increasing move to national standards and the enablement of NFC and open-loop payments. This trend is a consistent feature across all continents with particular progress within the UK, U.S., Australia, Germany, Turkey, and the BRIC countries forming a growth engine for future contactless ticketing adoption.
Growth is particularly strong in China, India, and Brazil where accelerating contactless ticketing programs within flagship cities can be found. In 2013 these three countries accounted for 21% of all smart contactless ticketing cards shipped worldwide. As these projects evolve ABI Research expects to see further expansion across other cities and the enablement of open-loop and NFC acceptance.
Looking further forward over the next three years the introduction of tri-readers could open up market competition. The UK and Australia are two countries known to be introducing tri-readers into their transit infrastructure, a clear indication that transport authorities are moving ever closer to the acceptance of open-loop and NFC credentials. Tri-readers allow the reading of existing closed-loop protocols as well as those supported by ISO 14443.
Research Analyst Sealy comments, “I expect to see a more competitive market from both a protocol and solution acceptance perspective. Several countries are working to create multi-protocol acceptance solutions with transport authorities able to incorporate different types of payments and ticket-types within a single system. New and upgraded systems being introduced in countries such as the U.S. and China, where we are tracking the adoption of contactless bank cards, NFC, and mobile ticketing, will be particularly interesting. Combined, these new technologies could disrupt the current ecosystem, increasing competition and introducing new players to threaten the current status quo.”
These findings are part of ABI Research’s Transportation and Ticketing Technologies Research Service (https://www.abiresearch.com/research/service/transportation-ticketing-technologies/).
ABI Research provides in-depth analysis and quantitative forecasting of trends in global connectivity and other emerging technologies. From offices in North America, Europe and Asia, ABI Research's worldwide team of experts advises thousands of decision makers through 70+ research and advisory services. Est. 1990. For more information visit www.abiresearch.com, or call +1.516.624.2500.
Asteroid Hunters Want to Launch Private Telescope
By Alicia Chang, Associated Press
Thursday Jun 28, 2012
LOS ANGELES (AP) - Who will protect us from a killer asteroid? A team of ex-NASA astronauts and scientists thinks it's up to them.
In a bold plan unveiled Thursday, the group wants to launch its own space telescope to spot and track small and mid-sized space rocks capable of wiping out a city or continent. With that information, they could sound early warnings if a rogue asteroid appeared headed toward our planet.
So far, the idea from the B612 Foundation is on paper only. Such an effort would cost upward of several hundred million dollars, and the group plans to start fundraising. Behind the nonprofit are a space shuttle astronaut, Apollo 9 astronaut, former Mars czar, deep space mission manager along with other non-NASA types.
Asteroids are leftovers from the formation of the solar system some 4.5 billion years ago. Most reside in the asteroid belt between Mars and Jupiter but some get nudged into Earth's neighborhood.
NASA and a network of astronomers routinely scan the skies for these near-Earth objects. And they've found 90 percent of the biggest threats - asteroids at least two-thirds of a mile across that are considered major killers. Scientists believe it was a six-mile-wide asteroid that wiped out the dinosaurs.
But the group thinks more attention should be paid to the estimated half a million smaller asteroids - similar in size to the one that exploded over Siberia in 1908 and leveled more than 800 square miles of forest.
"We know these objects are out there and we can do something to prevent them" from hitting Earth, said former Apollo 9 astronaut Rusty Schweickart, who helped establish the foundation a decade ago.
Asteroids are getting attention lately. NASA nixed a return to the moon in favor of a manned landing on an asteroid. Last month, Planetary Resources Inc., a company founded by space entrepreneurs, announced plans to extract precious metals from asteroids within a decade.
Since its birth, the Mountain View, Calif.-based B612 Foundation - named after the home asteroid of the Earth-visiting prince in Antoine de Saint-Exupery's "The Little Prince" - has focused on finding ways to deflect an incoming asteroid. Ideas studied range from sending an intercepting spacecraft to aiming a nuclear bomb, but none have been tested.
Last year, the group shifted focus to seek out asteroids with a telescope. It is working with Ball Aerospace & Technologies Corp., which has drawn up a preliminary telescope design. The contractor developed NASA's Kepler telescope that hunts for exoplanets and built the instruments aboard the Hubble Space Telescope.
Under the proposal, the asteroid-hunting Sentinel Space Telescope will operate for at least 5 1/2 years. It will orbit around the sun, near the orbit of Venus, or between 30 million and 170 million miles away from Earth. Data will be beamed back through NASA's antenna network under a deal with the space agency.
Launch is targeted for 2017 or 2018. The group is angling to fly aboard SpaceX's Falcon 9 rocket, which made history last month by lifting a cargo capsule to the International Space Station.
Experts said the telescope's vantage point would allow it to spy asteroids faster than ground-based telescopes and accelerate new discoveries. NASA explored doing such a mission in the past but never moved forward because of the expense.
"It's always best to find these things quickly and track them. There might be one with our name on it," said Don Yeomans, who heads the Near-Earth Object Program at NASA's Jet Propulsion Laboratory, which monitors potentially dangerous space rocks.
Aside from the technological challenges, the big question is whether philanthropists will open up their wallets to support the project. Nine years ago, the cost was estimated at $500 million, said Tim Spahr, director of the Minor Planet Center at Harvard University who was part of the team that came up with the figure for NASA.
Spahr questions whether enough can be raised given the economy. "This is a hard time," he said.
The group has received seed money - several hundreds of thousands of dollars - from venture capitalists and Silicon Valley outfits to create a team of experts. Foundation chairman Ed Lu said he was confident donors will step up and noted that some of the world's most powerful telescopes including the Lick and Palomar observatories in California were built with private money.
"We're not all about doom and gloom," said the former shuttle astronaut. "We're about opening up the solar system. We're talking about preserving life on this planet."
Copyright Associated Press. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.
A Martian History Lesson
Early geologists used observations of the layers, or strata, in rocks and soil to devise a time sequence of Earth's geologic history. When orbiter missions began returning images from Mars, planetary scientists were able to develop a rudimentary timescale of Martian geologic history, based on the cratering density of various surfaces. An older surface is one that has accumulated more impact craters per unit area.
This THEMIS image shows a close up view of the plains in the Hesperia Planum, a region of surfaces that formed during Mars' Hesperian Era ("middle ages"). The surface shown in the image has a large number of craters that are 1-3 km (0.6-1.9 miles) in diameter, which indicates that the surface is very old and has been subjected to a long period of bombardment. (Image credit: NASA/JPL/Arizona State University)
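To make the crater-counting method concrete, relative ages can be ranked directly from counts per unit area. The sketch below shows only the bookkeeping; the surface names, crater counts and areas are invented for this example and are not taken from the THEMIS data described above.

```python
# Rank surfaces by crater density: more craters per unit area = older surface.
# All numbers below are illustrative, not measurements.
surfaces = {
    "Surface A": {"craters": 480, "area_km2": 12000},
    "Surface B": {"craters": 95,  "area_km2": 12000},
    "Surface C": {"craters": 310, "area_km2": 6000},
}

def crater_density(record):
    """Craters per square kilometre."""
    return record["craters"] / record["area_km2"]

# Sort from highest density (oldest) to lowest (youngest).
ranked = sorted(surfaces.items(), key=lambda item: crater_density(item[1]), reverse=True)
for name, record in ranked:
    print(f"{name}: {crater_density(record):.4f} craters/km^2")
```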
The Noachian Era is the oldest period and lasted from Mars' beginning, about 4500 million years (MY) ago, to about 3500 MY ago. Climate on Noachian Mars was probably very different than it is today, and geologic features such as dried up river valleys and delta features suggest that the climate may have been warmer and wetter. Erosional processes and volcanic activity took place during this time, and many scientists believe lakes and oceans could have existed.
The Hesperian Era - or "Middle Ages" - lasted from about 3500 MY ago to 2500 MY ago. It was during this time period that Martian climate began to change to drier, dustier conditions. Water that flowed on the Martian surface during the Noachian Era may have frozen as underground ice deposits, and most river channels probably experienced their final flow episodes during this era.
The Amazonian Era began about 2500 MY ago and continues to the present. Although Amazonian Mars has been mostly dry and dusty, there are signs that water is sometimes released onto the surface from underground through sudden floods. Volcanic activity occurs, but at much lower levels than in Mars' earlier history.
This map shows the maximum extent of glacial ice in Earth's north polar area during the Pleistocene Epoch, or Great Ice Age, which began about two million years ago. (Image credit: U.S. Geological Survey)
The term "ice age" is frequently interpreted to mean a period in Earth's history when vast ice sheets extended into latitudes that, today, have a more temperate climate. In fact, an "ice age" more generally refers to a succession of alternating glaciations and interglaciations, spanning a total time period of 1 to 10 million years.
During a glaciation phase, ice sheets spread outward over many parts of the world, and a long-term decline in the Earth's temperature occurs. During the interglacial periods, the ice sheets recede (melt), and a warmer, more temperate climate prevails on Earth.
At least four major ice ages have occurred in Earth's geologic history, the earliest being about 600 to 800 million years ago. During the past 2.5 to 3 million years, the Earth has been in an ice age known simply as "The Ice Age." Even though Earth's current climate is considered mild, we are still in the ice age because the Greenland and Antarctic ice sheets still exist. Our present position is within an interglaciation period, which began quite rapidly about 15,000 years ago. During the preceding glaciation, called the Wisconsinan Glaciation, ice sheets covered the North American and European continents.
Climate Change on Earth
Polar Field Services | Remote logistics and field support
Collaborative research focuses on Alaskan fisheries, environment
Sitka, Alaska, is one of two commercial fishing communities featured in this comparative study of Dillingham and Sitka. Photo: Taylor Rees
Cultural anthropologists Karen Hébert (Yale University) and Danielle DiNovelli-Lang (Carleton University, Ottawa) recently began a three-year project funded by the National Science Foundation to investigate relationships and perspectives on climate, environment and economic crisis in two of Alaska’s rural subarctic communities: Dillingham and Sitka.
“We’re interested in how people form knowledge about environmental and economic risk in their communities,” says Hébert, the project’s principal investigator. “Where do they get information, how does it inform them, and what do they do with the information they acquire? How is knowledge developed, circulated, and received? How do ideas evolve and travel, and how are they communicated? How do different and often opposing viewpoints bring together or divide people on issues surrounding local resource development?”
The study aims to better understand how different communities address resource concerns and identify avenues for building more inclusive and effective efforts. Photo: Taylor Rees
The two coastal Alaskan communities at the center of the study are accessible only by air or water. Both have native and non-native populations, and commercial fishing is a main part of the economy.
The comparative study aims to better understand how different communities address resource concerns and identify avenues for building more inclusive and effective efforts. The researchers plan to share what they learn with coastal communities in Alaska and around the world that are facing similar challenges.
Dillingham, located in southwestern Alaska, north of the Aleutian Islands, is Bristol Bay’s largest population center. The Bay boasts some of the most extreme tides on earth—up to 30 feet—and the shallow waters host the world’s largest sockeye salmon fishery. These fisheries not only feed the local people, but also are the foundation of the region’s commercial fishery. The population doubles during the summer salmon fishing run with the influx of non-local workers.
The proposed Pebble Mine galvanized local opposition and increased the focus on issues surrounding the environment, natural resources, and mineral extraction. Photo: Karen Hébert
In the early 2000s mining companies began making plans to begin mining for gold, copper, and molybdenum in the area. One of these proposed mines is the Pebble Mine, which would be in the headwaters of rivers near Dillingham. Local communities began to worry about the impact of mining on the local fisheries, controversy brewed, and what followed was a greater conversation about natural resources management. In addition, mine opponents worried the mine would bring an unsustainable influx of workers and other people.
“The issues surrounding the Pebble Mine have brought together some otherwise opposing groups,” says Hébert. “Community members are largely motivated by preserving the health of their water and fisheries, but they sometimes have had differing ideas about how to do this.”
Groups unified to lobby the Environmental Protection Agency to declare Bristol Bay off limits to mine development under the Clean Water Act. The EPA recently issued a proposed measure that, if approved, would do just that. Now, many investors have abandoned the project. Hébert and her team aim to “capture and understand changing opinions and motivations.”
Sitka, located on an island in southeast Alaska, also relies on commercial fishing for its economy. Unlike Dillingham, Sitka fisheries employ a variety of methods and gear, and extend beyond salmon to herring, halibut and black cod. A slight majority of the city’s population identifies as non-native. And Sitka has a long history of producing scientific research.
“In Sitka, perhaps because of early colonial settlements, which set up missionary schools, the educational infrastructure is well-developed. The community is still known for its schools, so people have a different relationship to science. This may have informed how local NGOs advocate for community research and conservation,” says Hébert.
For instance, when the local pulp mill closed in 1993, some Sitka residents questioned the viability of their economic future, and argued that diversification was key. Since then, there has been a big push to deliberately concentrate on science and education so that the economy is not based solely on extractive resource uses. Many Sitkans hope that the community might evolve to become a regional center for research relevant to coastal Alaska.
Bristol Bay and its fisheries are an important component to the study. Photo: Karen Hébert
Hébert says it’s often tough to capture the wide range of perspectives these communities present.
“Environmental issues can be very polarizing,” she says. “We sometimes have to work hard to get people to talk to us who feel like they hold views that put them at odds with others in their community. Some people are very reticent despite the fact that interviews are confidential. Some issues create lots of disagreement and so it can be tough to do research that moves across socially disparate views. In both communities there are clear, local conflicts, as well as others just below the surface, such as issues surrounding subsistence hunting and fishing. There is always tension surrounding protection and access to resources.”
The team approach to this study uses methodologies such as participant-observation, which involves taking detailed field notes. Hébert’s team is also conducting semi-structured interviews to gain a better understanding of Dillingham and Sitka.
“We are attempting a team approach with two lead scientists and five graduate students,” she says. “We do many interviews together, which can lead to really different follow-up questions and more lively conversation. There are some huge advantages to this approach. It’s really exciting, if also very challenging.”
Hébert uses relatively unsophisticated technology for interviews—just a notebook and digital voice recorder. The exception is in a series of oral history interviews captured in high-quality audio and video that will be preserved and shared with the communities through the Sitka Maritime Heritage Society and through the University of Alaska at Fairbanks.
In coming years, Hébert’s team will continue their research, conducting interviews with people who are not necessarily community leaders or in resource management positions, hoping to capture the full range of viewpoints among those who use coastal resources.
“We are really interested to know what makes environmental issues polarizing as well as when and how they bring people in the communities together. How do ideas travel to align people or break them apart? How do people think about future resources and what informs and motivates them to confront their concerns?” –Marcy Davis
My View: 25 years later, Web is still changing the world
Created on Thursday, 13 March 2014 08:00 | Written by Rich Bader
Twenty-five years! Wait, what? It's been 25 years since the advent of the World Wide Web, when dub-dub-dub and URLs entered our vocabulary. Without hyperbole, it is astounding how much impact the Web is having on the wide world. The Web has transformed so much of human endeavor. For many of us, each day is filled with satisfaction from our multiplicity of uses for the Web.

Today, it's difficult to separate the Web from the Internet; the two have become so synonymous. For some, the Web is the Internet. For millions of folks, the Web browser, with its point-and-click simplicity, in conjunction with a search engine like Google, is the on-ramp to the Internet and perhaps, with the exception of e-mail, all they will ever need.

To put Web adoption in perspective: Back in 1995, only 14 percent of U.S. adults had Internet access, according to the Pew Research Center's first survey. Most of us up until then were using e-mail and other technologies like Gopher and WAIS to find information that others had published online. It took a good deal of geeky persistence to get your system set up, find a site (no Google back then!) and search it for what you wanted. Of course, it was text only and slower than molasses on dial-up connections, but it was better than nothing.

Sir Tim Berners-Lee, the developer of the Web's underpinnings, combined existing technologies in a very clever way and gave it to the world, for free. He made it simple for us to find information, through the use of the www prefix appended to domain names. He embraced hyperlinking, and that made it easy to click on links and follow your nose for hours on end. And HTML (HyperText Markup Language) was so easy to use that virtually anyone could build a website and publish it for the world to see. These advantages of making it simple to publish and find information led to an explosion of websites from new and existing organizations and individuals, and a cycle of more users begetting more content.

The Web has grown in scale and capabilities from those early days of hacking together a home page in a text editor. From the obnoxious blinking text grew typographical capabilities that enable the most sophisticated page layouts. Interactivity led to e-commerce and transactional sites that substitute for customer service agents with whom we seem to be on hold forever. And, it should be noted, no one company owns the Web or the underlying technology. Berners-Lee and the W3 Consortium provide fantastic stewardship of an incredibly valuable asset.

Today, it is estimated that there are billions of Web pages on the Web, on almost 200 million active websites. Now more than 80 percent of Americans use computers at work, home, school or elsewhere. And 60 percent of American adults use smartphones. Together, almost 90 percent of American adults use the Internet, a remarkable adoption rate for technological services.

What are all these folks doing? They are building stronger relationships with family and friends, or reaching millions of fans. While online communication is no substitute for face-to-face interaction, with friends and family scattered hither and yon, it's a great way to stay connected. Even Facebook is just a giant website.

For the curious, the student, the so-called knowledge worker, the Web is a godsend. Is there a question that can't be answered on the Web? I continue to be amazed at how generous individuals and organizations are about sharing their knowledge. Pessimistic about the state of the world?
Consider the human endeavor of Wikipedia, arguably the Eighth Wonder of the World.

And, of course, we can apply, check availability, buy, complain and otherwise interact with organizations large and small through the Web. While we may regret the loss of human interaction, the convenience and typical efficiency of Web interactions suit our hectic lifestyles.

Where is the Web going? Numerous next big things are on the horizon. Access to big data, with its huge analytical projects, will advance human knowledge (think genetics or brain research) to the next level. The Internet of Things will put intelligence (and likely a Web server) into virtually every device from light bulbs to appliances, with the goal of having our homes more conveniently suit our needs and save us money. And the Web will continue to create opportunities in health care services, education, media distribution and government, bringing access, knowledge, convenience and communications to virtually everything it touches. I can hardly wait to see what the next 25 years brings.

Rich Bader retired in January as CEO and co-founder of Easy Street Online Services, Beaverton. His 35-year career in high tech included management of Intel's Personal Computer Enhancement Operation from 1978 through 1990, then four years of business plan and product consulting to companies such as Intel, Hewlett-Packard and Microsoft before launching Easy Street in 1995.
"Rabinow has written an interesting book about the failed negotiations between a French genetics lab, the Centre d'Etude du Polymorphisme Humain (CEPH) and Millenium, an American biotech company that wanted its family DNA history on diabetes and obesity. This book is not about the science of molecular biology—it's a look at how the different ethics of France and America affect the way people and politicians feel about the sanctity of DNA."—Library Journal
An excerpt from French DNA: Trouble in Purgatory, by Paul Rabinow
Introduction
In Paris, during the winter and spring of 1994, what was alternately characterized as a quarrel, a dispute, a struggle, a debate, a battle, or a scandal simmered and then flared up to a white-hot intensity before dissipating, as such things tend to do in Paris, as a government commission was formed to study the matter and the summer vacations approached. Immediately at issue was a proposal to institute a formal commercial collaboration between an American start-up biotechnology company, Millennium Pharmaceuticals, Inc., and France's premier genomics laboratory, the Centre d'Etude du Polymorphisme Humain (CEPH). During the early 1990s, the CEPH, led by its dynamic scientific director, Daniel Cohen, had conceived of and implemented a highly innovative and effective strategy to map the human genome. Cohen was proud to announce in December 1993 that the CEPH had won the race to produce the first physical map of the human genome. When he crafted his victory announcement, with the substantial aid of a New York public relations firm, Cohen made special efforts not to humiliate the heavily government-subsidized American laboratories whom he had just beaten.
Among the leaders of the American genome mapping effort was a Massachusetts Institute of Technology (MIT) scientist named Eric Lander, a cofounder of Millennium. Daniel Cohen was also one of the cofounders of Millennium. Scientists from the CEPH had been discussing joint projects with scientists from Millennium throughout 1993. Scientists from the CEPH went to Cambridge to visit Millennium scientists and hear of their plans. The French government had been informed of, and approved, the idea of a commercial collaboration between the CEPH and Millennium. The core of the collaboration was to be a project to discover the genetic basis of non-insulin-dependent forms of diabetes. Diabetes is a major health problem in the affluent countries, and there is good reason to believe that insights about diabetes could well be applied to obesity. The public health implications are important. The potential market is extravagant. In order to identify genes that might be involved in these or other conditions, one needs as large a pool of families as possible. An examination of inheritance patterns of these families would facilitate the search for so-called candidate genes. Researchers at the CEPH had assembled a respectable collection of families, some of whose members suffered from non-insulin-dependent diabetes. Such collections are valuable, because they are costly and time-consuming to assemble. Millennium proposed collaborating with CEPH scientists on the basis of the CEPH's family material. In principle, CEPH scientists were interested in such a collaboration because Millennium was developing potentially rapid and powerful new technologies to identify genes (although it was cautious about entering into this new terrain). Finally, Millennium was well funded, and the 1990s was a period of budget cutbacks for French science. A team from Millennium went to the CEPH in February 1994 to finalize the agreement. Things came apart; confrontation, polemic, and confusion ensued. The French government moved to block the deal. The problem, explained the government spokesman, was that the CEPH was on the verge of giving away to the Americans that most precious of thingssomething never before named in such a mannerFrench DNA.
French DNA's narrative is structured around these events set against the background of contrastive developments in the United States (the AIDS epidemic, the biotechnology industry, the Human Genome Initiative), the great specter of a possible future. In the United States, during the latter half of the 1970s, intense debate had raged about the safety (as well as ethical and philosophical implications) of what was then referred to as "recombinant DNA." An unprecedented moratorium on research devised and shepherded by leaders of the scientific community succeeded in keeping government legislation at bay and basically allaying public fears focused on the safety issue. During the early 1980s, debate shifted to the status of scientific, commercial, and ethical relationships between university- and government-based research and the nascent biotechnology industry. By the end of the decade, in the United States, the landscape had been effectively reshaped; although debate and discussion continue, a large biotechnology industry funded by a massive infusion of venture capital and an equally significant amount of capital from large, often multinational pharmaceutical companies had become an established force. Millennium was not atypical of such companies; it was staffed by prestigious scientists and physicians with affiliations with Harvard and MIT and was initially funded by venture capital. From their perspective, considering an alliance with the CEPH seemed strategically astute and perfectly ordinary. They had interpreted the actions and statements of members of the CEPH during the months of preliminary contacts and negotiations as indicating that the French situation was changing in a similar fashion. In fact, this diagnosis was premature.
Arriving at the CEPH immediately after the announcement of their mapping victory, I was faced with the question of what should be the focus of my study. My entry was not a reenactment of the traditional ethnographic arrival scene on some exotic site. I was already fluent in the language, had previously lived in France for years, had just completed a complementary study of an American biotechnology company and the invention there of a powerful molecular tool, and had immersed myself in the debates and inquiries swirling around the Human Genome Initiative and its scientific, technological, ethical, legal, social, political, cultural, theological, and no doubt other dimensions. The allocation of a percentage (3-5%) of the American genome budget to social, ethical, and legal issues made it, in the words of one of its directors, "the largest ethics project in history." I was intrigued by the extravagance of this phenomenon. There had been a lot of talk of "the book of life," "the holy grail," and the like. In the early years, conferences held outnumbered genes localized. Because the genomic science itself had been successfully cordoned off from "ethical and social" scrutiny, such scrutiny was reserved only for "consequences."
At the CEPH, I soon decided that I would not concentrate on the CEPH's past triumphs. There were two reasons for this: first, I felt that there would be historians of science who would be better trained to do the archival work; second, I felt that the "genius" of the CEPH was its ability to make the next move in a manner that brought the elements into an innovative assemblage. Hence I decided to concentrate on the four research projects (aging, cancer, AIDS, parasite genomes) that Cohen had inserted (some would say imposed) into the margins of the CEPH. Introducing an anthropologist was a sort of fifth research project. I plunged first into familiarizing myself with the current molecular technology in use at the CEPH. Then the Millennium crisis happened. I kept both the experimental sites and Millennium balls in the air for the duration of my stay. Other factors intervened such that this book has taken the shape it has; that is to say, the disruption and its suite of consequences became the focus of my study.
French DNA's focus is on a singular instance of a multidimensional crisis in 1994. The elements of that crisis included the felt need to transform an extremely successful and innovative large-scale scientific and technological apparatus in the face of international competition; pressing claims that the work done at the lab (and its associated allies) was of the utmost consequence not only for the future of French science but for the future well-being of humanity; acute concerns, widespread in the French cultural/political milieu, over the legitimate range of experimentation in the biosciences expressed in a vocabulary of bioethics; ferocious conflict over potential means of financing the work; personalized confrontations between leading scientists over what it meant to be a scientist today, pitting against each other (at a more general level) contrastive modes of subjectivation of science as a vocation.
French DNA is about a heterogeneous zone where genomics, bioethics, patients groups, venture capital, nations, and the state meet. Such a common place, a practiced site, eruptive and changing yet strangely slack, is filled with talk of good and evil, illness and health, spirit and flesh. It is full of diverse machines and bodies, parts and wholes, exchanges and relays. For those mortally ill, or told they are so, all this discourse, all these diverse things, can produce a good deal of anxious waiting and solicitation. It can also produce a range of other effects and affects in the world. I became intrigued by the futures being carved out of the present. Their representations ranged from ones full of dangers to others of a potential luminosity. Today, as yesterday, partisans of both visions abound. Partisans that they are, they find their antagonists' arrogance, misplaced emphases, failures of nerve, and sheer blindness trying. Amid all the discord, however, all parties agree that the future is at stake and that there is a pressing obligation to do something about it.
Copyright notice: Excerpted from pages 1-5 of French DNA by Paul Rabinow, published by the University of Chicago Press. ©1999 by the University of Chicago. All rights reserved. This text may be used and shared in accordance with the fair-use provisions of U.S. copyright law, and it may be archived and redistributed in electronic form, provided that this entire notice, including copyright information, is carried and provided that the University of Chicago Press is notified and no fee is charged for access. Archiving, redistribution, or republication of this text on other terms, in any medium, requires the consent of the University of Chicago Press.
Paul Rabinow
French DNA: Trouble in Purgatory
©1999, 210 pages
Cloth $25.00 ISBN: 978-0-226-70150-9
Paper $20.00 ISBN: 978-0-226-70151-6. For information on purchasing the book, from bookstores or here online, please go to the webpage for French DNA.
Diminishing Returns: The Cold, Hard Truth for CES Smartphones
This is the bitter reality for Android phone makers right now. The improvements in the latest, most premium phones aren't really that big of a deal.
By Jared Newman (@OneJaredNewman) | Jan. 09, 2013
Sony Xperia Z
Sony has a new flagship phone at CES called the Xperia Z, and it is pretty decent. The company has another one, called the Xperia ZL, that’s also fairly nice. They’re both roughly as okay as the flagship Ascend D2 phone that Huawei was showing off in another part of the room.
If my descriptors seem lacking, that’s intentional. The truth is that none of these phones seem markedly better than last year’s holiday handsets. They’ve got a few perks that some older phones don’t, like 1080p displays and quad-core processors, but in real world use it’s hard to see the added benefit.
When I sampled another new phone this morning, Pantech’s Discover, the experience seemed practically as solid as Sony’s and Huawei’s flagship devices, despite a mere 720p display and a dual-core chip. The biggest difference is that Pantech’s phone will sell for $50 on AT&T, starting this Friday. Huawei’s and Sony’s phones will likely be more expensive if they ever reach the United States.
This is the bitter reality for Android phone makers right now. The improvements in the latest, most premium phones aren’t really that big of a deal. A 1080p display doesn’t look much different from a 720p display at normal viewing distances. A quad-core processor doesn’t provide much of a real-world benefit over a dual-core one. Photos from a 13-megapixel camera don’t look significantly better than photos from an 8-megapixel one, and shutter lag on most good smartphone cameras dropped to near zero a year ago.
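A back-of-the-envelope pixel-density calculation illustrates why the resolution bump is hard to see. The 5-inch panel size and 12-inch viewing distance below are assumptions chosen for the arithmetic, not specifications of the phones discussed above:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch of a rectangular panel."""
    return math.hypot(width_px, height_px) / diagonal_in

def eye_limit_ppi(viewing_distance_in, acuity_arcmin=1.0):
    """Rough pixel density beyond which a viewer with ~20/20 vision
    (about one arcminute of angular resolution) stops resolving pixels."""
    pixel_pitch_in = viewing_distance_in * math.radians(acuity_arcmin / 60.0)
    return 1.0 / pixel_pitch_in

print(round(ppi(1920, 1080, 5.0)))  # ~441 ppi for a 5-inch 1080p panel
print(round(ppi(1280, 720, 5.0)))   # ~294 ppi for a 5-inch 720p panel
print(round(eye_limit_ppi(12)))     # ~286 ppi limit at a 12-inch viewing distance
```

By that rough measure, both panels sit at or beyond what the unaided eye resolves at arm's length, which is consistent with the difference being hard to spot in person.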
As a result my brain feels a bit mushy as I look at the latest phones from CES. In the context of quick hands-on demos, there aren’t a lot of remarkable things to relay about the cream of the crop. At a glance, they’re all just pretty good phones. (Okay, let’s give the Xperia Z credit for one cool trick: It can survive up to 30 minutes dunked in water.)
Jared Newman / TIME.com
Google is partly to thank–or to blame–for this situation. Ever since Android 4.0, known as Ice Cream Sandwich, Android phones have become a lot smoother and more polished. It’s now hard to tell the difference between a phone with a top-of-the-line processor and one with the next-best thing. Software, for that matter, tends to be the source of most innovation in smartphones nowadays, and while Android phone makers tend to add some of their own software bells and whistles to their phones, I’ve yet to see any innovations from them on par with, say, Google Now.
This isn’t only the case with Android. As I wrote last March, diminishing returns seems to have hit Apple’s latest products as well. In general, spec boosts in mobile devices just don’t have the same wow factor as they did a couple years ago, when an increase in display resolution or processing power produced noticeable differences to the average user.
There’s a chance that some real innovation will happen at Mobile World Congress next month, where big shots like Samsung, LG and HTC may announce new phones. But I’m willing to wager the story will be a lot like it is here: A bunch of incremental improvements in tech specs that don’t make much of a difference. That’s not so tragic–there are worse things, after all, than a really solid smartphone with no defining traits. It’s just kind of boring.
MORE: Check out TIME Tech's complete coverage of the 2013 Consumer Electronics Show
LocalMind: Quora meets FourSquare [invites]
by Brodie Beta
LocalMind is a check-in service that helps you find out what’s happening in a specific locations, by allowing you to ask questions to users who’ve “checked-in”.
It works like this: you locate a spot you want to know more about, and if another user has "checked in" there, you can ask them anything you like. Similar to Foursquare, users can become experts on a specific location and will additionally earn "karma" points for helping answer the community's questions.
LocalMind is currently only available through the web with an invite code but there’s a mobile version for iOS on the way soon, hopefully arriving at some point next week according to the founder.
A Chat with the Founder
We had a chance to speak with the founder of the Montreal-based startup LocalMind, Lenny Rachitsky. He shared with us that he recently moved to Canada from San Diego, inspired by a visit he made to Montreal back in September.
Curious about how users are using the service, we asked Rachitsky to explain what types of questions are being asked:
It’s things like “how crowded is the bar?”or “is there line to get in to get into the club ?” or “is the WiFi up? or “what are the drink specials”
Can you bring us through how the Karma points feature works?
It’s something that we’re still playing with and experimenting with but the basic idea is: LocalMind is going to work if people are willing to help each other and help other people and answer questions. Initially it wasn’t really going to be the focus but, after doing a bunch of interviews and experimenting initially that seemed to be what excited people about it. Just the idea that they could help and they were happy to help and so we added the karama system.
Rachitsky continued to explain that the points would also be based on location:
When you do anything good for other people or for the community you earn karma points. The karma points are associated with specific locations so if you answer a questions at a specific bar, you earn points for that bar and once you’ve got enough karma points, you level up through various levels of expertise. You start off as a LocalMind recruit then you become a LocalMind expert and then a champion and then legend.
Is the karma system similar to becoming the Mayor of a location on FourSquare?
It’s similar but there can be any number of LocalMind experts at a specific venue…the idea is, right now you send questions to specific places and initially it was only people there right now. With the karma system, once we recognize you’re an expert about a specific venue, we can also send you questions about that place because you probably know what’s happening and what’s ‘good to get there’ or ‘what the specials are’. So, we recognize you as an expert and you become available to answer questions about the specific place.
The web version of LocalMind currently works by sending users text messages but as Rachitsky tells us, it’s something they’d rather not do, and the new iteration of his service is best used with mobile.
The whole use case is mobile, so the future of the platform is going to be mobile and there’s not going to be a ton, short term on the web side of it. The reason for that is because, when you’re going out & about, that’s when you have questions like this and we’d rather not send you text messages which is how it works without the iPhone app. We’d rather push notifications to your phone.
What new features can we expect to see in the mobile version?
The big one is you can see, where ever you are, activity happening around you and questions and answers that have happened there previously.. it feels like one of the most valuable parts of the service because while you’re walking around the city, you can see questions hanging in the air around that area that other people have answered. So, when you’re not sure what do in an area you pull it up and see what people are saying is a good place to go to.
Can users follow and friend each other?
It’s a complicated component of this. The key so far has been to keep it anonymous because you’re able to see where LocalMinds are, and if we tell you who they are, that becomes really scary. We’re very serious about the privacy of you at a specific location so it’s all anonymous as a general rule. But, you do see where your friends are if they’re your friends on Facebook, FourSquare or Gowalla. And, you also see a leaderboard of how you’re doing compared to your friends.
Are there any additional features we can look forward to?
We’re working on a bunch of stuff around getting people engaged..we’re starting to develop a reputation score for every user, and with that we can determine who should get questions, who’s the expert at various venues and we can start creating public profiles of ‘who’s the expert in Toronto, Ottawa and Montreal or certain genres of questions in knowledge. We’re also starting to focus on continuing conversations. That’s something that comes up a lot. You ask a question, you get an answer and send a thank you but people want to continue the conversation and turn it into a chat to get more information or meet that person. So, that’s something we’re definitely exploring.
Beta Invites and Sneak Peek of the iPhone app
LocalMind has provided us with invites for the web as well as a sneak peek of the new iPhone application. As you can see in the images below, the LocalMind app enables users to ask questions from the map, answer questions or view the activity stream by distance in a list. Please let us know what you think. Do you think you’d use LocalMind?
Beta invite: TNW
Exclusive: U.S. nuclear lab removes Chinese tech over security fears
Mon Jan 7, 2013 | 8:32pm GMT
A man walks past a Huawei company logo outside the entrance of a Huawei office in Wuhan, Hubei province October 9, 2012. REUTERS/Stringer
By Steve Stecklow
LONDON - A leading U.S. nuclear weapons laboratory recently discovered its computer systems contained some Chinese-made network switches and replaced at least two components because of national security concerns, a document shows.

A letter from the Los Alamos National Laboratory in New Mexico, dated November 5, 2012, states that the research facility had installed devices made by H3C Technologies Co, based in Hangzhou, China, according to a copy seen by Reuters. H3C began as a joint venture between China's Huawei Technologies Co and 3Com Corp, a U.S. tech firm, and was once called Huawei-3Com. Hewlett Packard Co acquired the firm in 2010.

The discovery raises questions about procurement practices by U.S. departments responsible for national security. The U.S. government and Congress have raised concerns about Huawei and its alleged ties to the Chinese military and government. The company, the world's second-largest telecommunications equipment maker, denies its products pose any security risk or that the Chinese military influences its business.

Switches are used to manage data traffic on computer networks. The exact number of Chinese-made switches installed at Los Alamos, how or when they were acquired, and whether they were placed in sensitive systems or pose any security risks, remains unclear. The laboratory - where the first atomic bomb was designed - is responsible for maintaining America's arsenal of nuclear weapons.

A spokesman for the Los Alamos lab referred enquiries to the Department of Energy's National Nuclear Security Administration, or NNSA, which declined to comment.

The November 5 letter seen by Reuters was written by the acting chief information officer at the Los Alamos lab and addressed to the NNSA's assistant manager for safeguards and security. It states that in October a network engineer at the lab - who the letter does not identify - alerted officials that H3C devices "were beginning to be installed in" its networks.

The letter says a working group of specialists, some from the lab's counter intelligence unit, began investigating, "focusing on sensitive networks." The lab "determined that a small number of the devices installed in one network were H3C devices. Two devices used in isolated cases were promptly replaced," the letter states.
The letter suggests other H3C devices may still be installed. It states that the lab was investigating "replacing any remaining H3C network switch devices as quickly as possible," including "older switches" in "both sensitive and unclassified networks as part of the normal life-cycle maintenance effort." The letter adds that the lab was conducting a formal assessment to determine "any potential risk associated with any H3C devices that may remain in service until replacements can be obtained."

"We would like to emphasize that (Los Alamos) has taken this issue seriously, and implemented expeditious and proactive steps to address it," the letter states.

Corporate filings show Huawei sold its stake in H3C to 3Com in 2007. Nevertheless, H3C's website still describes Huawei as one of its "global strategic partners" and states it is working with it "to deliver advanced, cost-efficient and environmental-friendly products."
RECKLESS BLACKBALLING?
The Los Alamos letter appears to have been written in response to a request last year by the House Armed Services Committee for the Department of Energy (DoE) to report on any "supply chain risks."

In its request, the committee said it was concerned by a Government Accountability Office report last year that found a number of national security-related departments had not taken appropriate measures to guard against risks posed by their computer-equipment suppliers. The report said federal agencies are not required to track whether any of their telecoms networks contain foreign-developed products.

The Armed Services committee specifically asked the DoE to evaluate whether it, or any of its major contractors, were using technology produced by Huawei or ZTE Corp, another Chinese telecoms equipment maker. ZTE Corp denies its products pose any security risk.
In 2008, Huawei and private equity firm Bain Capital were forced to give up their bid for 3Com after a U.S. panel rejected the deal because of national security concerns. Three years later, Huawei abandoned its acquisition of some assets from U.S. server technology firm 3Leaf, bowing to pressure from the Committee on Foreign Investment in the United States. The committee evaluates whether foreign control of a U.S. business poses national security risks.

In October, the House Intelligence Committee issued an investigative report that recommended U.S. government systems should not include Huawei or ZTE components. The report said that based on classified and unclassified information, Huawei and ZTE "cannot be trusted to be free of foreign state influence" and pose "a security threat to the United States and to our systems."

William Plummer, Huawei's vice president of external affairs in Washington, said in an email to Reuters: "There has never been a shred of substantive proof that Huawei gear is any less secure than that of our competitors, all of which rely on common global standards, supply chains, coding and manufacturing.

"Blackballing legitimate multinationals based on country of origin is reckless, both in terms of fostering a dangerously false sense of cyber-security and in threatening the free and fair global trading system that the U.S. has championed for the last 60-plus years."

He referred questions about H3C products to Hewlett Packard. An HP spokesman said Huawei no longer designs any H3C hardware and that the company "became independent operationally ... from Huawei" several years prior to HP's acquisition of it. He added that HP's networking division "has considerable resources dedicated to compliance with all legal and regulatory requirements involving system security, global trade and customer privacy."

(Reporting by Steve Stecklow; Editing by Richard Woods) ([email protected])
Ding dong: Richard Branson leads $28M funding round into smart doorbell startup Ring
Paul Sawers August 19, 2015 4:29 AM
Above: Richard Branson
You know you're onto something when big-name entrepreneurs such as Richard Branson come knockin' on your door with their checkbook in hand. And that's exactly what's happened to Ring, a startup that's invented smart doorbell technology for your house.
The Santa Monica-based company has announced a $28 million Series B round led by Branson, Shea Ventures, and American Family Insurance, and follows on from its $4.5 million Series A round last December.
We first covered Ring way back in 2013 when it was known as Doorbot, which was essentially a doorbell with a Wi-Fi-connected camera that reveals a live video of who is at your door.
Since then, the company has rebranded and further developed the technology, with big-name retailers such as Best Buy, Home Depot, Brookstone, Target, and Amazon all selling the Ring contraption.
Ring’s video doorbell calls a user on their phone when it’s activated, which is particularly useful for when they’re away from home — so even if it’s an honest visitor, they can still converse with them from a remote location. But in terms of burglars, well, it seems a common technique is to ring a doorbell or knock to see if anyone’s at home. With Ring, someone is always “at home.”
Ring also recently launched Ring Chime, a small speaker that “chimes” when the Ring Doorbell is activated. And it has introduced motion-detection and a cloud-based video storage service, which the company says has already been used to help police identify burglars.
Richard Branson is no stranger to tech startups, having participated in a $30 million funding round of e-taxi startup Hailo back in 2013. But doorbells are a first.
Though the Branson brand is a major scoop for Ring, the other lead investors are notable, too. Shea Ventures, for example, is the venture capital (VC) offshoot of J.F. Shea Co., which operates Shea Homes — a major homebuilding company in the U.S. And American Family Insurance, too, will have a vested interest in Ring, potentially offering the service to its customers to reduce home insurance rates.
“These new investors are all leaders in their industries, provide a strategic benefit to Ring, and share our mission of reducing crime in communities,” says James Siminoff, CEO of Ring. | 科技 |
Agilent Technologies Introduces Next-Generation Sequencing Cancer Research Panels
High-Performance ClearSeq AML to Be Followed by Series of Focused, Comprehensive Gene Panels for Cancer Research
SANTA CLARA, Calif., Sept. 2, 2014 - Agilent Technologies Inc. (NYSE: A) today introduced ClearSeq AML, the first product in the ClearSeq line of next-generation cancer research panels, which targets 48 selected exons in 20 of the most commonly mutated genes found in acute myeloid leukemia. Designed in collaboration with Dr. Robert Ohgami and Dr. Daniel Arber at the Stanford Department of Pathology, Stanford University, ClearSeq AML provides comprehensive coverage of mutations and enables swift progression of sample processing to analysis in less than 48 hours. The AML gene panel will be followed by the release of additional ClearSeq panels for cancer research throughout the coming months.
"We are pleased to introduce the ClearSeq family of products, a comprehensive set of targeted NGS gene panels for cancer research," said Jacob Thaysen, vice president and general manager of Agilent's Diagnostics and Genomics Group. "ClearSeq NGS panels are expert-identified products that will provide customers with a complete end-to-end anatomical-to-molecular genomics portfolio that includes solutions ranging from IHC for cancer diagnosis to NGS for cancer research."
Acute myeloid leukemia (AML) is the most common myeloid neoplasm affecting adults, and the role of chromosomal structural variations in its molecular pathogenesis is well documented. In recent years, next-generation sequencing has led to a revolution in the study of hematological malignancies and shown that insertions, deletions (indels) and mutations play an essential part in the pathogenesis of AML. Genetic information coupled with standard anatomical findings has provided a deeper characterization and classification of AML.
ClearSeq AML provides 99.9 percent design coverage of targeted coding exons. Agilent's high-performance cancer panels enable the study of more complex genomic alterations in cancer, and deliver quick and accurate identification of relevant information. ClearSeq panels are based on SureSelect and HaloPlex technology, offering superior sensitivity and accuracy compared with other hybridization or PCR-based methods, thus minimizing the risk of false-positive results. ClearSeq panels can be easily incorporated into routine laboratory workflows. Sequence data analysis and mutation reporting can be completed in three simple steps using Agilent's SureCall software.
To learn more about how to incorporate this powerful set of panels into your research protocol, visit www.agilent.com/genomics/ClearSeqAML.
About Agilent Technologies
Agilent Technologies Inc. (NYSE: A) is a leader in chemical analysis, life sciences, diagnostics, electronics and communications. The company's 20,600 employees serve customers in more than 100 countries. Agilent had revenues of $6.8 billion in fiscal 2013. Information about Agilent is available at www.agilent.com.
In September 2013, Agilent announced plans to separate into two publicly traded companies through a tax-free spinoff of its electronic measurement business. On Aug. 1, 2014, the company's electronic measurement business began operating as Keysight Technologies, Inc., a wholly owned subsidiary. The separation is expected to be completed in early November 2014.
This news release contains forward-looking statements as defined in the Securities Exchange Act of 1934 and is subject to the safe harbors created therein. The forward-looking statements contained herein include, but are not limited to, information regarding the separation of Agilent's electronic measurement business; future revenues, earnings and profitability; the future demand for the company's products and services; and customer expectations. These forward-looking statements involve risks and uncertainties that could cause Agilent's results to differ materially from management's current expectations. Such risks and uncertainties include, but are not limited to, unforeseen changes in the strength of our customers' businesses; unforeseen changes in the demand for current and new products, technologies, and services; customer purchasing decisions and timing, and the risk that we are not able to realize the savings expected from integration and restructuring activities.
In addition, other risks that Agilent faces include those detailed in Agilent's filings with the Securities and Exchange Commission, including our latest Form 10-K and Form 10-Q. Forward-looking statements are based on the beliefs and assumptions of Agilent's management and on currently available information. Agilent undertakes no responsibility to publicly update or revise any forward-looking statement.
Susan Berg
[email protected]
FEATURE ARTICLE
Plenty of Room at the Bottom?
Tiny animals solve problems of housing and maintaining oversized brains, shedding new light on nervous-system evolution
William G. Eberhard, William T. Wcislo

A basic fact of life is that the size of an animal's brain depends to some extent on its body size. A long history of studies of vertebrate animals has demonstrated that the relationship between brain and body mass follows a power-law function. Smaller individuals have relatively larger brains for their body sizes. This scaling relationship was popularized as Haller's Rule by German evolutionary biologist Bernhard Rensch in 1948, in honor of Albrecht von Haller, who first noticed the relationship nearly 250 years ago. Little has been known, however, about relative brain size for invertebrates such as insects, spiders and nematodes, even though they are among Earth's more diverse and abundant animal groups. But a recent wave of studies of invertebrates confirms that Haller's Rule applies to them as well, and that it extends to much smaller body sizes than previously thought.

These tiny animals have been able to substantially shift their allometric lines—that is, the relationship between their brain size and their overall body size—from those of vertebrates and other invertebrates. Animals that follow a given allometric line belong to the same grade and changes from one grade to another are known as grade shifts. The result is that different taxonomic groups have different, variant, versions of Haller's Rule. The mechanisms that are responsible for grade shifts are only beginning to be understood. But this combination of generality and variability in Haller's Rule appears to call into question some basic assumptions regarding the uniformity of how the central nervous system functions among animals. It also reveals a number of overlooked design challenges faced by tiny organisms. Because neural tissue is metabolically expensive, minute animals must pay relatively higher metabolic costs to power their proportionally larger brains, and they thus face different ecological challenges. There is reason to expect that tiny animals might cut corners wherever possible, for example by adopting lifestyles that are behaviorally less demanding. Yet available evidence indicates that at least some small-bodied animals express the same kinds of behavior as their large-bodied relatives.

Biologists have tended to ignore the lower limits of body size and the physiological processes that are associated with evolutionary decreases in brain mass. Instead they have focused on evolutionary increases in brain size, and its possible links to intelligence and other mental processes. And almost all of the current data have come from adults. But problems associated with the demands of a relatively large nervous system in a small animal are not limited to taxa with miniaturized adults. Many species have extremely small immature stages that are free-living, and whose growth and survival depends on their behavioral capabilities.

The new data on invertebrate brain allometry have several important implications. They challenge vertebrate-based hypotheses that were proposed to explain Haller's Rule that invoked factors such as surface-volume relations, longevity and metabolic rates. They also challenge the idea, again derived from studies of vertebrates, that animals with relatively and absolutely larger brains have more sophisticated behavioral abilities and mental capabilities.
For these reasons, the time is ripe for exploring the ways that central nervous systems are organized at very small scales, within constraints that differ from those of large-bodied organisms.12345NEXT›»View All
Of Possible InterestFeature Article: The Penguin's Palette: More Than Black and WhitePerspective: Taking the Long View on Sexism in ScienceEngineering: From Lowly Paper Clips to Towering Suspension Bridges Other Related LinksFlybrain, the online atlas and database of the Drosophila nervous systemWormbase, the biology and genome of Caenorhabditis elegansA virtual atlas of the honey bee (Apis mellifera) brain Foreign-Language PDFsSpanish | 科技 |
2016-40/3982/en_head.json.gz/18394 | Galaxy Note 7 recall info | Shop Chromebooks: Asus Flip | Acer 14 | Dell 13 Google Glass
Updated: 9 months ago
As controversial as it is revolutionary, Google Glass aims to change the way we see and use mobile computing.
Google Glass was announced at the Google I/O developer conference in the summer of 2012. The first wearable Android device wasn't available until some time later, but that's when we got our first look at what the "Explorers" would have in store for them.
At its essence, Glass is a spectacles-type device that has a small display that sits in front of and above your right eye, the equivalent of a 25-inch high-definition display from 8 feet. It's got a 5-megapixel camera beside it. And there's a bone-conducting speaker for providing audio. The second version of the Glass prototype comes with a mono earbud.
Glass is powered by Android but runs a simple, card-based user interface. Glass apps are referred to as Glassware. Data comes directly over Wi-Fi, or via a Bluetooth connection to your phone when you're out and about. You typically can get no more than a day's use before recharging.
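For readers curious what Glassware development looked like in practice: simple Glassware was typically built against Google's Mirror API, pushing "cards" to the timeline over REST. The snippet below is a rough sketch rather than official sample code—the access token is a placeholder you would obtain through a prior OAuth 2.0 flow, and the endpoint reflects the Mirror API as it was documented at the time.

```python
# Rough sketch of pushing a text card to a Glass timeline through the Mirror
# API's REST endpoint. ACCESS_TOKEN is a placeholder -- in real Glassware it
# comes from a prior OAuth 2.0 flow.
import json
import urllib.request

ACCESS_TOKEN = "ya29.EXAMPLE_TOKEN"  # assumed, not a real credential

card = {"text": "Hello from my first piece of Glassware"}
req = urllib.request.Request(
    "https://www.googleapis.com/mirror/v1/timeline",
    data=json.dumps(card).encode("utf-8"),
    headers={"Authorization": "Bearer " + ACCESS_TOKEN,
             "Content-Type": "application/json"},
    method="POST",
)
with urllib.request.urlopen(req) as resp:
    print(resp.status, resp.read().decode("utf-8"))
```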
Glass is about the journey. It's about figuring out what to do with this tiny screen and camera on your face. Thousands of Explorers wear Glass every day. More are coming. | 科技 |
2016-40/3982/en_head.json.gz/18396 | ASU Planetarium Show Looks for Life on Other Planets
Rambouillet Magazine
Questions about the possibility of life on other planets will be explored during Angelo State University’s summer star show “In Search of New Worlds,” beginning Thursday, July 10, in the ASU Planetarium.
Show times are 8 p.m. Thursdays, July 10-31. The Planetarium is located in the ASU Vincent Nursing-Physical Science Building, 2333 Vanderventer.
Does the universe abound with countless planets teeming with life? Are we on the edge of a new era, soon to discover life elsewhere in the universe? Will at least one of those countless other “new” planets provide some hint that we share the vastness of space with other intelligent life or are we alone in the universe? These questions have preoccupied scientists and philosophers for centuries and are often asked by people today as they peer up at the stars.
As “In Search of New Worlds” explains, the elusive and apparently non-existent Planet X teased astronomers and astrophysicists with a display of gravitational pull that held promise for a 10th planet in our solar system. Never seen and never confirmed by exhaustive search methods, Planet X fostered growing speculation that no new planets of significance were likely to be found within our solar system.
However, the search for other worlds in other star systems has already reaped rewards as over 100 planets have been discovered beyond our solar system. “In Search of New Worlds” presents a comprehensive look at the search for extra solar planets. Using special effects and computer animation, the show takes the audience on an amazing sight-and-sound journey to the limits of our solar system and far beyond.
“In Search of New Worlds” is open to the public with admission prices of $3 for adults and $2 for children, students and senior citizens. ASU students, faculty and staff are admitted free.
For more information, call 942-2136 or visit the ASU Planetarium Web site at http://www.angelo.edu/dept/physics/planetarium.html. | 科技 |
2016-40/3982/en_head.json.gz/18397 | Home Television First Teen TV Network Launched First Teen TV Network Launched
Ryan Ball
Launched today, Varsity TV (VTV) lays claim to being “the world’s first and only teen TV network.” Featuring animated projects and other programming targeted specifically at the teen market, the new broadcast entity aims to bridge the youth-programming gap between Nickelodeon and MTV.
The 24-hour network was co-founded by Joe Shults, a member of the executive team that launched MTV, Nickelodeon and VH1. He was also one of the original management members for E! Entertainment. Co-founder Kelly Hoffman has more than two decades of financial and administrative experience in building start-ups into profitable companies.
In addition to series and films acquired from ABC International, CBC, CTV, Granada, RDF International, and Universal Television, VTV will air teen-created programming. The original series Animation Nation will feature everything from 30 second shorts to feature-length cartoons from teen animators.
Another hightlight for animation fans is the stop motion comedy series Lary Labrat, which plunges a sewer rat into the challenging world of lab test subjects. The show also blends in live-action elements.
VTV debuted on Galaxy 11, Transponder 13 and has now been added to Pod 14 of Comcast’s Headend in the Sky (HITS), whose affiliates service nearly 7 million U.S. households. In addition, a letter of intent has been signed with the National Cable Television Coop, which has more than 14.5 million basic subscribers.
VTV has built a loyal and active audience through its website, www.MyVTV.com. The privately held company with corporate headquarters in Austin, Texas is currently in final distribution negotiations with some of the country’s largest Multiple System Operators. | 科技 |
2016-40/3982/en_head.json.gz/18398 | News Man Arrested for Uploading Kaizoku Sentai Gokaiger via Share Program
On January 11, police in Aichi Prefecture arrested a 26-year-old male company worker from the prefecture's Toyohashi City on suspicion of using the Share file-sharing software to upload the live-action special effects show Kaizoku Sentai Gokaiger online without the permission of Toei, the copyright holder. According to the Association of Copyright for Computer Software (ACCS), the suspect uploaded the series' 26th episode around September 16, 2011.
According to the police investigation, the suspect also uploaded obscene animation and drawings. The man said under deposition that he had been uploading anime for about four years because he wanted to make these titles available to people for free.
Kaizoku Sentai Gokaiger is Toei's 35th Super Sentai (Power Rangers) television series, and began airing in Japan on February 13, 2011.
Source: Association of Copyright for Computer Software via Anime! Anime! Biz
2016-40/3982/en_head.json.gz/18426 | Bodies In the Space Environment (BISE) - Which way is up in space?
Bodies in the Space Environment (BISE) is an ongoing experiment that studies how astronauts distinguish up from down in a near-weightless environment. Sponsored by the Canadian Space Agency, BISE is a York University experiment led by Principal Investigator Dr. Laurence Harris. Canadian Space Agency Astronaut Bob Thirsk is one of 6 astronaut subjects for the experiment. Scientists will analyse data collected before, during, and after flight. BISE sessions on the International Space Station will be conducted throughout September 2009 and well into 2010.
For most people, telling up from down is so easy they hardly give it a second thought. But some people—those who find themselves in unusual, extreme environments or who suffer from certain medical disorders—find orienting themselves difficult and, as a result, they can make life-threatening mistakes.
One extreme environment where this can be a problem is space, where the brain no longer has gravity to help determine up and down. Astronauts must rely on other cues, such as their body position and what they see around them. If they get disoriented, it can lead to errors like flipping switches the wrong way or moving in the wrong direction during an emergency.
Principal Investigator Dr. Laurence Harris of York University. (Credit: CSA)
To learn more about how astronauts perceive up and down in microgravity, the Canadian Space Agency (CSA) is sponsoring a study called BISE (Bodies in the Space Environment) being led by a group of scientists from York University. It was partially conducted by Canadian astronaut Dr. Robert (Bob) Thirsk during Canada’s Expedition 20/21. Launched in May 2009, this mission marked a milestone of Canada’s Manned Space Program as Thirsk took part in the first-ever long-duration mission and research flight to the International Space Station (ISS). The tests involved having subjects—including Thirsk—view a computer screen through a cylinder that blocks all other visual information. The astronauts were presented with background images with different orientations relative to their bodies. On top of these images were superimposed a letter that could be either a "p" or a "d" depending on its orientation.
They had to indicate which letter they saw and the scientists measured the transition points where the letters changed from a "p" to a "d" and back again. "The angle between those two is taken as the perceptual upright," notes psychologist Laurence Harris, principal investigator of the experiment. "We're able to alter that perceptual upright by changing body orientation or visual orientation." (In earth-based tests, they also change the relative position of gravity by having subjects lay on their sides.)
The researchers want to determine the relative importance of visual and body cues to the subjects' perception of up. Said Harris, "How the brain combines multiple pieces of information about the same thing and comes up with the right answer is a key question."
Studies done in special aircraft that produce brief periods of microgravity suggest that, in the absence of gravity, people rely more on body cues than on vision to tell them which way is up. The study will examine whether this also happens on the Space Station.
BISE experiment undergoes testing during parabolic flights in France. (Credit: Dr. Laurence Harris, York University)
According to Harris, their findings can help to create a safer work environment in space. He noted that the Station is "tricky" because its modules are not all in a straight line. "You often go through a right angle to go between one module and another, so whatever corresponds to the ground in one module won't necessarily match in another module."
This can cause disorientation that could have serious consequences in an emergency evacuation. Space station modules do have exit signs but "if you want to guid e people, you have to know what cues they're using and how effective they are—how effective a visual clue can be in that situation," Harris said.
Luchino Cohen, the CSA's mission scientist for the BISE experiment, noted that these issues also affect astronauts during spacewalks. "When you get outside the spacecraft, you have to adjust your orientation and use whatever visual cues you have." He said CSA supports this research because its mandate includes using the space environment to "improve the safety of space travel by understanding how the human brain adapts to microgravity."
According to Harris, the tools developed for this experiment can also help people on Earth who experience balancing problems or are prone to falling, including seniors and people with conditions like Parkinson's disease.
2016-40/3982/en_head.json.gz/18444 | Elbit Systems Introduces IMOD Future Trainer Aircraft Program
Elbit Systems and Israel Aerospace Industries (IAI) have teamed up to establish and operate a new flight training center to train Israeli Air Force pilots with the Aermacchi M3461 training aircraft. Advanced Flight Training (TOR) is a new ground-based flight training center based in Israel that is expected to begin operation in mid-2014, Elbit said. The center will feature airborne training with the Elbit Embedded Virtual Avionics (EVA) system, which provides training for operating radar optical sensors, early warning systems, virtual weapons training and more.
"Once again, Elbit Systems is honored to play a part in the training of the IAF’s flight school. The selection of this array of flight trainers and simulators for the IAF's new trainer aircraft is yet another vote of confidence in our extensive training and simulation capabilities,” said Alon Afik, vice president for training and simulation at Elbit Systems. Please enable JavaScript to view the comments powered by Disqus.
NTSB to Brief Summit on Fuel Tanks, Recorders, Public Safety Helos
Offshore Leader Bill Chiles to Keynote Cert. Summit
Airbus Sees Slow, Steady Global Market Growth | 科技 |
2016-40/3982/en_head.json.gz/18540 | European Broadband Industry Sets Sights On Flexible EU-U.S. Net Neutrality Approach
From Telecommunications Law Resource Center™
The Telecommunications Law Resource Center is the most comprehensive reference and news platform for communications law, covering broadcasting, cable, broadband, telephony and wireless;...
By Joe Kirwin
May 14 — European broadband providers are pushing hard for European Union regulators to seek a trans-Atlantic approach to open Internet rules at a time when the Federal Communications Commission has been forced to reconsider U.S. net neutrality rules.
The FCC may vote May 15 on whether to open comment on a proposal to revamp its 2010 net neutrality rules after the U.S. Court of Appeals for the District of Columbia Circuit said the agency's no-blocking and nondiscrimination rules too closely resembled common carrier regulations (Verizon Commc'ns Inc. v. FCC,D.C. Cir., No. 11-1355,1/14/14).
“The timing is perfect for industry and regulators from both sides of the Atlantic Ocean to seize the innovation and investment opportunities presented by our sector.” Matthias Kurth, Cable Europe
Groups such as the European Telecommunication Network Operators and Cable Europe say that a recent vote in the European Parliament establishing strict net neutrality rules are outdated and will further slow the European digital economy, which already lags behind that of the U.S. and some parts of Asia.
“The timing is perfect for industry and regulators from both sides of the Atlantic Ocean to seize the innovation and investment opportunities presented by our sector,” said Cable Europe Executive Chairman Matthias Kurth in a recent speech delivered to a cable industry conference in California. “The cable industry is fully in support of an open Internet. Yet what we are witnessing is a real risk that in a desire to ensure an open environment regulatory constraints will stifle the most innovative and exciting developments.
“We should learn from one another to reach a best practice mode and it is for industry to drive this process,” Kurth said.
EU Overhaul Legislation
In April, the European Parliament adopted last-minute amendments to the pending EU regulatory telecommunications legislation to overhaul its regulatory framework that would restrict broadband operators from offering specialized services at high speeds to content providers such as Apple Inc., Google Inc. or online movie services such as Netflix Inc.
“If these restrictive changes to the open internet provisions are confirmed, the European digital economy will suffer and EU businesses will be put in a difficult competitive situation with respect to other regions of the world, especially the United States,” ETNO Chairman Luigi Gambardella told Bloomberg BNA. He added that the EU must fall in line with the FCC because it “wants to recognize that innovation is the very essence of the internet.” ETNO is the Brussels-based European Telecommunications Network Operators' Association.
For consumer and public advocacy groups, the European Parliament amendments are hailed as a major victory that will prevent the Internet from becoming a “pay-to-play” market place as opposed to an open, egalitarian utility.
However, the EU legislation awaits a vote in the Council of Ministers in the second half of 2014, now the focus of intense lobbying. The EU broadband industry, as well as content providers, are calling for EU member states to back the net neutrality approach outlined in the original European Commission proposal put forward in September 2013.
“In Europe, we need to take a decision on whether we are to be part of the future innovations and put our continent back on the internet map,” Gambardella said. “To this end, the upcoming work of the Council of Ministers on EU’s net neutrality rules will be key in helping Europe regain its digital leadership.”
Content Provider Critics
Not only the Internet service providers support a more flexible net neutrality approach. Even major content providers such as Google Inc., Apple Inc. and others—who have petitioned the FCC not to allow broadband companies to charge more for delivering services at higher speeds—are not happy with the European Parliament position.
“The IT industry would ideally like the U.S. and the EU to take a similar approach on this question,” said a lobbyist representing companies such as Apple, Google, Cisco Systems Inc., Oracle Corp. and others, who spoke to Bloomberg BNA on condition of anonymity. “But the European Parliament position does not help. We are urging member states to re-work this legislative proposal, stripping out much of the technical detail and giving a clearer line regarding the broader political debate around net neutrality.”
According to EU diplomats and industry officials, the vote in the Council of Ministers will reflect the different approach that EU member states are expected to take compared with the one approved in the European Parliament. France and Germany have already voiced their dissatisfaction with the European Parliament stance.
“EU member states do not have an election to worry about,” said EU diplomat, who spoke to Bloomberg BNA on the condition of anonymity, referring to May 22-25 European Parliament elections. “So the approach of member states is much less political and more focused on the economic reality.”
More Nuanced Approach
From the European Commission vantage point, the proposal it put forward in September 2013 is a compromise that meets the concerns of consumer groups and others worried that pay-for-play platforms would leave many in a slow lane while more affluent users benefit from high-speed lanes that content providers have negotiated with Internet service providers. The EU executive body also believes that the FCC should and probably will adopt a flexible approach in line with the European Commission proposal.
“Like the Commission, the FCC in the United States recognizes that you can not take a fundamentalist approach to network management,” European Commission spokesman Ryan Heath told Bloomberg BNA. “However we believe our proposal gives strong protections and necessary commercial flexibility. It does not need to be watered down in light of U.S. developments.”
One of the reasons the European Commission took a more nuanced approach to net neutrality was the recognition that investment by European telecom operators in high-speed networks was falling dangerously behind other parts of the world. Although the European Commission has resisted for years arguments that the incumbent network operators such as Deutsche Telekom AG, Orange S.A. and Telefonica S.A. could exclude new entrants from using their newly constructed high-speed networks, it accepted that new revenue is required to make the needed multibillion-dollar investments.
Challenging Antitrust Limits
The need for new revenue, the European Commission says, is especially true since the same legislation proposed in September 2013, calling for a nuanced, flexible net neutrality policy, would ban the controversial roaming fees European operators have charged to process cross-border telephone calls.
In some ways, the debate in the EU over net neutrality rules ties in with a parallel, highly divisive issue in the EU telecom sector that also has trans-Atlantic implications. European telecom operators say that EU competition rules that prevent the formation of pan-European carriers that can match the likes of AT&T Inc. or Verizon Communications Inc. are a further reason that a relaxed approach to Internet neutrality rules are needed. Currently, the European antitrust authority holds that fewer than three or four telecom operators in each EU member state would put consumers at a big disadvantage.
Although the anti-consolidation position by EU antitrust officials is now being questioned—the current candidates for the next European Commission presidency have signaled a change as has German Chancellor Angela Merkel—European operators say a U.S. approach to net neutrality that gives them the right to earn more revenue would give the U.S. a greater advantage. In theory, this issue could be addressed in the Transatlantic Trade and Investment Partnership (TTIP), EU telecom operators say.
However, the European Commission as well as the telecom operators and content providers say TTIP will probably take years to be completed.
“We need solutions now,” said the EU lobbyist working for the content providers. “This is an urgent issue and this sector can not wait for TTIP—although it is something we support. But in the meantime, it is important that the FCC approach and that of the EU will create a level playing field. Otherwise, EU businesses will be put in a difficult competitive situation with respect to other regions of the world.”
To contact the reporter on this story: Joe Kirwin in Brussels at [email protected]
To contact the editor responsible for this story: Heather Rothman at [email protected]
2016-40/3982/en_head.json.gz/18677 | Newsletter CISB n.1 | December 2011
The first of many meetings
First Annual CISB Meeting defines paths for cooperation via open innovation between Brazil and Sweden and promotes a space for integration
With the purpose of starting a fixed calendar of debates about the cooperation between Brazil and Sweden and strengthen bilateral innovation networks, the Swedish-Brazilian Research and Innovation Center (CISB) held in November its First Annual Meeting. Nearly 130 professionals related to universities, industries and governments from both countries gathered for two days in Sao Paulo to discuss the possibilities for joint work and the role the organization will play in this process.
Created six months ago, CISB had gathered partners in different moments, but for the first time it brought together so many players for a single discussion. Bruno Rondani, Center’s Executive Director, explains that the organization was born to be the bridge connecting common points of interest between different players. “CISB is a neutral element in the mediation of actions for technology exchange and generation of innovation. Its members bring challenges and projects for each industry and country, and from this the Center creates the necessary connections for the solutions, articulating meetings with investors and alternatives for financing,” he states.
Carlos Costa, Specialist in International Projects for Brazilian Agency for Industrial Development (ABDI), believes that the creation of CISB adds value to an ongoing process. According to him, Sweden has been considered a strategic partner by the Brazilian government, but only in 2005 the contacts with the country were intensified. In 2009, ABDI created a specific project for technology and innovation collaboration for the country.
For him, the entrance of CISB to this articulation boosts the results of the approximation. “The creation of CISB within the context in which Brazil gets close to Sweden is highly welcome, since it enhances the institutional level and helps maintain an ongoing relationship. Thus, CISB creates a real possibility of advancing this agenda and connecting important projects for both nations,” he states.
Currently, the organization has six institutions admitted as members, and there are 40 letters of intention signed by future partners. On the first phase, CISB has mainly congregated Sweden members, but the participation remains open to Brazilian interested parties and shall solidify over the coming months.
In addition to establishing specific discussions on how these partnerships will be made, the First Annual CISB Meeting was an opportunity for professionals related to the Center to come into contact with each other and exchange experiences. Even people from same-country organizations met for the first time or started a contact networking that will be positive for establishing partnerships. For Semida Silveira, professor of the Royal Institute of Technology (KTH), the meeting was important to increase the understanding about the role played by CISB and by each partner. “We are starting to execute ideas and, in this process, the role CISB plays becomes clearer. Now, how the projects will be applied must be defined,” she says.
Top »
Agreement based on complementariness
With different profiles, Brazil and Sweden find opportunities for working together
When questioned about the reasons for establishing cooperative works, the CISB partners are unanimous in affirming the complementary nature between Brazil and Sweden. The Swedish experience in leading-edge technology, collaborative and innovative projects can partner in a promising manner with the booming Brazilian market. “We are highly interested in developing an economic relationship applying our expertise in innovation with Brazil’s broad industrial base,” exemplifies Magnus Robach, Sweden’s Ambassador to Brazil.
Both countries have government agencies that work for fostering innovation. In Brazil, this role is played by ABDI. The organization works via public industry development policies along with policies on science, technology, innovation and foreign trade. It is directly related to the application of the Brasil Maior [Greater Brazil] plan – third industrial policy program launched, now focused on innovation – that aims boosting the current growth of the Brazilian economy into a more solid and lasting scenario. “The purpose of Brasil Maior is to maintain the steep growth rate through insertion of innovation generation variables,” defines Carlos Costa, Specialist in International Projects from ABDI. “The major challenge is to maintain connections that are both independent and interdependent– that is, ones must develop autonomously (without governmental support), but maintain a level of cooperation,” he says.
One of the paths for Brazil to become increasingly robust is establishing international partnerships. For this reason, CISB maintains a close relationship with Vinnova, Swedish governmental agency for innovation systems. With vast experience in collaborative projects, Sweden can add great value to the Brazilian experience. According to Ciro Vasquez, International Collaboration Officer for Vinnova, the cooperation culture developed in the country attached to its peculiarities. “We are a small country in extension and population – it is not a big market. For this reason, we need to create relationships and become a reference in innovation to stand out and attract companies from all over the world,” he says.
One of the main examples of this experience is the Lindholmen Science Park, a science and technology park located in the city of Gothenburg (Sweden), where players from the triple helix of government-academia-industry coexist in broad integration. The park’s proposal is to not only promote proximity among its occupants, but also a physical coexistence, since different spaces are created for the joint development of ideas. The proposal of arenas that CISB adopts is inspired in this model, which understands the need for maintaining spaces for meeting and evolution of collaborative projects.
To become a strong economy, the Swedish bet on joint development with multinational companies and, thus, became the gateway for the European market. In spite of the progress, the country is still facing challenges, such as increasing the share of small and medium businesses in the investment in innovation, as well as optimizing the result of investments made in research for the economy. Brazil enters this scenario as a promising market for the Swedish economy, as well as an important commercial and technological partner.
For Marcos Vinícius de Souza, Director of Innovation for the Brazilian Ministry of Development, Industry and Foreign Trade, Brazil plays a double role in the global economy as a producer and consumer, which results in an attractive scenario for developing partnerships. “We are among the top ones in the commodities market, but we still need to strengthen the production chains to reach higher added value,” he states. “For this reason, we bet on Sweden, which has an extremely compatible culture of cooperation. We will attract not just facilities, but also knowledge,” he claims.
Collective thought Divided into groups by theme arenas, professionals discussed key issues related to the main challenges that can be worked through open innovation
One of the guiding points of the First Annual CISB Meeting was the creation of the Security and Transport arenas. They were organized from the identification of major theme areas related to challenges and possibilities of work for both countries. The concept of arena is based on the proposition of common discussion environments for the triple helix members. In March 2012, the Energy arena shall also be launched by CISB.
At the Security Arena, the professionals looked for alternatives for collaboration projects related to security issues. The program has the goal of reinforcing the ability to handle crises and capacity of facing social problems. In addition to presenting the Swedish model of research and development in security, safety alternatives for urban services such as infrastructure, transportation and water supply, information and communication systems, and healthcare-related services were discussed.
The Transport Arena addressed some of the main challenges faced by the Brazilian and related governments such as urban traffic, renewable fuels and long-distance transportation. Jurandir Fernandes, State Secretary of Metropolitan Transportation for the Sao Paulo government, highlights the relevance of this discussion. He believes that mobility is an urgent challenge in Brazil. “The Metropolitan Region of Sao Paulo is the third biggest urban stain in the world, only behind Tokyo and New Delhi. There are complex issues to be solved and, for this purpose, it is necessary to innovate. It is clear that we will not solve a growing problem like this one using more of the same. The public managers need courage to implement the unusual,” he states.
For Per-Arne Eriksson, Director of Product Development and Quality of Scania, a promising trend for bilateral work is the research with biofuels. “Brazil is strong when it comes to alternative fuels, such as ethanol, which we are highly interested in studying as an option for buses and trucks. More than an issue of technology development, the application of this knowledge in the market also depends on factors such as fuel price stability,” he says.
A good example of collaborative project was presented by Niklas Berglin, Senior Project Manager for Innventia. Named Polynol, the initiative includes using the paper industry’s infrastructure to produce polymers and fuels from renewable sources. According to him, the key factor for the success of this type of project is the involvement of all partners throughout the industry’s value chain, for establishing a strong and cohesive group with collaboration from universities in Brazil and Sweden, as well as governmental institutions as financers.
Capability Development Centre
Another discussion group addressed the concept of Capability Development Centre (CDC), a tool that Saab and CISB plan to implement in Brazil. The Centre is a facility with adoption of real-time simulation systems that allow different competencies to group in order to discuss and organize the knowledge available for integrating complex systems, such as defense and safety systems, or for the organization of complex operations, such as the World Cup or the Olympic Games. The model has never been adopted in Brazil and is based on the Swedish and South African experiences.
During the meeting, Piet Verbeek, general retired from the South African National Defense Forces, introduced the CDC as an alternative to develop government solutions in the security segment. The system is used as a tool to demonstrate concepts and scenarios to government clients, establishing bases for integration, tests and assessments, as blocks being built according to the need. The architecture of the South African CDC is comprised of a laboratory environment and planning sessions. With the system, the country created a Project and Validation Center, used as a source of knowledge for developing solutions throughout product lifecycles and present operations (military, for example) to representatives from government organizations and the Army.
With the proximity of the World Cup and the Olympic Games in Brazil, the model is convenient for developing solutions based on multidisciplinary teams. According to Johannes Swart, Campaign Leader for Sisfron & Sisgaaz projects in Brazil, responsible for Saab’s CDC in South Africa, the CDC can be understood as a knowledge library in which people from different areas who meet for a single purpose are more important than the material accumulated. For him, the sooner one starts to work on this model, the more prepared the team is when it needs to work.
Swart highlights that, in South Africa, in addition to being used for demonstrations to FIFA and to other players involved, the CDC helped developing command and control systems for the 2010 World Cup, with the creation of the JOPS – Joint Operating Picture System. With the gradual development of the JOPS, in order to adjust to the countdown for the 2009 Confederations Cup and to the 2010 World Cup, drills were conducted displaying the difference between the usual time for understanding a crime situation and the need to notice this in real time. Swart states that the reaction needs to be immediate in such an event to prevent, for example, an aircraft that represents a potential hazard from entering the restricted airspace of a soccer stadium, when it is too late for an action that avoids collateral damages (taking the airplane down near a stadium may have tragic consequences for the public on the ground).
The meetings also addressed ways to get fundings. In addition to creating discussions on alternatives for joint project development, the Open Innovation Arena & International Collaboration gathered representatives from Brazilian fostering agencies and had four CISB members – Innventia, Saab, Scania and SP – exhibiting their main innovation ideas and projects. In addition to them, the chairs of the Security and Transport arenas presented challenges in those areas, which were discussed by representatives from Fapesp, CNPq, CAPES and the Brazilian Ministry of Foreign Relations. Also attending were representatives from the Inter-American Development Bank, ABDI and the Brazilian Ministry of Development, Industry and Foreign Trade, as well as Magnus Robach, Sweden’s Ambassador to Brazil. In addition to representatives from fostering agencies hearing about the expectations from Swedish companies for technology development in Brazil, the arena gave opportunities for agencies to introduce themselves. They displayed their main fostering programs and discussed their role in the context of international collaboration, raising realistic options for collaboration in the Brazilian context. The highlights were the collaborative programs from Fapesp, addressed by Sergio Queiroz, Innovation Coordinator, and the program Science without Borders, by CAPES and CNPq. For Bruno Rondani, Executive Director of CISB, it became clear that Brazil presents many opportunities for Swedish activities. “As in any country, each of these agencies has peculiarities for creating financing lines or making available resources for research and development. CISB is at a position of helping Brazilian and Swedish partners make feasible operation models that adjust the best to these profiles,” he explains.
Exchange for development
CISB signs agreement with CNPq and Saab for creating scholarship grants in Sweden for Brazilian researchers
An important highlight of the First Annual CISB Meeting was the signature of the cooperation agreement between CISB and CNPq in the scope of the program named Science without Borders, from the Brazilian Government, which aims at promoting the student exchange. Both institutions, along with Saab, also signed an addendum that creates 100 scholarship grants for students and researchers in Sweden. Targeted at doctorate and PhD students, specialists and visiting professors, they will be co-financed by CNPq and SAAB. The peculiarity of this exchange is in the focuses on collaborative research and development between Brazil and Sweden, related to the possibilities for application in the market.
Paulo Beirao, Director of Agricultural, Biological and Health Sciences for CNPq, believes that the document is representative. “This agreement illustrates what we want to achieve, because it displays common interests that led to such a fruitful agreement, which can and must be reproduced,” he says. A considerable gain for researchers, in addition to the academic relevance, is the expansion of network and knowledge of the Swedish context that may benefit the Brazilian industry.
The ceremony also featured the signature of five letters of intention from Swedish universities and members of CISB for future participation on the program. CISB has developed a strategic work in enabling exchange programs for Sweden in agreement with the work from the federal government. In total, the program Science without Borders forecast 75,000 scholarship grants. The goal is to target 1,000 from this total to Sweden throughout five years via articulation from CISB.
OPEN INNOVATION SEMINAR
Swedish presence at the OIS
Swedish professionals stand out at the schedule of the Open Innovation Seminar 2011
The First Annual CISB Meeting integrated the schedule of the Open Innovation Seminar (OIS), the biggest event in Latin America dedicated to innovation networks. In addition to the presence at the CISB Arena, Swedish professionals and organizations stood out with participation in lectures and debate panels at the OIS.
Several Swedish organizations, such as Scania, SP and Innventia, took part on the debates, both as listeners and as lecturers. The model from the Lindholmen Science Park was a highlight through the presentation by its CEO, Niklas Wahlberg, displaying how it is possible to promote open innovation in its different phases via broad integration mode. Wahlberg also integrated the panel “Innovation for regional development”, from which Pontus de Laval, Chief Technology Officer for Saab, and Joakim Appelquist, Director of International Collaboration and Networks for Vinnova, also took part. Vinnova was the topic of a lecture in which its method of operation and experience in international collaboration were introduced to the Brazilian audience. According to Appelquist, international cooperation is a relationship that may not be simple, but is fully possible. For him, the important in this type of partnership is to deeply know the context of the country with which one has a relationship in order to pursue common goals and the best ways for both to gain with the exchange, adapting to different opportunities.
Additionally, Brazilian professor Semida Silveira called attention on the event both for the projects she leads and for the strategic position in the Brazil-Sweden relationship she plays, since she knows the reality of both countries. She is the head of the Division of Energy and Climate Studies of Sweden’s Royal Institute of Technology. Semida participated on the panel “Challenges in building innovative cities” and presented Stockholm’s experience in innovation projects for the urban challenges, using as an example the Hammarby Sjostad district, where water and power consumption are managed integrated to residue handling. The Swedish academia world was also represented by Niklas Berglin, Senior Project Manager of Innventia, who gave the lecture “Multinationals and global innovation in knowledge-intensive industries”. Another highlight was the participation from Saab at the event with the presentation of Gripen, one of the most advanced fighter aircrafts in the world. With a flight simulator assembled in an area near the debate rooms, the company attracted nearly 1,200 people interested in its operation and innovation. According to Pontus de Laval, Gripen is an example of the successful application of the triple helix, and the company’s proposal is for the model’s new generation to be developed in partnership with Brazil via Brazilian Air Force’s F-X2 program, which includes the modernization of its equipment units via technology purchase and transfer. | 科技 |
2016-40/3982/en_head.json.gz/18692 | "If you just keep pushing the Chinese that they've got to make some kind of a commitment for cuts or reductions in emissions intensity, you're not going to get anywhere," he added. China has demanded that developing countries cut their emissions by at least 40 percent from 1990 levels by 2020. It also wants rich countries to donate up to one percent of their annual gross domestic product to help poorer countries tackle climate change. "I think that's an opening gambit in a set of negotiations which in all will ultimately be decided by compromise from all those that are involved. I don't think it defines what's going to happen ultimately," Pachauri said. "China will be certainly be persuaded to accept something lower, I have no doubt about it," he added. Last week, Japan became the latest developed country to publicly commit to specific cuts in carbon emissions. Its vow to reduce emissions by 15 percent on 2005 levels by 2020 was lambasted as lacking ambition, and is a fraction of the cut scientists say is necessary to prevent dangerous climate change. "I think that's not going to be the final word," Pachauri said of the Japanese commitment. "Who knows where the developed world as a whole will end, whether it will be 20 percent or 25 percent or more, but all of this is at least in the realm of possibility given the positive direction that I see coming out of Bonn." Two weeks of talks in Bonn ending last Friday brought together delegates from 182 countries to lay the groundwork for a global climate change deal to replace the Kyoto Protocol, which expires at the end of 2012. It's one of a series of meetings scheduled in the lead up to the U.N. Climate Change Conference in Copenhagen on December 7, seen as the most important climate change talks since the Kyoto deal was adopted in 1997. Pachauri told CNN the progress made at the latest round of talks in Bonn bodes well for a global deal in December. "I think the whole spirit of the discussion it seems to me as being far more productive than one would have anticipated, particularly given the fact that the U.S. is now engaged fully in this part of these discussions in an active way," he said.
2016-40/3982/en_head.json.gz/18705 | California, Florida and Indiana e-waste Recycling Facilities Earn e-Stewards Certification
Tuesday, March 1, 2011 - 10:41amBasel Action NetworkContact: Neil Peters-Michaud, CEO Cascade Asset Mgt
Email: [email protected]
Lauren Roman, e-Stewards Business Director
Email: [email protected], Florida and Indiana e-waste Recycling Facilities Earn e-Stewards Certification
Cascade Asset Management offers coast-to-coast e-Stewards IT asset disposition solutionsMADISON, WI - The Basel Action Network (www.BAN.org), a global toxic trade watchdog organization, announced today that the Florida, Indiana and California technology equipment refurbishing and recycling centers of Cascade Asset Management (Cascade), are now e-Stewards Certified®, and thus adhere to the world’s highest standard for socially and environmentally responsible recycling. With this announcement, California and Florida now have their first e-Stewards Certified recycling option. Cascade’s Wisconsin headquarters became e-Stewards Certified in 2010. Now all of Cascade’s recycling facilities are e-Stewards Certified.
Cascade’s four processing centers provide a coast-to-coast solution to institutions and enterprises looking for secure and responsible IT asset disposition. In the last two years, Cascade has been able to complete over 18,000 on-site pickups at locations across all fifty states. Since 1999, Cascade has handled over 62 million pounds of unwanted electronics for reuse and recycling. “In the absence of leadership or laws requiring responsible electronics recycling here in the US, BAN created a program to help organizations and individuals find responsible electronics recyclers. Certified e-Stewards Recyclers have undergone a rigorous independent audit process to ensure they are not simply dumping this toxic waste on developing countries under the guise of recycling that sadly is the norm in this industry,” said Jim Puckett, BAN’s Executive Director. “As a Certified e-Stewards Recycler, Cascade Asset Management can now assure customers their obsolete electronics will be managed to the highest standards of responsibility.”
The e-Stewards program is recognized by environmental organizations, government entities, and private enterprises as the gold standard for vetting electronics recyclers. The accredited, third-party audited certification program has not only been endorsed by Greenpeace USA, the Sierra Club, the Natural Resources Defense Council (NRDC), the Electronics TakeBack Coalition and 68 other environmental organizations, but has also drawn the support of major corporate “e-Stewards Enterprises” such as Wells Fargo, Samsung, Capitol One, and Bank of America.
“e-Stewards Certification validates Cascade’s ongoing commitment to responsible processing of IT assets,” said Neil Peters-Michaud, CEO and founder of Cascade. “It’s a rigorous standard that evaluates all our operations, systems, and downstream management to demonstrate conformance to the highest standard for responsibility in the industry.” ###
About Cascade
Cascade Asset Management has provided full service IT asset retirement solutions since 1999 with four facilities throughout the United States. The company collects, receives and processes office electronics from enterprises and institutions across North America. For more information about Cascade, please visit www.cascade-assets.com or call (888) 222-8399.
About the Basel Action Network
The Basel Action Network (BAN) was founded in 1997 and named after the Basel Convention, the United Nations treaty that restricts trade in hazardous wastes and was intended to stop the dumping of toxic waste on developing nations. In the last decade, BAN has exposed the toxic trade issue to the world via investigations, reports and documentary films on two of the largest illegal hazardous waste streams traded internationally today: electronic waste and toxic ships destined for ‘recycling’ in developing countries. Today, BAN is not only the leading global source of information and advocacy on toxic trade and international hazardous waste treaties, but it has also developed market-based solutions that rely on the highest standards for globally responsible recycling and rigorous independent certification to those standards. For more information on BAN and the e-Stewards program, visit www.e-stewards.org.
2016-40/3982/en_head.json.gz/18708 | White House killing NASA's moon mission, reports say
Budget plan calls for boosting NASA budget, creating private space shuttle service
Fla. senator hits White House over reported NASA budget plan
NASA probe crashes into moon in hunt for water
NASA finds 'lots of water' from moon crash tests
By Sharon Gaudin
Reports surfacing this week say that the White House plans to put a stop to NASA's plans to return to the moon.The Orlando Sentinel, quoting an unnamed White House source, reported yesterday that President Barack Obama is looking to push the space agency in a new direction.David Steitz, a spokeperson at NASA Headquarters in Washington D.C., said he wouldn't comment on such reports until the White House budget proposal is announced. The plan is expected to be released on Monday.NASA has been looking to not only return astronauts to the moon, but also to build a lunar outpost there by 2020. The NASA plan includes first sending next-generation robots and machines to the moon to create a landing area for spacecraft, and a base where humans can live. NASA scientists have been preparing what the agency calls the Constellation moon landing plan, which was set forth by former President George W. Bush.In June, NASA launched two lunar satellites as the opening act in the long-term mission to send humans back to the moon. The satellites -- the Lunar Reconnaissance Orbiter and the Lunar Crater Observation and Sensing Satellite - were designed to provide them with new information about the moon.In an October NASA mission, the Lunar Crater Observation and Sensing Satellite, known as LCROSS, slammed into the moon in an attempt to kick up what scientists believe is water ice hiding in the bottom of a permanently dark crater. Scientists have been hoping that if a human outpost is created on the moon, people there could have access to water there instead of having to haul it up from Earth. Plans to return to the moon have been in question since the Obama administration last May called for an independent review of NASA's human space flight activities. The Orlando Sentinel also reported yesterday that the White House budget plan appears to boost NASA's budget by some $5.9 billion over the next five years. Some of that money, according to the report, would be to keep the International Space Station running. The rest would go to set up contracts with private companies to act as a sort of shuttle service, taking astronauts back and forth from the space station after NASA's space shuttle fleet is retired.Sharon Gaudin covers the Internet and Web 2.0, emerging technologies, and desktop and laptop chips for Computerworld. Follow Sharon on Twitter at @sgaudin, send e-mail to [email protected] or subscribe to Sharon's RSS feed .
Sharon Gaudin — Senior Reporter
Sharon Gaudin covers the Internet, social media, cloud computing and emerging technologies for Computerworld. | 科技 |
2016-40/3982/en_head.json.gz/18747 | Trustyd Busted: Cloud Backup Appliance Vendor Trustyd Closes Doors byJoseph F. Kovar on November 11, 2013, 5:39 pm EST
Data protection appliance vendor Trustyd has closed its doors and sent its intellectual property to one of its investors, the State of Ohio's Department of Development.
Trustyd, founded in 2012 to acquire and nurture another storage startup, 3X Systems, was unable to turn around its data protection business fast enough to survive, said Robert Gueth, president of the Dublin, Ohio-based company.
"3X Systems was struggling," Gueth said. "Trustyd thought there was an opportunity to acquire the assets, redevelop the company, grow its base of resellers, and turn it around. But it took too long."
Trustyd was formed in 2012 with investment from the State of Ohio Department of Development and two other private investors in order to acquire 3X Systems, Gueth said.
With its acquisition of 3X Systems in June 2012, Trustyd received a series of remote backup appliances that help solution providers set up private cloud storage systems for SMB customers.
The appliances allowed Trustyd's channel partners to deliver on-premise and private cloud-based disaster recovery solutions to businesses that lack the resources to do it on their own. This included immediate, automatic backup of traveling laptops, as well as traditional on-premise servers, desktops, databases and applications.
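As a rough illustration of the kind of agent-side job such appliances rely on—and emphatically not Trustyd's actual software—a laptop agent typically walks a set of protected paths, archives whatever changed since the last run, and ships the archive to the appliance. Every name below (the appliance URL, the protected paths, the state file) is a hypothetical placeholder.

```python
# Generic illustration only -- not Trustyd's software. An agent for this kind
# of appliance archives whatever changed since the last run and ships it to
# the appliance. APPLIANCE_URL, PROTECTED_PATHS and STATE_FILE are hypothetical.
import os
import tarfile
import time
import urllib.request

APPLIANCE_URL = "https://backup-appliance.example.local/ingest"
PROTECTED_PATHS = [os.path.expanduser("~/Documents")]
STATE_FILE = os.path.expanduser("~/.last_backup_ts")

def changed_files(since_ts):
    """Yield files under the protected paths modified after the last run."""
    for root in PROTECTED_PATHS:
        for dirpath, _, files in os.walk(root):
            for name in files:
                path = os.path.join(dirpath, name)
                if os.path.isfile(path) and os.path.getmtime(path) > since_ts:
                    yield path

def run_backup():
    since = float(open(STATE_FILE).read()) if os.path.exists(STATE_FILE) else 0.0
    archive = "/tmp/incremental.tar.gz"
    with tarfile.open(archive, "w:gz") as tar:
        for path in changed_files(since):
            tar.add(path)
    with open(archive, "rb") as fh:
        req = urllib.request.Request(APPLIANCE_URL, data=fh.read(), method="PUT")
        urllib.request.urlopen(req)
    with open(STATE_FILE, "w") as fh:
        fh.write(str(time.time()))

if __name__ == "__main__":
    run_backup()
```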
It was a good product line, said Jim Whitecotton, project manager at HGO Technology, a Wheeling, W.Va.-based solution provider that partnered with 3X Systems and then Trustyd for nearly three years.
HGO had about 50 or 55 clients using the Trustyd technology, Whitecotton said.
"We would sell the appliance directly to our clients, or we had a few we kept in-house for use in selling chunks of backup space to multiple clients," he said. "It worked fairly well. Once installed, the appliances did a good job for us overall."
About one-third to one-half of HGO's Trustyd clients are small businesses that share space on the solution provider's appliances for their backup and recovery, Whitecotton said.
Starting in 2013, Trustyd seemed to be less responsive than in the past, Whitecotton said.
"But their shutting down completely blindsided us," he said. "It took us one-and-a-half weeks to contact anybody. We had two clients committed to buying two more appliances."
NEXT: The Trustyd Appliances Work For Now, But No More Support, Upgrades
The appliances clients have purchased still work fine, HGO's Whitecotton said. "One issue is that clients run out of capacity," he said. "They need a software key from Trustyd to unlock capacity, but now they can't get it. You would think they could at least provide the keys we need to unlock the extra space. The capacity is there, but it's not accessible without the keys."
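Why the missing keys are such a hard stop: capacity unlocks on appliances like this are commonly implemented as vendor-signed tokens that the box verifies offline, so once the vendor is gone no new tokens can ever be minted. The sketch below shows that general pattern with an HMAC signature; it is an assumption for illustration, not Trustyd's documented licensing scheme, and the serial number and secret are made up.

```python
# Assumed pattern, not Trustyd's documented scheme: a capacity unlock issued as
# a vendor-signed token that the appliance verifies offline. Without the
# vendor's secret, no new tokens can ever be minted.
import hashlib
import hmac

VENDOR_SECRET = b"known-only-to-the-vendor"  # hypothetical

def issue_capacity_key(appliance_serial: str, capacity_tb: int) -> str:
    msg = f"{appliance_serial}:{capacity_tb}".encode()
    sig = hmac.new(VENDOR_SECRET, msg, hashlib.sha256).hexdigest()
    return f"{appliance_serial}:{capacity_tb}:{sig}"

def verify_capacity_key(key: str) -> int:
    serial, capacity, sig = key.rsplit(":", 2)
    expected = hmac.new(VENDOR_SECRET, f"{serial}:{capacity}".encode(),
                        hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        raise ValueError("invalid capacity key")
    return int(capacity)

key = issue_capacity_key("TRX-00421", 8)   # serial number is made up
print(verify_capacity_key(key), "TB unlocked")
```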
Going forward, a big issue could come from a case where the database for the backups gets corrupted, Whitecotton said. "Unless we can find someone to repair it, customers could lose data," he said. "So we're looking for an alternative solution. We will do what we can to make sure clients don't get hurt."
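The backup database Whitecotton worries about is not publicly documented, but many comparable products keep an embedded SQLite catalog, and the generic safeguard is the same either way: check catalog integrity on a schedule and copy raw backup data off the box at the first sign of trouble. The catalog path below is a hypothetical placeholder.

```python
# The catalog format is not public; many comparable products use an embedded
# SQLite database, so the generic safeguard looks like this.
import sqlite3

CATALOG_PATH = "/var/backup/catalog.db"  # hypothetical location

def catalog_is_healthy(path: str) -> bool:
    conn = sqlite3.connect(path)
    try:
        (result,) = conn.execute("PRAGMA integrity_check;").fetchone()
        return result == "ok"
    finally:
        conn.close()

if __name__ == "__main__":
    if not catalog_is_healthy(CATALOG_PATH):
        print("Catalog damaged: copy raw backup volumes off the appliance now.")
```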
Migrating data from the Trustyd appliances is not particularly easy, Whitecotton said. "The way their appliance is made, the data has to be restored to a server it thinks is the same server that originally created the data," he said.
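That restore-to-the-original-server constraint is why migrations off such appliances usually go through a staging machine that presents the original server's identity before the data is copied to its real destination. The sketch below outlines that workaround; restore_from_appliance() is a stand-in, since Trustyd's actual restore tooling is not public.

```python
# Hedged outline of the workaround: restore onto a staging machine that
# presents the original server's identity, then copy the data to its real
# destination. restore_from_appliance() is a stand-in for the appliance's own
# restore client, which is not public.
import shutil
import socket

def restore_from_appliance(target_dir: str):
    raise NotImplementedError("stand-in for the appliance's restore client")

def migrate(original_hostname: str, staging_mount: str, destination: str):
    if socket.gethostname() != original_hostname:
        raise RuntimeError("Run this on a staging VM renamed to match the "
                           "original server, or the appliance refuses the restore.")
    restore_from_appliance(staging_mount)
    shutil.copytree(staging_mount, destination, dirs_exist_ok=True)
```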
Trustyd's Gueth acknowledged that customers will be disappointed by the closure of the company.
"Trustyd's operations have ceased," he said. "There are no more operations going on. It's a challenge for customers or resellers or IT consultants. They will have to deal with system support issues, and move to another supplier. How easy that is depends on circumstances. They can spin up a new server and back up the data. But certain policy implementations and backup capabilities are hard to replace."
The Trustyd appliances are self-contained, Gueth said, and as long as there are no service issues, getting data off them is as easy as before. In the meantime, the appliances can continue to be used for backing up data.
Going forward, Gueth said, there will be no more support or upgrades, and some patented features, such as the Web service that let mobile PCs and the backup appliances find each other wherever they were moved, will no longer be available.
For HGO, the closure of Trustyd means finding another way to help customers with their data protection requirements, Whitecotton said.
"We've been around for a while," he said. "We've never had something this extreme before. Some customers have been doing business with us for 25 years. They're not showing up at the door with torches and pitchforks yet. They're concerned. But they know we're going to find them the right solution."
PUBLISHED NOV. 11, 2013 | 科技 |