Friday, November 18, 2016

Here’s What Apple’s Next iPhone Might Do

Apple’s iPhone and iPad could deliver some major improvements next year, analysts say.
The tech giant’s next iPhone, which could be known as the iPhone 8, will ship with bigger screens, Barclays analysts Blayne Curtis and Christopher Hemmelgarn said in a note this week. They say the iPhone 8 will offer a 5-inch screen and the iPhone 8 Plus will deliver a 5.8-inch display, according to Business Insider, which obtained a copy of the note. The iPhone 7 and iPhone 7 Plus offer 4.7- and 5.5-inch screens.
The larger screens would come if Apple adopts a “bezel-less design” for the iPhone 8, according to the report. Rather than stopping the screen before the left and right edges, Apple is expected to let the iPhone 8’s display spill over the sides, creating an end-to-end touch surface. On the left and right sides, the iPhone 8’s screen would curve, similar to the “edges” of the Samsung Galaxy S7 Edge.
In order to deliver that feature, the iPhone 8 will need to ship with an organic light-emitting diode (OLED) screen. It would be the first iPhone from Apple to use OLED; all iPhone versions through the iPhone 7 have used LCD technology.
“By using a plastic OLED screen, the screen can bend at the edges to allow a curved bezel-less edge,” the analysts reportedly told investors.
Their comments come after several rumors have surfaced saying Apple is working on an OLED-based curved screen. However, Apple has yet to even confirm it’s working on a new smartphone, let alone new display technology, and the analysts acknowledged that the company hasn’t yet made a final decision on the next iPhone’s screen. But evidence suggests Apple’s next iPhone will come with a big display update.
As for Apple’s iPad, the analysts said that the company will release new tablets in March. Like the iPhone, the new iPads are expected to come with a “bezel-less” design and offer displays that stretch from one side to the other. The 9.7-inch model would become the “low-price” version and complement the higher-end 12.9-inch iPad Pro. What’s more, Apple could announce a new 10.9-inch iPad featuring the bezel-less design.
Apple currently sells 7.9-, 9.7- and 12.9-inch versions of its iPad. Some had anticipated Apple updating the tablets this year, but the company focused instead on a new iPhone and new MacBook Pro.
There is some debate over when a bezel-less iPad might surface. Although the Barclays analysts say it’s only a matter of months, a KGI Securities analyst said in a research note recently that Apple likely wouldn’t offer OLED and thus, a curved display, in the iPad until 2018.
Apple hasn’t confirmed any of the rumors and did not immediately respond to a request for comment.
This article originally appeared on Fortune.com

Tuesday, November 15, 2016

Geologists discover how a tectonic plate sank

In a paper published in Proceedings of the National Academy of Sciences (PNAS), Saint Louis University researchers report new information about conditions that can cause Earth's tectonic plates to sink.
John Encarnacion, Ph.D., professor of earth and atmospheric sciences at SLU, and Timothy Keenan, a graduate student, are experts in tectonics and hard rock geology, and use geochemistry and geochronology coupled with field observations to study tectonic plate movement.
"A plate, by definition, has a rigidity to it. It is stiff and behaves as a unit. We are on the North American Plate and so we're moving roughly westward together about an inch a year," Encarnacion said. "But when I think about what causes most plates to move, I think about a wet towel in a pool. Most plates are moving because they are sinking into Earth, the way a towel laid down on a pool will start to sink, dragging the rest of the towel down into the water."
Plates move, on average, an inch or two a year. The fastest plate moves about four inches a year, and the slowest is barely moving at all. Plate motions are the main cause of earthquakes, and seismologists and geologists study the details of plate motions to make more accurate predictions of earthquake likelihood.
"Whenever scientists can show how something that is unexpected might have actually happened, it helps to paint a more accurate picture of how Earth behaves," Encarnacion said. "And a more accurate picture of large-scale Earth processes can help us better understand earthquakes and volcanoes, as well as the origin and locations of mineral deposits, many of which are the effects and products of large-scale plate motions."
Plate movement affects our lives in other ways, too: It recently was reported that Australia needs to redraw its maps due to plate motion. Australia is moving relatively quickly northwards, and so over many decades it has traveled several feet, causing GPS locations to be significantly misaligned.
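To get a feel for the scale involved, here is a quick back-of-envelope calculation. The ~7 cm/yr drift rate is an approximate published figure for the Australian plate, assumed here rather than taken from the article:

```python
# Assumed drift rate for the Australian plate (~7 cm/yr, an approximate
# figure not stated in the article above).
rate_cm_per_year = 7.0
years = 50

drift_m = rate_cm_per_year * years / 100.0
drift_ft = drift_m * 3.28084
print(f"{drift_m:.1f} m (about {drift_ft:.0f} ft) over {years} years")
```

At that rate, a few decades is enough to push GPS coordinates several feet off their mapped positions, which is why map datums eventually need updating.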
Subduction, the process by which tectonic plates sink into Earth's mantle, is a fundamental tectonic process on Earth, yet where and how new subduction zones form remains a matter of debate. Subduction is the main reason tectonic plates move.
The SLU geologists' research takes them out into the field to study rocks and sample them before taking them back to the lab to be studied in more detail.
Their work involves geological mapping: looking at rocks, identifying them, plotting them on a map and figuring out how they formed and what has happened to them since. Researchers date rock samples and look at their chemistry to learn about the specific conditions where an ancient rock formed, such as whether a volcanic rock formed on a volcanic island like Hawaii or on the deep ocean floor.
In this study, Keenan and Encarnacion traveled to the Philippines to study plates in that region. They found that a divergent plate boundary, where two plates move apart, was forcefully and rapidly turned into a convergent boundary where one plate eventually began subducting.
This is surprising because although the plate material at a divergent boundary is weak, it is also buoyant and resists subduction. The research findings suggest that buoyant but weak plate material at a divergent boundary can be forced to converge until eventually older and denser plate material enters the nascent subduction zone, which then becomes self-sustaining.
"We think that the subduction zone we studied was actually forced to start because of the collision of India with Asia. India was once separated from Asia, but it slowly drifted northwards eventually colliding with Asia. The collision pushed out large chunks of Asia to the southeast. That push, we think, pushed all the way out into the ocean and triggered the start of a new subduction zone."
Their finding supports a new model for how plates can begin to sink: "Places where plates move apart can be pushed together to start subduction."
The SLU researchers now want to learn if their model applies to other tectonic plates.
"How common was this forced initiation of a subduction zone that we think happened in the Philippines?" Encarnacion said. "I would like to see work on other ancient subduction zones to see whether our model applies to them as well."
Other researchers on the study include Robert Buchwaldt, Dan Fernandez, James Mattinson, Christine Rasoazanamparany, and P. Benjamin Luetkemeyer.
Saint Louis University's Department of Earth and Atmospheric Sciences combines strong classroom and field-based instruction with internationally recognized research across a broad spectrum of the physical sciences, including seismology, hydrology, geochemistry, meteorology, environmental science, and the study of modern and ancient climate change. Students also have the opportunity to work directly with faculty on their research and pursue internships through a growing network of contacts in the public and private sectors.
Research centers include the Earthquake Center, the Cooperative Institute for Precipitation Systems, the Global Geodynamics Program, the Center for Environmental Sciences, and Quantum WeatherTM. The fusion of academic programs with world-class research provides students with an unparalleled opportunity to explore their interests and prepare for a wide variety of careers after graduation.

Story Source:
Materials provided by Saint Louis University Medical Center. Note: Content may be edited for style and length.

Journal Reference:
  1. Timothy E. Keenan, John Encarnación, Robert Buchwaldt, Dan Fernandez, James Mattinson, Christine Rasoazanamparany, P. Benjamin Luetkemeyer. Rapid conversion of an oceanic spreading center to a subduction zone inferred from high-precision geochronology. Proceedings of the National Academy of Sciences, 2016; 201609999 DOI: 10.1073/pnas.1609999113

Tuesday, August 9, 2016

How the Tech Behind Bitcoin Could Revolutionize Wall Street

From its mysterious origin story to its ties to black market dealings, Bitcoin has been closely watched since its emergence in 2009. But the digital currency hasn’t captured attention for controversial reasons alone. Many believe the underlying technology that powers bitcoin transactions, a system known as blockchain, has the potential to upend how Wall Street does business.
At its most basic level, a blockchain is a new means of structuring and distributing data. It allows financial companies and other institutions to create a digital ledger, guarded by cryptography, that can be shared among participants in a transaction. Authorized participants can then alter the ledger without awaiting approval from a central authority, often resulting in faster and more secure transactions that save financial institutions time and money.
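The core idea behind a cryptographically guarded ledger can be illustrated with a minimal sketch. This is a toy example, not Chain’s or bitcoin’s actual implementation; the block structure and field names are invented for illustration:

```python
import hashlib
import json

def block_hash(block):
    """Deterministic SHA-256 hash of a block's contents."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain, transactions):
    """Append a block that commits to the previous block's hash."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "transactions": transactions})
    return chain

def verify(chain):
    """Check that every block references the hash of its predecessor."""
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

ledger = []
append_block(ledger, [{"from": "alice", "to": "bob", "amount": 10}])
append_block(ledger, [{"from": "bob", "to": "carol", "amount": 4}])
print(verify(ledger))   # True: the ledger is internally consistent

ledger[0]["transactions"][0]["amount"] = 1000  # tamper with history
print(verify(ledger))   # False: the change breaks the later hash link
```

Because each block commits to the hash of the one before it, any participant holding a copy of the chain can detect tampering without asking a central authority, which is the property the article describes.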
Bitcoin itself is in the throes of a tumultuous year, as the community is divided by deep philosophical differences. But some observers say blockchain will thrive regardless of bitcoin’s fate. More than 100 executives from major Wall Street firms like Citigroup, Visa, and Fidelity recently gathered at Nasdaq’s New York offices to experiment with blockchain. The event was hosted by Chain, a startup that specializes in developing blockchain systems for assets like corporate securities and loyalty points.
TIME recently spoke with Chain CEO and co-founder Adam Ludwin to learn more about blockchain and the potential it holds for Wall Street and beyond. Below is a transcript of our conversation. It has been edited for length and clarity.
TIME: The blockchain system was introduced and popularized through bitcoin. How will Wall Street use blockchain technology differently than the bitcoin system does?
Ludwin: This whole idea for a blockchain was invented to solve this double spending problem that had been challenging computer scientists for a long time. And in the bitcoin whitepaper, all of these concepts were put together in a very elegant way that launched a new Internet-based network that was open and decentralized.
The problem is that to use bitcoin, we have to adopt a new currency. I can send bitcoin to Vietnam, and that’s wonderful and very powerful, but it means I have to get bitcoin and my receiver needs to be willing to accept it.
And then beyond that, the senders and receivers have to be willing to essentially adopt the governance, scalability, and transparency features of bitcoin. And by the way, I don’t think bitcoin is a failure. I think bitcoin, despite all the controversy, is alive and well today. The reason that we’re doing something else is that we’re not interested in moving bitcoin. Bitcoin has already solved that. We’re interested in building new networks that move dollars, stocks, bonds, loyalty points, gift cards, you name it, in the same manner with the same speed and directness that bitcoin moves.
There has been a lot of buzz about how blockchain technology could make financial transactions faster, cheaper, and more secure. Can you go into more detail about how exactly it will do that, and where we’ll see the most impact?
[One example] is in the realm of international payments. If you look at how an international wire transfer occurs today, what we have is a messaging system called SWIFT, which is essentially a fancy financial email between banks. So messages get sent, and those messages trigger the movement of money from bank to bank over several hops, depending on where the receiver is and where the sender is. And that whole process can take several days and can eat up several percentage points of the payment amount.
If we can put currency into a native digital format, instead of sending an electronic message, we can send the assets themselves electronically. The difference is what these cryptographic databases known as blockchains bring to the world. If we can do that, if we can send the assets themselves as opposed to just sending messages, then I can pay a supplier in Vietnam as fast as I can send an email to Vietnam.
The second example is in the capital markets. In the United States, on one hand we have high frequency trading, but once those trades are filled, we have extremely low frequency clearing and settlement systems. In other words, we can match an order in nanoseconds, but it still takes three days for the securities and cash to eventually swap between the counterparties.
Just like in the international payments arena, we have a messaging system for sending trade information between institutions, and then we have a separate system for how we hold on to and move the assets themselves. And the difference between those messages and the actual asset movement are all of the steps, like authorization, clearing, settlement, error handling and so on, which are essentially the agreement of the middle and back office of Wall Street.
When people say blockchain technology will change clearing and settlement, what that really means is that blockchain technology will make clearing and settlement redundant. It’s as if I gave you a 10 dollar bill, and then asked you how do we clear and settle that payment. You would look at me funny, because it doesn’t make sense.
There’s also been some talk about how blockchain systems could be applied to other industries. A note from Goldman Sachs, for example, outlined how blockchain databases could be used to manage the identities of Airbnb hosts. What are your thoughts on that?
We view a blockchain functionally as a next generation database for financial assets. We think for most non-financial use cases, traditional database technology is suitable. And the reason for that comes down to the fundamental purpose of a blockchain architecture: to stretch your database over an entire market, so that when you move an asset from point A to point B, it does not leave a copy at point A. In other words, we can make a recorded electronic asset function like cash, where I just hand you cash and we’re done.
But take, for example, a use case that gets thrown around a lot, which is health records. If I send my health records to my doctor, I also want a copy of my health records. Or if my general practitioner sends a copy of my health records to a specialist, do I want them to disappear from the general practitioner’s office? No, I don’t.
Wall Street is starting to experiment with blockchain technology. What are the biggest challenges these firms are facing so far with blockchain adoption, and do you think our financial system will ever run entirely on digital assets alone?
We’re definitely moving toward this model. There’s no doubt in my mind, based on the projects we’re working on and what we’re seeing. The main challenge right now is that institutions by and large are still asking the wrong questions with respect to blockchain technology. They’re asking, how can we use blockchain to streamline our middle and back office? How can we use blockchain technology to make our operation more efficient?
The question you need to ask as an executive is, what role do we want to play in that new network model? This is not just about change within one organization, but is about change across the market, and who’s going to be leading that charge.
Think about the music industry. If I say music industry, what pops into your head is probably companies like Apple and Spotify, and probably not record labels or live music. And even though at every stage it’s just music, as the medium has changed, the fundamental nature of that music distribution has changed. The winners and losers who bring that music to the masses have also changed. It will take time, but it will be faster than people realize.
Source: http://time.com/

Here’s What to Expect From the iPhone 7

A new dual camera system, a redesigned home button, and no headphone jack

Apple’s next iPhones will have three big physical changes from the current iPhone 6s and iPhone 6s Plus, according to a new report from Bloomberg’s Mark Gurman.
Those changes include a dual camera for taking sharper photos, a redesigned home button, and a slightly tweaked design that ditches the headphone jack as well as the antenna lines located on the back of the phones. Gurman, a former blogger with 9to5Mac who has a long history of accurately reporting details about unreleased Apple products, attributes the information to unnamed sources, including one person who claims to have used a prototype of the new phone.
The new dual camera system will reportedly produce brighter photos with more detail in addition to better preserving image quality while zooming. Both sensors snap a photo at the same time and then merge their results to produce one photo. This dual system is also said to make it easier to take photos in low light environments.
Some observers speculated that Apple would release an iPhone with a dual camera last year when the company acquired LinX, an Israeli startup that specializes in making sensors that capture multiple images at once.
The new home button, meanwhile, will be pressure sensitive and will produce haptic feedback rather than a clicking sensation, similar to the mousepad on Apple’s newer MacBook laptops, according to the report.
The two new phones, which Gurman says could debut as early as next month, will look very similar to the iPhone 6s and its larger sibling, save for two physical differences. Apple will reportedly remove the headphone jack from the newer models, as previous reports from The Wall Street Journal and others have indicated, allowing the company to build a second speaker into the new phones. Apple is also expected to get rid of the antenna lines that run across the back of the phones.

Apple typically revamps the design of its iPhones every two years. But its upcoming release will signal a shift from this schedule. Next year’s model, which will mark the iPhone’s 10-year anniversary, is expected to be a more significant change, according to Bloomberg.
This year’s iPhone launch will be especially important for Apple as investors look to see how the company will continue to innovate on its biggest moneymaker. Revenue from iPhone sales dropped by 23% in the company’s most recent quarter. The Cupertino, Calif.-based firm hasn’t spoken publicly about its upcoming products, but Apple CEO Tim Cook previously teased future iPhone devices when speaking with CNBC’s Jim Cramer.
“We’ve got great innovation in the pipeline from new iPhones that will incent you and other people that have iPhones today to upgrade to new iPhones,” said Cook. “We’re going to give you things that you can’t live without that you just don’t know that you need today.”
Source: http://time.com/

Monday, August 8, 2016

Scientists convert carbon dioxide, create electricity

While the human race will always leave its carbon footprint on the Earth, it must continue to find ways to lessen the impact of its fossil fuel consumption.
"Carbon capture" technology -- chemically trapping carbon dioxide before it is released into the atmosphere -- is one approach. In a recent study, Cornell University researchers disclose a novel method for capturing the greenhouse gas and converting it to a useful product -- while producing electrical energy.
Lynden Archer, the James A. Friend Family Distinguished Professor of Engineering, and doctoral student Wajdi Al Sadat have developed an oxygen-assisted aluminum/carbon dioxide power cell that uses electrochemical reactions to both sequester the carbon dioxide and produce electricity.
Their paper, "The O2-assisted Al/CO2 electrochemical cell: A system for CO2 capture/conversion and electric power generation," was published July 20 in Science Advances.
The group's proposed cell would use aluminum as the anode and mixed streams of carbon dioxide and oxygen as the active ingredients of the cathode. The electrochemical reactions between the anode and the cathode would sequester the carbon dioxide into carbon-rich compounds while also producing electricity and a valuable oxalate as a byproduct.
In most current carbon-capture models, the carbon is captured in fluids or solids, which are then heated or depressurized to release the carbon dioxide. The concentrated gas must then be compressed and transported to industries able to reuse it, or sequestered underground. The findings in the study represent a possible paradigm shift, Archer said.
"The fact that we've designed a carbon capture technology that also generates electricity is, in and of itself, important," he said. "One of the roadblocks to adopting current carbon dioxide capture technology in electric power plants is that the regeneration of the fluids used for capturing carbon dioxide utilize as much as 25 percent of the energy output of the plant. This seriously limits commercial viability of such technology. Additionally, the captured carbon dioxide must be transported to sites where it can be sequestered or reused, which requires new infrastructure."
The group reported that their electrochemical cell generated 13 ampere hours per gram of porous carbon (as the cathode) at a discharge potential of around 1.4 volts. The energy produced by the cell is comparable to that produced by the highest energy-density battery systems.
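As a rough sanity check of those figures, multiplying the reported capacity by the discharge potential gives the energy delivered per gram of cathode carbon. This is a back-of-envelope estimate normalized to the porous-carbon cathode alone, not a whole-cell energy density, which would be considerably lower:

```python
# Figures as reported in the article, per gram of porous-carbon cathode:
capacity_ah_per_g = 13.0  # discharge capacity, ampere-hours per gram
voltage_v = 1.4           # discharge potential, volts

# Energy (Wh) = capacity (Ah) x voltage (V)
energy_wh_per_g = capacity_ah_per_g * voltage_v
print(f"{energy_wh_per_g:.1f} Wh per gram of cathode carbon")  # 18.2
```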
Another key aspect of their findings, Archer says, is in the generation of superoxide intermediates, which are formed when oxygen is reduced at the cathode. The superoxide reacts with the normally inert carbon dioxide, forming a carbon-carbon oxalate that is widely used in many industries, including pharmaceutical, fiber and metal smelting.
"A process able to convert carbon dioxide into a more reactive molecule such as an oxalate that contains two carbons opens up a cascade of reaction processes that can be used to synthesize a variety of products," Archer said, noting that the configuration of the electrochemical cell will be dependent on the product one chooses to make from the oxalate.
Al Sadat, who worked on onboard carbon capture for vehicles at Saudi Aramco, said this technology is not limited to power-plant applications. "It fits really well with onboard capture in vehicles," he said, "especially if you think of an internal combustion engine and an auxiliary system that relies on electrical power."
He said aluminum is the perfect anode for this cell, as it is plentiful, safer than other high-energy density metals and lower in cost than other potential materials (lithium, sodium) while having comparable energy density to lithium. He added that many aluminum plants are already incorporating some sort of power-generation facility into their operations, so this technology could assist in both power generation and reducing carbon emissions.
A current drawback of this technology is that the electrolyte -- the liquid connecting the anode to the cathode -- is extremely sensitive to water. Ongoing work is addressing the performance of electrochemical systems and the use of electrolytes that are less water-sensitive.

Story Source:
The above post is reprinted from materials provided by Cornell University. The original item was written by Melissa Osgood. Note: Content may be edited for style and length.

Journal Reference:
  1. W. I. Al Sadat, L. A. Archer. The O2-assisted Al/CO2 electrochemical cell: A system for CO2 capture/conversion and electric power generation. Science Advances, 2016; 2 (7): e1600968 DOI: 10.1126/sciadv.1600968

Sunday, August 7, 2016

Do black holes have a back door?

One of the biggest problems when studying black holes is that the laws of physics as we know them cease to apply in their deepest regions. Large quantities of matter and energy concentrate in an infinitely small space, the gravitational singularity, where space-time curves towards infinity and all matter is destroyed. Or is it? A recent study by researchers at the Institute of Corpuscular Physics (IFIC, CSIC-UV) in Valencia suggests that matter might in fact survive its foray into these space objects and come out the other side.
Published in the journal Classical and Quantum Gravity, the Valencian physicists propose considering the singularity as if it were an imperfection in the geometric structure of space-time. And by doing so they resolve the problem of the infinite, space-deforming gravitational pull.
"Black holes are a theoretical laboratory for trying out new ideas about gravity," says Gonzalo Olmo, a Ramón y Cajal grant researcher at the Universitat de València (University of Valencia, UV). Alongside Diego Rubiera, from the University of Lisbon, and Antonio Sánchez, PhD student also at the UV, Olmo's research sees him analysing black holes using theories besides general relativity (GR).
Specifically, in this work he has applied geometric structures similar to those of a crystal or graphene layer, not typically used to describe black holes, since these geometries better match what happens inside a black hole: "Just as crystals have imperfections in their microscopic structure, the central region of a black hole can be interpreted as an anomaly in space-time, which requires new geometric elements in order to be able to describe them more precisely. We explored all possible options, taking inspiration from facts observed in nature."
Using these new geometries, the researchers obtained a description of black holes whereby the centre point becomes a very small spherical surface. This surface is interpreted as the existence of a wormhole within the black hole. "Our theory naturally resolves several problems in the interpretation of electrically-charged black holes," Olmo explains. "In the first instance we resolve the problem of the singularity, since there is a door at the centre of the black hole, the wormhole, through which space and time can continue."
This study is based on one of the simplest known types of black hole: rotationless and electrically charged. The wormhole predicted by the equations is smaller than an atomic nucleus, but gets bigger as the charge stored in the black hole increases. So, a hypothetical traveller entering a black hole of this kind would be stretched to the extreme, or "spaghettified," and would be able to enter the wormhole. Upon exiting, they would be compacted back to their normal size.
Seen from outside, these forces of stretching and compaction would seem infinite, but the traveller himself, living it first-hand, would experience only extremely intense, and not infinite, forces. It is unlikely that the star of Interstellar would survive a journey like this, but the model proposed by IFIC researchers posits that matter would not be lost inside the singularity, but rather would be expelled out the other side through the wormhole at its centre to another region of the universe.
Another problem that this interpretation resolves, according to Olmo, is the need to use exotic energy sources to generate wormholes. In Einstein's theory of gravity, these "doors" only appear in the presence of matter with unusual properties (a negative energy pressure or density), something that has never been observed. "In our theory, the wormhole appears out of ordinary matter and energy, such as an electric field," Olmo says.
The interest in wormholes for theoretical physics goes beyond generating tunnels or doors in spacetime to connect two points in the Universe. They would also help explain phenomena such as quantum entanglement or the nature of elementary particles. Thanks to this new interpretation, the existence of these objects could be closer to science than fiction.

Story Source:
The above post is reprinted from materials provided by Asociación RUVID. Note: Content may be edited for style and length.

Friday, July 29, 2016

Human eye spots single photons

Human eyes are capable of detecting a single photon — the tiniest possible speck of light — new research suggests.
The result, published July 19 in Nature Communications, may settle the debate on the ultimate limit of the sensitivity of the human visual system, a puzzle scientists have pondered for decades. Scientists are now anticipating possibilities for using the human eye to test quantum mechanics with single photons.
Researchers also found that the human eye is more sensitive to single photons shortly after it has seen another photon. This was “an unexpected phenomenon that we just discovered when we analyzed the data,” says physicist Alipasha Vaziri of Rockefeller University in New York City.
Previous experiments have indicated that humans can see blips of light made up of just a few photons. But there hasn’t been a surefire test of single photons, which are challenging to produce reliably. Vaziri and colleagues used a quantum optics technique called spontaneous parametric down-conversion. In this process, a high-energy photon converts into two low-energy photons inside of a crystal. One of the resulting photons is sent to someone’s eye, and one to a detector, which confirms that the photons were produced.
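The down-conversion step conserves energy: the pump photon’s energy equals the sum of the two daughter photons’ energies, which in wavelength terms means 1/λ_pump = 1/λ_signal + 1/λ_idler. A quick check with hypothetical wavelengths (the article does not give the experiment’s actual values):

```python
# Energy conservation in spontaneous parametric down-conversion:
# E_pump = E_signal + E_idler, i.e. 1/lam_p = 1/lam_s + 1/lam_i.
# Wavelengths below are illustrative, not from the experiment.
pump_nm = 405.0    # a common pump wavelength in SPDC demonstrations
signal_nm = 810.0  # assume the signal photon carries half the energy

idler_nm = 1.0 / (1.0 / pump_nm - 1.0 / signal_nm)
print(f"idler wavelength: {idler_nm:.0f} nm")  # 810 nm (degenerate case)
```

Detecting the idler photon therefore heralds its partner: when the detector fires, the experimenters know exactly one photon was sent toward the subject’s eye.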
During the experiment, subjects watched for the dim flash of a photon, which arrived at one of two times, with both times indicated by a beep. Subjects then chose which beep they thought was associated with a photon, and how confident they were in their decision.
In 2,420 trials, participants fared just slightly better than chance overall. That seemingly unimpressive success rate is expected. Because most photons don’t make it all the way through the eye to the retina where they can be seen, in most trials, the subject wouldn’t be able to see a photon associated with either beep. But in trials where the participants indicated they were most certain of their choice, they were correct 60 percent of the time. Such a success rate would be unlikely if humans were unable to see photons — the chance of such a fluke is 0.1 percent.
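A fluke probability like that 0.1 percent figure comes from the binomial tail: the chance of guessing at least that well by pure luck. Here is a sketch of the calculation, using a hypothetical count of 240 high-confidence trials, since the article does not report the actual number:

```python
from math import comb

def binom_tail(n, k, p=0.5):
    """P(X >= k) for X ~ Binomial(n, p): chance of k or more successes."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Hypothetical: 240 high-confidence trials, 60 percent (144) answered
# correctly, against a 50 percent pure-guessing baseline.
p_value = binom_tail(240, 144)
print(f"chance of a fluke: {p_value:.4f}")  # on the order of 0.001
```

The smaller this tail probability, the harder it is to attribute the subjects’ above-chance performance to guessing alone.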
“It’s not surprising that the correctness of the result might rely on the confidence,” says physicist Paul Kwiat of the University of Illinois at Urbana-Champaign, who was not involved with the research. The high-confidence trials may represent photons that made it through to the retina, Kwiat suggests.
Additionally, the data indicate that single photons may be able to prime the brain to detect more dim flashes that follow. When participants had seen another photon in the preceding 10 seconds, they had better luck picking out the photon.
Scientists hope to use the technique to test whether humans can directly observe quantum weirdness. Photons can be in two places at once, a state known as a quantum superposition. The technique could be adapted to send such quantum states to a subject’s eye. But, says Leonid Krivitsky, a physicist at the Agency for Science, Technology and Research in Singapore, “I’m pretty skeptical about this idea of observing quantumness in the brain.” The signals, he suggests, will have lost their quantum properties by the time they reach the brain.
Whether humans can see individual photons may seem to be a purely academic question. But, Vaziri says, “If you are somewhere outside of a city in nature and on a moonless night and you have only stars to navigate, on average the number of photons that get into your eye is approaching the single photon regime.” So, he says, having eyes sensitive enough to see single photons may have some evolutionary advantage.

Source: https://www.sciencenews.org/

How dinosaurs hopped across an ocean

Two land bridges may have allowed dinosaurs to saunter between Europe and North America around 150 million years ago.
The bridges would explain how dinosaurs, mammals and other animals were able to hop from one continent to the other after the Atlantic Ocean formed during the breakup of the Pangaea supercontinent. Some species of Stegosaurus, for instance, appear in the fossil record on both sides of the Atlantic.
Leonidas Brikiatis, an independent biogeographer in Palaio Faliro, Greece, proposes that two strips of land bridged North America and Europe during the late Jurassic and early Cretaceous periods. One bridge spanned from eastern Canada to the Iberian Peninsula, where Spain is today, and lasted from around 154 million to 151 million years ago. The other linked North America and Scandinavia from around 131 million to 129 million years ago, Brikiatis reports in the August Earth-Science Reviews.
The routes allowed dinosaurs to “foil plate tectonics’ plan to break up the world,” says Paul Sereno, a vertebrate paleontologist at the University of Chicago who was not involved in the study, which reviewed recent studies of vertebrates in the fossil record appearing on opposite sides of the Atlantic. “A continent can’t contain a dinosaur; they’ll escape. This work highlights two of the routes they took.”
A LINK BETWEEN WORLDS A land route may have connected eastern Canada and the Iberian Peninsula around 150 million years ago. That bridge would have allowed animals to hop between Europe and North America.
L. BRIKIATIS/EARTH-SCIENCE REVIEWS 2016

Dinosaurs, including species of Supersaurus and Allosaurus, probably made the transatlantic trek alongside turtles, lizards and early mammals. While the Atlantic Ocean was narrower back then, it was probably too wide to swim across. Brikiatis used the dates of the relocations to establish a potential window of time when the bridges existed and considered potential crossings that might have existed at the time. The best contenders are patches of relatively shallow water called ocean shelves. Tectonic activity could have lifted these shelves above sea level, creating narrow strips of land around 80 to 160 kilometers across, Brikiatis says. Over time, the bridges may have sunk back below the sea.
Those land routes would have been somewhat similar to other ocean crossings, such as the Bering land bridge humans traversed around 23,000 years ago between Asia and North America (SN: 8/22/15, p. 6) and the modern Isthmus of Panama that links North and South America (SN: 5/2/15, p. 10).
The ancient bridge connecting North America and Scandinavia may have coexisted with another land route that connected Europe and what would later become Russia, allowing migrations across much of the world, Brikiatis proposes.
While the routes proposed in the work are plausible, the dates might be off, says Octávio Mateus, a paleontologist at the Universidade Nova de Lisboa in Caparica, Portugal. Species may have migrated earlier than evidenced in the fossil record, he says. “Just because you find them then doesn’t mean they came then. They could have come millions of years before, but just didn’t leave fossils.”
The bridges may also have been more like stepping stones than an unbroken migration highway, says vertebrate paleontologist Anne Schulp of the Naturalis Biodiversity Center in Leiden, the Netherlands. “A narrow body of water is not impenetrable,” he says. “You don’t need a full bridge.”

Source: https://www.sciencenews.org/

Monday, July 25, 2016

Scientists unlock 'green' energy from garden grass

Garden grass could become a source of cheap and clean renewable energy, scientists have claimed.
A team of UK researchers, including experts from Cardiff University's Cardiff Catalysis Institute, has shown that significant amounts of hydrogen can be unlocked from fescue grass with the help of sunlight and a cheap catalyst.
It is the first time this method has been demonstrated, and it could lead to a sustainable way of producing hydrogen, which has enormous potential in the renewable energy industry thanks to its high energy content and the fact that it releases no toxic or greenhouse gases when burnt.
Co-author of the study Professor Michael Bowker, from the Cardiff Catalysis Institute, said: "This really is a green source of energy.
"Hydrogen is seen as an important future energy carrier as the world moves from fossil fuels to renewable feedstocks, and our research has shown that even garden grass could be a good way of getting hold of it."
The team, which also includes researchers from Queen's University Belfast, has published its findings in the Royal Society journal Proceedings A.
Hydrogen is present in enormous quantities all over the world, locked up in water, hydrocarbons and other organic matter.
Up until now, the challenge for researchers has been devising ways of unlocking hydrogen from these sources in a cheap, efficient and sustainable way.
A promising source of hydrogen is the organic compound cellulose, which is a key component of plants and the most abundant biopolymer on Earth.
In their study, the team investigated the possibility of converting cellulose into hydrogen using sunlight and a simple catalyst -- a substance which speeds up a chemical reaction without getting used up.
This process, called photoreforming or photocatalysis, involves sunlight activating the catalyst, which then gets to work converting cellulose and water into hydrogen.
The researchers studied the effectiveness of three metal-based catalysts: palladium, gold and nickel.
Nickel was of particular interest from a practical point of view, as it is far more abundant in Earth's crust than the precious metals and is therefore more economical.
In the first round of experiments, the researchers combined the three catalysts with cellulose in a round-bottom flask and subjected the mixture to light from a desk lamp. At 30-minute intervals, they collected gas samples from the mixture and analysed them to see how much hydrogen was being produced.
To test the practical applications of this reaction, the researchers repeated the experiment with fescue grass, which was obtained from a domestic garden.
Professor Michael Bowker continued: "Up until recently, the production of hydrogen from cellulose by means of photocatalysis has not been extensively studied.
"Our results show that significant amounts of hydrogen can be produced using this method with the help of a bit of sunlight and a cheap catalyst.
"Furthermore, we've demonstrated the effectiveness of the process using real grass taken from a garden. To the best of our knowledge, this is the first time that this kind of raw biomass has been used to produce hydrogen in this way. This is significant as it avoids the need to separate and purify cellulose from a sample, which can be both arduous and costly."

Story Source:
The above post is reprinted from materials provided by Cardiff University. Note: Materials may be edited for content and length.

Saturday, July 16, 2016

New light harvesting potentials uncovered

Researchers have for the first time found a quantum-confined bandgap narrowing mechanism by which the UV absorption of graphene quantum dots and TiO2 nanoparticles can easily be extended into the visible light range.
Such a mechanism may allow the design of a new class of composite materials for light harvesting and optoelectronics.
Dr Qin Li, Associate Professor in Environmental Engineering and the Queensland Micro- and Nanotechnology Centre, says a real-life application of this would be high-efficiency paintable solar cells and water purification using sunlight.
"Wherever there is abundant sun we can brush on this nanomaterial to harvest solar energy to create clean water," she says.
"This mechanism can be extremely significant for light harvesting. What's more important is we've come up with an easy way to achieve that, to make a UV absorbing material to become a visible light absorber by narrowing the bandgap."
Visible light makes up 43 per cent of solar energy, compared with only 5 per cent for UV light.
Major efforts have been made to improve titania's absorption of visible light or develop visible-light sensitive materials in general.
Methods used for titania, including metal ion doping, carbon doping, nitrogen doping and hydrogenation usually require stringent conditions to obtain the modified TiO2 such as elevated temperature or high pressure.
In their innovative paper published in Chemical Communications, a Royal Society of Chemistry journal, the researchers observed that when TiO2 particles are mixed with graphene quantum dots, the resulting composite absorbs visible light by a quantum-confined bandgap narrowing mechanism.
"We were really excited to discover this: when two UV absorbing materials, namely TiO2 and graphene quantum dots, were mixed together, they started to absorb in the visible range. More significantly, the bandgap can be tuned by the size of the graphene quantum dots," says Dr Li.
"We named the phenomenon 'quantum-confined bandgap narrowing' and this mechanism may be applicable to all semiconductors, when they are linked with graphene quantum dots. Flexible tuning of bandgap is extremely desirable in semiconductor-based devices."
This work was selected to feature on the front inside cover of Chemical Communications. The team's work on the fluorescence mechanism of graphene quantum dots has also recently been featured in Nanoscale.

Story Source:
The above post is reprinted from materials provided by Griffith University. Note: Materials may be edited for content and length.

Journal Reference:
  1. Shujun Wang, Ivan S. Cole, Qin Li. Quantum-confined bandgap narrowing of TiO2 nanoparticles by graphene quantum dots for visible-light-driven applications. Chem. Commun., 2016; 52 (59): 9208. DOI: 10.1039/C6CC03302D

Happy cows make more nutritious milk

Daily infusions of a chemical commonly associated with feelings of happiness were shown to increase calcium levels in the blood of Holstein cows and in the milk of Jersey cows that had just given birth. The results, published in the Journal of Endocrinology, could lead to a better understanding of how to improve the health of dairy cows, and keep the milk flowing.
Demand is high for milk rich in calcium: there is more calcium in the human body than any other mineral, and in the West dairy products such as milk, cheese and yoghurt are primary sources of calcium. But this demand can take its toll on milk-producing cows: roughly 5-10% of the North American dairy cow population suffers from hypocalcaemia -- in which calcium levels are low. The risk of this disease is particularly high immediately before and after cows give birth.
Hypocalcaemia is considered a major health event in the life of a cow. It is associated with immunological and digestive problems, decreased pregnancy rates and longer intervals between pregnancies. These all pose a problem for dairy farmers, whose profitability depends upon regular pregnancies and a high-yield of calcium-rich milk.
Whilst there has been research into the treatment of hypocalcaemia, little has focused on prevention. In rodents, serotonin (a naturally occurring chemical commonly associated with feelings of happiness) has been shown to play a role in maintaining calcium levels; on this basis, a team from the University of Wisconsin-Madison, led by Dr Laura Hernandez, investigated the potential for serotonin to increase calcium levels in both the milk and blood of dairy cows. The team infused a chemical that converts to serotonin into 24 dairy cows in the run-up to giving birth. Half the cows were Jersey and half were Holstein, two of the most common breeds. Calcium levels in both the milk and circulating blood were measured throughout the experiment.
Whilst serotonin improved the overall calcium status in both breeds, it did so in opposite ways. Treated Holstein cows had higher levels of calcium in their blood but lower calcium in their milk (compared with controls). The reverse was true in treated Jersey cows, and the higher milk calcium levels were particularly obvious in Jerseys at day 30 of lactation, suggesting a role for serotonin in maintaining levels throughout lactation.
"By studying two breeds we were able to see that regulation of calcium levels is different between the two," says Laura Hernandez. "Serotonin raised blood calcium in the Holsteins, and milk calcium in the Jerseys. We should also note that serotonin treatment had no effect on milk yield, feed intake or on levels of hormones required for lactation."
The next steps are to investigate the molecular mechanism by which serotonin regulates calcium levels, and how this varies between breeds.
"We would also like to work on the possibility of using serotonin as a preventative measure for hypocalcaemia in dairy cows," continues Laura Hernandez. "That would allow dairy farmers to maintain the profitability of their businesses, whilst making sure their cows stay healthy and produce nutritious milk."

Story Source:
The above post is reprinted from materials provided by the European Society of Endocrinology. Note: Materials may be edited for content and length.

Journal Reference:
  1. Samantha R Weaver, Austin P Prichard, Elizabeth L Endres, Stefanie A Newhouse, Tonia L Peters, Peter M Crump, Matthew S Akins, Thomas D Crenshaw, Rupert M Bruckmaier, Laura L Hernandez. Elevation of circulating serotonin improves calcium dynamics in the peripartum dairy cow. Journal of Endocrinology, 2016; 230 (1): 105. DOI: 10.1530/JOE-16-0038