
Tuesday, 16 December 2014

Imagination Technologies


Imagination takes open and accessible micro-computing to the next level

Introduces MIPS Creator CI20 high-performance, low power development board with advanced graphics, video, audio and a range of connectivity options for IoT applications

London, UK – 4th December, 2014 – Imagination Technologies (IMG.L) announces that the high-performance, low power MIPS Creator CI20 development board is now available for pre-order.

The MIPS Creator CI20 is an affordable micro-computer that runs Linux and Android and enables open source developers, the maker community, system integrators and others to create a wide range of applications and projects, ranging from home automation and gaming to wireless multimedia streaming.

Originally introduced in August 2014 through a limited promotional giveaway for university students, developers, hobbyists and partners, the Creator CI20 drew so many enquiries that Imagination announced today it is available to pre-order at $65 / £50, with units available at the end of January 2015.

Says Tony King-Smith, EVP of marketing for Imagination: “We are very excited to be taking part in the growing interest within the open source and maker communities for affordable, fully featured development platforms. Creator CI20 has been designed for people who want high performance and advanced features for their development projects and to create access at the software and hardware level to allow creativity to come to the fore.”

The development board includes a unique combination of features and specifications:

- A 1.2 GHz MIPS32-based, dual-core processor designed for superior performance and low-power computing

- PowerVR SGX540 graphics offering full support for OpenGL 2.1 and OpenGL ES 2.0

- Dedicated video hardware for low-power 1080p decoding at 30 fps

- A full package of connectivity options including fast Ethernet, high-speed 802.11 b/g/n Wi-Fi and Bluetooth 4.0

- On-board memory: 1 GB DDR3 RAM, 4 GB flash memory and an SD card expansion slot

- High-quality audio playback

- A comprehensive set of peripherals, including GPIO, SPI, I2C and UART

MIPS Creator CI20 comes pre-installed with Debian 7. Other Linux distributions will be available, including Gentoo and Yocto, as well as a version of Debian 7 that complies with the principles of the Free Software Foundation. Users can also install the latest version of Android v4.4.
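For readers who want to get hands-on with the board's peripherals, here is a minimal sketch of blinking a GPIO pin from the Debian install via the Linux sysfs interface. The pin number is a placeholder, not the CI20's actual numbering, which depends on the kernel configuration and header pinout.

```python
# Minimal sketch: blink one GPIO pin via the Linux sysfs interface.
# PIN is a hypothetical number; consult the board documentation for
# the CI20's real GPIO numbering before running this.
import time
from pathlib import Path

PIN = 123                                  # placeholder GPIO number
GPIO_ROOT = Path("/sys/class/gpio")
PIN_DIR = GPIO_ROOT / f"gpio{PIN}"

if not PIN_DIR.exists():                   # export the pin to userspace
    (GPIO_ROOT / "export").write_text(str(PIN))

(PIN_DIR / "direction").write_text("out")  # configure the pin as an output

for _ in range(5):                         # blink at roughly 1 Hz
    (PIN_DIR / "value").write_text("1")
    time.sleep(0.5)
    (PIN_DIR / "value").write_text("0")
    time.sleep(0.5)

(GPIO_ROOT / "unexport").write_text(str(PIN))  # release the pin
```

Running this requires root privileges (or suitable udev rules), since the sysfs GPIO files are owned by root by default.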

Creator CI20 also includes FlowCloud support for application development focused on the Internet of Things (IoT). FlowCloud is an exciting IoT platform from Imagination that enables users to construct solutions for a wide range of applications, including security, personal and professional health monitoring, energy management, cloud-based systems for content delivery and much more.
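FlowCloud's actual API is not documented here, so the following is only a generic illustration of the kind of periodic telemetry loop an IoT application on the board might run. The endpoint URL, device identifier and payload fields are hypothetical stand-ins, not FlowCloud calls.

```python
# Generic IoT telemetry loop, for illustration only. The endpoint and
# payload fields are hypothetical; they are NOT FlowCloud's actual API.
import json
import time
import urllib.request

ENDPOINT = "https://example.com/telemetry"   # hypothetical ingestion URL

def read_sensor() -> dict:
    # Stand-in for a real sensor read, e.g. over the board's I2C or SPI bus
    return {"temperature_c": 21.5}

while True:
    payload = json.dumps({
        "device_id": "ci20-demo",            # hypothetical identifier
        "timestamp": time.time(),
        "readings": read_sensor(),
    }).encode("utf-8")
    request = urllib.request.Request(
        ENDPOINT,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(request)          # upload one reading
    time.sleep(60)                           # report once a minute
```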

Imagination community quotes

Says Mike Woster, COO and VP of business development for The Linux Foundation: “We're excited to see Linux Foundation members like Imagination Technologies contribute to the choice, affordability, power and variety of Linux development boards available today. Whether as a DIY or design platform, work like this is sure to attract more developers and help kick off new applications.”

Says Art Swift, president of the prpl Foundation: “The prpl Foundation and our member companies welcome the highly capable Creator CI20 development board.  Developers in the open source community will appreciate the high performance, advanced feature set, and affordable cost of this microcomputer.  It is a great example of the sophisticated hardware designs offered by the vibrant and growing MIPS ecosystem.”

Says Neil Henning, technology lead for Codeplay: “The Creator CI20 is a fast development board for a really good price that combines two technologies we are very excited about: a dual-core MIPS CPU and a PowerVR GPU. There is a real drive for more and more affordable devices like this that deliver power efficiency yet still pack a hefty punch. The new Creator CI20 micro-computer has filled the role perfectly.”

Says Yingtao Hou, VP of Chukong Technology: “Imagination provides the industry-leading PowerVR graphics processors inside many iconic smartphones and tablets. We’ve worked with them extensively on performance optimizations for games built using the Cocos2d-x engine and have been very satisfied with the tools and utilities they provide for Android or Linux. We welcome the launch of the MIPS Creator CI20 micro-computer and look forward to exploring the possibilities it brings.”

Pricing and Availability

Priced at $65 (US) / £50 (Europe), the MIPS Creator CI20 is available for pre-order now from http://store.imgtec.com/ for North America and Europe, with more countries to be added shortly.

Units will be despatched at the end of January 2015.

Sunday, 23 November 2014

Tech News


IBM and NVIDIA Will Power the World's Fastest Supercomputers

IBM (NYSE: IBM) announced earlier this month that it had secured two contracts from the U.S. Department of Energy, worth a total of $325 million, to build two new supercomputers at the Oak Ridge and Lawrence Livermore national laboratories. Both supercomputers will be more powerful than the fastest system in operation today, and this deal marks the first big win for IBM's OpenPOWER initiative.

NVIDIA (NASDAQ: NVDA), also a member of OpenPOWER, will supply its Tesla GPUs to accelerate both new supercomputers, along with NVLink, a technology that greatly increases the rate at which data can be transferred from CPU to GPU. Titan, the supercomputer being replaced by the faster of the two new systems, was NVIDIA's first big win in the supercomputer space, and it helped begin the era of conducting scientific calculations on the GPU.

Dealing with Big Data
Both supercomputers will handle Big Data problems, which involve crunching enormous amounts of data, and IBM has optimized the architecture underlying the systems specifically for this task. Simply increasing the performance of the CPU and GPU is not enough, because much of the work being done when Big Data is involved is simply moving data back and forth between storage and the processors. As IBM stated in its press release, "The popular practice of putting design emphasis solely on faster microprocessors becomes progressively more untenable because the computing infrastructure is dominated by data movement and data management."

This is a dig at Intel (NASDAQ: INTC), a company with a near monopoly in the server CPU market and the main target of IBM's OpenPOWER initiative. IBM's Power systems have been losing market share for years to x86-based systems, mostly powered by Intel chips, and IBM's decision to open up the architecture is seemingly starting to pay off.

One reason IBM won these contracts is energy efficiency. Supercomputers consume monstrous amounts of power, with the fastest supercomputer in the world, Tianhe-2 in China, using about 17.8 megawatts. To put this in perspective, that's enough to power a little more than 14,000 average homes in the United States.
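That homes figure holds up to a quick back-of-the-envelope check, assuming the commonly cited EIA average of roughly 10,900 kWh of electricity per U.S. household per year:

```python
# Back-of-the-envelope check of the "14,000 homes" figure.
tianhe2_kw = 17_800                            # 17.8 MW expressed in kW
home_kwh_per_year = 10_900                     # assumed average U.S. household use
home_avg_kw = home_kwh_per_year / (365 * 24)   # ~1.24 kW continuous draw
print(round(tianhe2_kw / home_avg_kw))         # ~14,300 homes
```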

Titan, the system Oak Ridge is replacing, uses 8.2 MW of power. Summit, the larger of the two new supercomputers, will use about 10 MW of power but will provide five to ten times the performance of Titan. This increased energy efficiency is due to efficiency gains in NVIDIA's GPUs and to minimizing the movement of data around the system.

What it means for IBM

IBM's Power architecture gets a big win with these systems. While Intel is unlikely to give up its massive market share in the server CPU market, the subset of that market that deals with Big Data could be up for grabs. IDC predicts the Big Data technology and services market will grow at a 27% compound annual rate through 2017, becoming a $32 billion market by that time, and IBM has a real chance at claiming a significant portion.

The OpenPOWER initiative makes this more likely, as it allows other hardware and software companies to develop products that are tightly integrated with the Power architecture. There's a big incentive for many companies to create a viable alternative to Intel-based systems; Google, for example, sends plenty of money Intel's way each year in building out its cloud data centers. With Intel holding a near monopoly, the company can charge extremely high prices.

The supercomputer deal is an important accomplishment for IBM, and it provides a high-profile example of the benefits of OpenPOWER.

What it means for NVIDIA

NVIDIA has become the leader in the high-performance computing accelerator card market, and while the enterprise division is still a small part of its overall business, it has been growing fast. From fiscal 2010 to 2014, NVIDIA's high-performance computing and data center revenue grew at a 64% compound annual rate.

NVIDIA GPUs are in 17 of the top 100 supercomputers in the world, and that number will likely grow as GPUs become more common in supercomputing. NVIDIA's biggest competitor is Intel, which has its own line of accelerator cards, the Xeon Phi, but so far NVIDIA has maintained its lead, with the Phi present in only 11 of the top 100 supercomputers.

Intel certainly is not standing still; the company is set to launch an updated Phi in the second half of 2015. But one of NVIDIA's strengths is the vast catalog of software that has been adapted to run on its GPUs, such as scientific simulations, business analytics, and 3D modeling programs. This creates lock-in and switching costs, and it makes it more difficult for Intel to gain traction in the market.

A single company is unlikely to battle Intel in the enterprise market, but with the companies involved in the OpenPOWER initiative working together, Intel has much to worry about. The supercomputer deal is the first piece of evidence that Intel has a real competitor on its hands for the first time in years, at least when it comes to Big Data.


Timothy Green owns shares of International Business Machines and Nvidia. The Motley Fool recommends Google (A shares), Google (C shares), Intel, and Nvidia. The Motley Fool owns shares of Google (A shares), Google (C shares), Intel, and International Business Machines. The Motley Fool has a disclosure policy.

Wednesday, 19 November 2014

ONC plots quality improvement plan

ONC has set out a decade-long vision for ensuring Health Information Technology is geared toward what it describes as "robust and continuous quality improvement."

"Dramatic advancements have been made in digitizing the care delivery system during the past decade," ONC notes in its report  not least the fact that all 50 states have some form of health information exchange services to enable care coordination.

In addition, more than half of U.S. hospitals can electronically search for patient information outside their own walls, with six out of ten electronically exchanging health information with outside providers.

Still, more can be done.


"ONC envisions an electronically enabled QI ecosystem that promotes better health and care, improved communication and transparency, rapid translation of knowledge for all stakeholders and reduction in the burden of data collection and reporting for providers," according to the study.

Toward that end, the agency has set goalposts at 3-, 6- and 10-year intervals.

3 years out: "Alignment and standardization to support data capture within the QI ecosystem"

Quality reporting programs must be better aligned, "to reduce the collection and reporting burden on providers and hospitals," ONC concedes.

"Through maturing standards, giving technical assistance, certifying Health Information Technology and coordination among federal and state agencies, ONC will work to harmonize and align measure components, tools and standards," according to the report. "Providers, payers and health systems need highly reliable, comparable and universally accepted performance indicators for priority health conditions and patient safety initiatives, such as preventable hospital readmissions and reduction of health care associated conditions."

Moreover, "stakeholders may have unique QI objectives on which they would like to focus," says ONC. "A key building block to enabling this wide variety of QI goals is the capture of highly structured, shareable data that has the appropriate level of metadata in place to support multiple uses CDS, advanced analytics and quality measurement."

6 years out: "Big data for the QI ecosystem"


Over the next six years, as quality improvement data sharing increases, "quality and safety metrics will refocus from provider centric to patient centric," according to ONC. "The data in Health Information Technology, including patient generated and claims data, will be standardized, linked at the individual level to clinical data as appropriate, and optimized for interoperable sharing and aggregation.

"Clinical data from heath IT will be increasingly structured, but will still require mapping and normalization to be aggregated and analyzed," according to the report. "Big data has the characteristics of high volume, high velocity and high variety and will require effective and innovative analytic tools to filter out the 'noise' and yield useful information.

"For the purposes of performance measurement and improvement for value based payment, regional linking of claims and clinical data will enable measurement and reporting on quality and efficiency to all payers while providing timely feedback to providers on all their patients."

10 years out: "Fast data, fast improvement across the QI ecosystem"


A decade from now, ONC sees a rosy future for interoperability and data exchange, leading to markedly better care: "By 2024, the nationwide use of interoperable Health Information Technology will be pervasive. Patients and their care team will use quality and safety data and measurement as an expected aspect of care delivery. There will be numerous advancements in the features, functionality and interoperability of Health Information Technology tools that the QI ecosystem stakeholders seamlessly interact with on a daily basis for multiple purposes such as: healthy habits of daily living, delivery of care, care coordination, population management and value based reimbursement.

"Our citizens will enjoy better health, high value and high quality care, with improved safety and highly usable technologies and resources," ONC report continues. "Individuals will view themselves as the hub of their health and care and will be considered an integral member of the care team. Technology will continue to advance and become more pervasive in every aspect of daily living. Individuals will routinely use advanced technology to manage and monitor their wellness and health care, and generate data for use by multiple IT systems and analytical tools.

"Their data will be available whenever and wherever they are needed to enable optimal health and care. They will enjoy personalized information and individualized care which is crucial to facilitate their wellness goals and the flow of their health information."

Tuesday, 18 November 2014

Retailers Will Love the Apple Pay Era


Retailers should be particularly excited about Monday's debut of Apple Pay, which promises to be an excellent tool for separating shoppers from their money.

Apple's (AAPL) mobile payment service will let iPhone users buy things by simply pulling out their devices. Researchers have long found that shoppers spend more the further they get from handling actual currency, and that they tend to better remember cash transactions. These tendencies help explain why credit card balances tend to bloat and why casinos use chips in place of money. It's also why companies such as Starbucks (SBUX) encourage customers to load money onto apps or prepaid cards.

Behavioral economists have a term for this dynamic: decoupling. The card or app or casino chip mentally separates the consumer from his bank account. The payment is both delayed and bundled with other charges so it does not seem so painful. Citibank tested the research in 2009 and found a mobile “tap to pay” pilot program significantly boosted both the number and size of consumer transactions.
Buying things without cash is simply more fun. Richard Thaler, a behavioral economist at the University of Chicago, has shown that such transactions are more pleasurable experiences (PDF). Anyone who has ever walked away from an Uber ride knows this feeling well. With credit card data embedded in the app's settings, someone using the service never actually pays, or even tips, at least not in any physical way.

The question with Apple's new payment service is whether it adds a further degree of distance from credit cards or merely takes the well-established place of plastic in the psychology of shopping. Apple Pay does not require any swiping or tapping, which seems to suggest a new level of abstraction. With a fingerprint on the iPhone button and a little wave at the cash register, the deal is done. “Now paying in stores happens in one natural motion,” Apple says in its pitch. “You don’t even have to look at the screen.”

Some 220,000 stores are already set up to accept the payments, including Bloomingdale's, Foot Locker, Macy's, McDonald's, and PetSmart. The list also includes RadioShack, a retailer desperately in need of a revenue boost. The other brick-and-mortar companies in that troubled camp, J.C. Penney (JCP) and Sears (SHLD), would do well to get on board.

Monday, 17 November 2014

New Technology Transforms Wood Waste Into High Value Hardwood


Wood is a hard, fibrous structural tissue found in the stems and roots of trees and other woody plants. It has been used for thousands of years both as a fuel and as a construction material. It is an organic material, a natural composite of cellulose fibres (which are strong in tension) embedded in a matrix of lignin, which resists compression. Wood is sometimes defined as only the secondary xylem in the stems of trees, or it is defined more broadly to include the same type of tissue elsewhere, such as in the roots of trees or shrubs. In a living tree it performs a support function, enabling woody plants to grow large or to stand up by themselves. It also mediates the transfer of water and nutrients to the leaves and other growing tissues. Wood may also refer to other plant materials with comparable properties, and to material engineered from wood, wood chips or fibre.

The Earth contains about one trillion tonnes of wood, which grows at a rate of 10 billion tonnes per year. As an abundant, carbon neutral renewable resource, woody materials have been of intense interest as a source of renewable energy. In 1991, approximately 3.5 cubic kilometres of wood were harvested. Dominant uses were for furniture and building construction.

Sawmills will soon be able to turn wood waste into high value hardwood products, helping them increase profits at a difficult time.

The company 3RT has partnered with Flinders University to develop a machine that cuts wood offcuts or softwood into strips, sticks them together, and presses them into blocks.

The aim is to increase hardwood supply sustainably and create market opportunities for mills that have been struggling with loss of access to native forests, labour shortages, and competition from cheaper imports.
3RT managing director Peter Torreele says the technology is in demand by the industry, and offers more sustainable and cheaper wood production.

"The problem that people do not buy hardwood today, or not enough, is firstly, it's becoming more and more scarce to get it, and secondly, it's very expensive, so we are basically producing the same type of hardwood for a third of the price."

He says the manufacturing unit, the first of which will be built in South Australia early next year, will cost between $3 million and $5 million.

There is also nanotechnology available to make the wood resistant to things like fire.

Now Mr Torreele is contacting mills to gauge interest and individual opportunities.

Timber Queensland welcomes the technology, stating sawmills have been looking for ways to utilise their unused wood fibre.

Wednesday, 12 November 2014

Latest Technology News


The contest, hosted by the popular maker website Instructables, is an opportunity for DIYers and students from around the United States to share their 3D designs with each other and with the first family.

Contestants are asked to create an ornament that reflects "the theme of the magic and wonder of the holidays and the winter season," according to the contest rules.

Eight lucky finalists will have their designs printed and displayed in the East Wing of the White House during the 2014 holiday season. The winning ornaments will also be featured as part of the Smithsonian's 3D modeling project, X 3D, which seeks to digitize the museum's extensive collection of artifacts and make it more widely available to the public.

The winning designs will even join a select group of White House ornaments showcased in the Smithsonian National Museum of American History's political history division.

All submissions for the Ornament Challenge will be reviewed by two separate panels of judges: the first panel will choose 20 finalists from all of the submitted designs, and the second will select eight of those designs to be 3D-printed and displayed in the White House. Registered Instructables users can also vote for the designs they want to see hanging at 1600 Pennsylvania Ave., though public voting won't be used to make any final decisions about the winners.

So far, contestants have come up with an array of designs, some of which capture the whimsical wintery theme of the contest better than others. There's a lacy-looking ice skate that promises to win the judges' attention, as well as a beautiful homage to the traditional presidential holiday speeches of years past. And then there's an ornament shaped like the disembodied head of Abraham Lincoln, and one shaped like a squid.

But the 3D-Printed Ornament Challenge is more about inspiring Americans to make things than it is about winning, according to contest organizers. The competition is part of the White House's effort to get the so-called Maker Movement into full swing in the U.S., an effort that began with the first-ever White House Maker Faire in June.

Calling all 3D printing enthusiasts: there are only a few hours left to submit your design for this year's contest, the first-ever White House 3D-Printed Ornament Challenge. To do so, visit the contest's home page on Instructables and upload your design as a new project in the Ornament Challenge. Instructables users can continue to vote on their favorite designs.

New Technology


The latest technology we describe relates to the brain. Neuroscientists have made remarkable progress in recent years toward understanding how the brain works. In the coming years, Europe's Human Brain Project will attempt to create a computational simulation of the human brain, while the U.S. BRAIN Initiative will try to create a wide-ranging picture of brain activity. These ambitious projects will benefit greatly from a new resource: detailed and comprehensive maps of the brain's structure and its different regions.

As part of the Human Brain Project, an international team of researchers led by German and Canadian scientists has produced a three-dimensional atlas of the brain that has 50 times the resolution of previous such maps. The atlas, which took a decade to complete, required slicing a brain into thousands of thin sections and digitally stitching them back together with the help of supercomputers. Able to show details as small as 20 micrometers, roughly the size of many human cells, it is a major step forward in understanding the brain’s three-dimensional anatomy.

To guide the brain's digital reconstruction, researchers led by Katrin Amunts at the Julich Research Centre in Germany initially used an MRI machine to image the postmortem brain of a 65-year-old woman. The brain was then cut into ultrathin slices. The scientists stained the sections and then imaged them one by one on a flatbed scanner. Alan Evans and his coworkers at the Montreal Neurological Institute organized the 7,404 resulting images into a data set about a terabyte in size. Slicing had bent, ripped, and torn the tissue, so Evans had to correct these defects in the images. He also aligned each one to its original position in the brain. The result is mesmerizing: a brain model that you can swim through, zooming in or out to see the arrangement of cells and tissues.
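As an illustration of the kind of alignment involved, the sketch below estimates the translation between two consecutive section images using phase cross-correlation from scikit-image. It is a deliberately simplified stand-in for the team's actual pipeline, which also had to correct bending, ripping and tearing rather than just translation; the section-loading helper is hypothetical.

```python
# Simplified slice-to-slice alignment: translation-only registration.
# The real reconstruction also corrects nonlinear distortion, which
# this sketch omits.
import numpy as np
from scipy.ndimage import shift as nd_shift
from skimage.registration import phase_cross_correlation

def align_to_previous(prev_img: np.ndarray, cur_img: np.ndarray) -> np.ndarray:
    # Estimate how far cur_img is displaced relative to prev_img
    offset, _error, _diffphase = phase_cross_correlation(prev_img, cur_img)
    # Shift the current section back into register with its neighbour
    return nd_shift(cur_img, offset)

# Usage: align each section to the one before it, down the whole stack.
# sections = load_grayscale_sections()   # hypothetical loader, one 2D array per slice
# aligned = [sections[0]]
# for s in sections[1:]:
#     aligned.append(align_to_previous(aligned[-1], s))
```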

At the start of the 20th century, a German neuroanatomist named Korbinian Brodmann parceled the human cortex into nearly 50 different areas by looking at the structure and organization of sections of brain under a microscope. “That has been pretty much the reference framework that we’ve used for 100 years,” Evans says. Now he and his coworkers are redoing Brodmann’s work as they map the borders between brain regions. The result may show something more like 100 to 200 distinct areas, providing scientists with a far more accurate road map for studying the brain’s different functions.

“We would like to have in the future a reference brain that shows true cellular resolution,” says Amunts, meaning one or two micrometers, as opposed to 20. That’s a daunting goal, for several reasons. One is computational: Evans says such a map of the brain might contain several petabytes of data, which computers today can’t easily navigate in real time, though he’s optimistic that they will be able to in the future. Another problem is physical: a brain can be sliced only so thin.
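The petabyte estimate follows from straightforward scaling: improving the resolution by a factor of k in each of three dimensions multiplies the voxel count, and hence the raw data volume, by k cubed.

```python
# Data volume scales with the cube of the linear resolution improvement.
current_tb = 1.0                 # ~1 TB dataset at 20 micrometres (per the article)
for target_um in (2, 1):
    k = 20 / target_um           # linear resolution improvement factor
    print(target_um, "um:", current_tb * k**3 / 1000, "PB")
# 2 um -> ~1 PB; 1 um -> ~8 PB, consistent with "several petabytes"
```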

Advances could come from new techniques that allow scientists to see the arrangement of cells and nerve fibers inside intact brain tissue at very high resolution. Amunts is developing one such technique, which uses polarized light to reconstruct three-dimensional structures of nerve fibers in brain tissue. And a technique called Clarity, developed in the lab of Karl Deisseroth, a neuroscientist and bioengineer at Stanford University, allows scientists to directly see the structures of neurons and circuitry in an intact brain. The brain, like any other tissue, is usually opaque because the fats in its cells block light. Clarity melts the lipids away, replacing them with a gel-like substance that leaves other structures intact and visible. Though Clarity can be used on a whole mouse brain, the human brain is too big to be studied fully intact with the existing version of the technology. But Deisseroth says the technique can already be used on blocks of human brain tissue thousands of times larger than a thin brain section, making 3-D reconstruction easier and less error-prone. And Evans says that while Clarity and polarized-light imaging currently give fantastic resolution to pieces of brain, “in the future we hope that this can be expanded to include a whole human brain.”