Dog Brothers Public Forum
Topic: Technology (nano, 3D, robots, etc) (Read 9895 times)
August 23, 2007, 08:39:51 PM
Some of us here may remember our introduction to Nanotechnology via the Gilder Technology Report. I don't know if Stratfor is going too far afield from its expertise with this piece, but the subject is fascinating, complex, and important-- and so I open this thread.
Nanotechnology and the Regulation of New Technologies
Researchers from the Woods Hole Oceanographic Institution and the Massachusetts Institute of Technology released a study on Aug. 16 stating that the production of carbon nanotubes also generates a slew of hazardous chemicals known as polycyclic aromatic hydrocarbons, some of them toxic.
Discussion of a new regulatory regime for nanotechnology has been ongoing among think tanks, advocacy groups and industry for years, and findings that suggest the sector could generate public health risks will add to the growing pressure on regulators or legislators to decide how to regulate it.
The debate over the regulation of nanotechnology has taken place on two levels. The first is over the public health risks nanotechnology poses and ways to determine and measure those risks. This is mainly the familiar risk-assessment process applied to the products of a technology that acts slightly differently than previous technologies do.
At the center of a second debate over public policies governing nanotechnology is an older, more contentious issue: the politicization of science and technology. At issue is the point at which government is justified in stepping into the realm of science to stop or slow scientific research, regardless of whether harm has been done. This concern lay at the center of the early debate over biotechnology, and also played a role in the debate over federal funding of stem cells and bans on human cloning.
A number of efforts are currently under way to determine the answers to the first question. The most impressive of these efforts are occurring in a number of partnerships between corporations and advocacy groups or think tanks. By contrast, the debate over the second question is largely being ignored. Where it is taking place, the discussion is occurring by implication.
What ultimately happens with the risk-centered regulatory debate will impact this larger philosophical debate, and will be crucial to the rules governing the coming wave of new technologies. This new wave will include even more controversial issues, including human cloning and synthetic forms of life. These issues will challenge the public to accommodate technological progress in their world views.
Nanotechnology was defined by one of its founders, Nobel Prize winner Rick Smalley, as "the art and the science of building stuff that does stuff on a nanometer scale." Essentially, nanotechnology is the manipulation of atoms and small molecules at a level that is slightly different from chemistry. While nanoparticles generally behave like traditional chemicals do, in some cases they can be very different. In these slight differences lies the technology's promise -- namely, what is possible through chemistry has been studied for centuries, while nanotechnology mostly remains an open field. Still, as one observer has put it, to say that we should regulate nanotechnology is the equivalent of saying we should regulate a hammer -- nanotechnology is a tool, and its creations will emerge as the subject of regulatory debate.
Nanotechnology is currently used in commercial applications, most famously sunscreens and stain-resistant pants. The next five years will see a boom in the use of nanotechnology in applications ranging from greatly improved batteries to stronger, lighter materials to improved military weapons. At the base of nanotechnology are some prevalent building blocks, most importantly carbon nanotubes and fullerenes, the best known of which is buckminsterfullerene, or the "buckyball." (Fullerenes and buckyballs were named after the architect Buckminster Fuller because their shape resembles that of his geodesic domes.)
The major players in nanotechnology include all of the large research-based chemistry companies, including DuPont, Dow Chemical Co., Corning Inc., General Electric Co. and a number of smaller research companies that cluster around universities in the Northeastern United States. The way these companies currently use nanotechnology has given rise to the first set of regulatory concerns surrounding nanotech. The questions raised by this use will be answered by rules regarding what these manufacturers must guard against in production, use and disposal of nanotechnologies. In June, DuPont and the environmental group Environmental Defense provided a preview of the likely framework for nanotechnology regulation.
Most of the larger corporate players view nanotechnology as an important addition to a new generation of chemistry and to biotechnology. It is in the combination of chemistry, biotechnology, electronics and nanotechnology -- specifically the combining of nanoscale devices with specially engineered living organisms -- that a real revolution in materials, devices and medicine lies. It is also here that the controversy surrounding nanotechnology is strongest, as it raises questions about the foreseeability of risks and the desirability of certain technological advances.
When to Regulate?
Modern chemistry is regulated in industrialized countries by a process known as risk assessment, a complex scientific evaluation that determines whether the potential risks a chemical poses to health and the environment outweigh its value in commerce. In the United States, chemicals are regulated by the 1976 Toxic Substances Control Act (TSCA). In Europe, they are regulated by a new process known as the Registration, Evaluation and Authorization of Chemicals (REACH).
As the framework created by Environmental Defense and DuPont shows, nanotechnology probably can piggyback on chemical regulation, but it will require a slightly different set of standards than chemical regulations do. Important differences include measuring exposure and dose-response relationships. For example, Andrew Maynard of the Woodrow Wilson International Center for Scholars points out that for some nanoscale materials, such as titanium dioxide, toxicity is based on the surface area to which sensitive tissue -- lung tissue in the case of titanium dioxide -- is exposed, rather than simply the mass of the material. The dose still makes the poison, but the dose needs to be measured differently than in traditional chemistry. In addition, the current regulatory framework needs additional tools to anticipate harm, a controversial but largely successful element of chemical regulation far more difficult to apply in the new field of nanotechnology.
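The surface-area point above can be made concrete with a little arithmetic. The sketch below is illustrative only (not from the article): it compares the surface area per gram of idealized spherical titanium dioxide particles at two diameters, using an assumed nominal density of about 4.2 g/cm³. The same mass of material exposes roughly 100 times more surface at 10 nm than at 1 micron, which is why a purely mass-based dose can understate nanoscale exposure.

```python
# Illustrative sketch: why dose-by-mass and dose-by-surface-area diverge
# at the nanoscale. Assumes ideal monodisperse spheres and a nominal
# TiO2 density of ~4.2 g/cm^3 (an assumed round figure).
import math

DENSITY_G_PER_CM3 = 4.2  # assumed nominal density for TiO2

def specific_surface_area_m2_per_g(diameter_nm: float) -> float:
    """Surface area per gram for monodisperse spheres of a given diameter."""
    r_cm = diameter_nm * 1e-7 / 2                       # nm -> cm
    area_cm2 = 4 * math.pi * r_cm ** 2                  # surface per particle
    mass_g = DENSITY_G_PER_CM3 * (4 / 3) * math.pi * r_cm ** 3
    return (area_cm2 / mass_g) / 1e4                    # cm^2/g -> m^2/g

fine = specific_surface_area_m2_per_g(1000.0)  # 1 um "traditional" powder
nano = specific_surface_area_m2_per_g(10.0)    # 10 nm nanoparticles
print(f"1 um:  {fine:.2f} m^2/g")
print(f"10 nm: {nano:.2f} m^2/g  ({nano / fine:.0f}x more surface per gram)")
```

Because specific surface area for spheres scales as 6/(density × diameter), shrinking the particle by a factor of 100 multiplies the exposed surface per gram by the same factor.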
These regulatory questions have come at an interesting point. The European Union is only now beginning to implement REACH, and its coming into force has triggered changes in the marketplace and accelerated efforts to change U.S. chemical regulation. For some in the United States, the imminent commercial boom in nanotechnology calls for the widening of TSCA to cover nanotechnology.
Many see REACH as more protective of public health and the environment than TSCA. As such, there is a growing movement in the United States for the adoption of REACH-like chemical regulations. For those calling for a complete reassessment of TSCA, the revolution in nanotechnology has come at the right time. They argue that TSCA cannot cope with the challenges of nanotechnology, so therefore the law should be revamped to prepare for the next wave of technology. A number of states are currently considering their own REACH-like laws, and the "opening" of TSCA (Capitol Hill-speak for rewriting the law) seems increasingly likely in the coming years.
Politics and Technology
Ultimately, REACH and REACH-like laws deal only with the risks posed by the substance. They do not address the moral or social questions relating to whether society wants certain technologies to advance, or even whether the government has a right to stop the development of new technologies.
In the Western conception -- strongest in the United States -- individuals, groups and companies are allowed to do whatever they want until or unless that activity is proven harmful to others. Attendant social, cultural or economic changes have seldom been allowed to stand as a reason not to allow a technology. The classic example is the fate of the buggy whip manufacturer of the early 20th century driven out of business by the advent of the automobile. The manufacturers certainly experienced economic losses, but this cost was accepted as the price of technological advance. Similarly, the manufacturers of black-and-white televisions, vacuum tube amplifiers and film all have seen their businesses decimated by technological advances.
Still, the introduction of biotechnology to Europe sparked a protectionist reaction. The food that has been served to millions of Americans daily without incident was made, and largely remains, illegal for European consumers. Europeans have justified their bans on biotechnology using various scientific and ecological arguments, but with a few exceptions, their assertions are considered scientifically tenuous. This is not to say justifiable reasons for Europe to ban genetically modified organisms (GMOs) do not exist, just that the reasons the European Union has given for bolstering its laws are flimsy by almost any scientific account.
Instead, Europe approached biotechnology by banning products on social and cultural grounds. To do this, it appealed to the precautionary principle, which more or less states that in the presence of fear but the absence of hard data, a product should be proven not to be harmful before being allowed on the market. Since proving a negative remains impossible, when the principle is used in a regulatory context it becomes a tool to ban a product or activity without proof that the thing is actually dangerous -- a clear reversal of the traditional process of letting people and companies do what they want to do as long as it harms no one.
The European Union saw biotechnology as bringing change to the economics of farming, reducing the margins for farmers, encouraging larger, corporate-owned farms and placing multinational seed companies that double as chemical companies in a powerful position on the farm. Such a shift was unacceptable to many EU countries, especially France. Making matters worse, the biotechnology companies argued that their products were materially no different than traditional products and should not be labeled as being different in any way. To Europeans (and also to the Japanese), bringing technology to food is suspicious to begin with. And saying it should not be labeled is akin to demanding the ability to foist a technology in a very personal place -- food -- on a helpless public. The EU bans on GMOs came for these reasons.
Products and Morality
World Trade Organization (WTO) rules contain prohibitions against the use of safety or health regulations as barriers to free trade. Under WTO rules, to avoid claims that product bans or prohibitions are protectionist, countries' regulations (or those of groupings like the EU) must reflect the standards set by the International Organization for Standardization (for products) or Codex Alimentarius (for food). Stricter standards can be judged to be trade barriers rather than legitimate protective regulations. While fighting in venues like the WTO and Codex on behalf of the precautionary principle -- arguing that it is only sane to look before you leap and better to be safe than sorry -- the European Union has been forced to develop scientific arguments that meet the WTO's requirements. These arguments have generally failed, and the union is under sanction for these regulations.
Nanotechnology (along with the coming combination of nanotech with other new technologies) has the potential to bring the precautionary principle back in a new, more coherent form. This would be marked by the public, regulators and legislators arguing over whether advances in science and technology should be political, rather than scientific.
American business expresses exasperation at the European Union's use of the precautionary principle, the bans on GMOs, hormone-fed beef and certain other products, and other such issues. At the same time, the United States has a number of regulations and policies applying the brakes to technology that do not solely rely on risk assessment and the assertion that the individual or corporate behavior is risky to others. The ban on human cloning and the federal government's decision not to fund stem cell research are examples of U.S. government decisions that certain technology is not desirable, regardless of the long-term potential benefit to society and assertions that by law these practices do no harm and therefore should be legal.
Nanotechnology in most applications does not rise to the level of controversy associated with human cloning or even stem cell research, but in some envisioned applications it does raise serious moral questions, especially when tied to emerging biotechnologies. Among the most intriguing of these is the development of synthetic life. A recent patent application was submitted for an organism composed of cells whose genetic makeup has been limited only to the genes necessary to maintain life. These synthetic organisms, combined with nanotechnologies that can provide structure and even potentially movement, create essentially programmable living things. The applications for medicine, remediation and manufacturing are legion. The moral questions to some are just as vast. In an attempt to raise concerns, one activist group has nicknamed the patented synthetic organism "Synthia."
Stopping Synthia's creation could prove difficult. Its creation, life and disposal will not hurt anyone. Like Dolly and the dozens of cloned animals that came after, it is not human. Those who want to stop Synthia's creators can argue they do not want this technology to advance, but in the strictest regulatory sense, what is happening is legal. Still, there probably will be potent debates in Washington, Brussels and other capitals over the limits society wants placed on biotechnology and nanotechnology, and politics will be playing a role in the future of technology. The question facing nanotechnology's champions -- both in the short term and in the long term -- will be whether they want to press this to a crisis and force regulators to draw a line defining where politics does and does not have a place in technology, or whether they want to stay clear of that line for as long as possible.
Last Edit: December 16, 2013, 02:12:37 PM by Crafty_Dog
Reply #1 on: August 24, 2007, 06:46:41 PM
One of the nicest, newest, cleanest buildings on the UCSB campus is the Nanotechnology Center. Apparently, they are getting mucho dinero in the form of research grants. I'd love to walk around and check it out, but you need security clearance and a pass to get into the interesting parts.
As a matter of fact, QuikClot (which is used in battlefield medical applications) was invented by one of the University's professors.
Reply #2 on: September 12, 2010, 07:59:29 PM
Here come the bots.
Reply #3 on: September 13, 2010, 07:58:30 AM
Researchers develop a way to funnel solar energy
This could be revolutionary
P.S. this site is great
Reply #4 on: September 14, 2010, 11:29:38 AM
Seems very significant.
What an amazing world we live in!
Reply #5 on: September 29, 2010, 08:21:25 AM
Nanoscale solar cells 10 times more efficient
Nanotechnology: Solar efficiency gains
Reply #6 on: October 01, 2010, 12:45:28 AM
Freki: Great find! I used to have that site bookmarked several computer crashes ago.
Rarick wrote: "The article claims a 12-fold increase. Solar cells are about 15-20% efficient -- call it about 200 watts/m2 for the 1,000 watts insolation that was tested and verified in the southwest USA."
My guess is that this technology and others will show that the potential for energy capture is far greater than we thought. Also we are not all in the southwest. The sun also lights and heats the earth in places like Seattle as well where I doubt the solar efficiency of current systems is anywhere near the 15-20% range.
Except to judge the credibility of the author, it doesn't matter to me whether the gain from one breakthrough is 10% or 10-fold. The point is that good things are coming, and government should not be paying you or mandating you to make a fifty-year investment in a technology that is constantly improving and obsolescing the old. Only free choices can fairly settle the questions of whether, when and how to commit to an investment of that scope.
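The efficiency figures quoted in the thread are easy to sanity-check. This small sketch (illustrative arithmetic only) multiplies the standard 1,000 W/m² test-condition insolation by typical panel efficiencies, reproducing the roughly 150-200 W/m² figure cited above.

```python
# Back-of-envelope check of the numbers quoted in the thread. Real-world
# output varies with latitude, weather, and panel orientation.
def panel_output_w_per_m2(insolation_w_per_m2: float, efficiency: float) -> float:
    """Electrical output per square meter at a given conversion efficiency."""
    return insolation_w_per_m2 * efficiency

PEAK_INSOLATION = 1000.0  # W/m^2, the standard test-condition figure cited

for eff in (0.15, 0.20):
    out = panel_output_w_per_m2(PEAK_INSOLATION, eff)
    print(f"{eff:.0%} efficient panel: {out:.0f} W/m^2")

# Note: a literal 12-fold gain in end-to-end efficiency from a 15-20%
# baseline would exceed 100% conversion, so such claims usually refer to
# a gain in one sub-process (e.g. light capture), not the whole panel.
```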
Reply #7 on: January 17, 2011, 04:06:50 PM
I found this website and it gives a nice little primer on nanotechnology and its application if anyone needs quick basics:
I'm not really a science genius but this is one field that I find absolutely fascinating!
3D printer builds hand for five-year-old boy
Reply #8 on: February 02, 2013, 02:30:08 PM
3D Technology for guns
Reply #9 on: April 09, 2013, 10:39:36 AM
Newt Gingrich: Lessons in Innovation
Reply #10 on: July 10, 2013, 11:07:02 PM
3D Printing, Fossils and Lessons in Innovation
How do CT scans, the Internet and 3D printing interact to create a revolution in science and in learning?
We are releasing at Newt University today an interview with Tim Rowe which makes a very profound point about the nature of dramatic breakthroughs.
Tim is a paleontologist at the University of Texas in Austin.
For over a quarter century, he has consistently been a pioneer in applying new technologies to science and teaching.
Three parallel revolutions have empowered Tim in his quest for better science education and better tools for scientific research.
He has combined digital imaging (CT scans, MRIs, etc.), the growing power of the Internet to store knowledge and share it for free, and the rising capabilities of ever-less-expensive 3D printing.
Each of these technologies grew up separately. None depended on the other for their basic capabilities.
Yet it was the powerful synergistic interaction of the three which made possible real breakthroughs in scientific research, in learning, and in sharing knowledge with both amateurs and professionals.
CT scans for medicine began in 1971 and quickly spread around the world.
In 1984, Glen C. Conroy scanned the first fossil (an oreodont borrowed from the American Museum of Natural History), and made the cover of Science magazine. The initial power of the technology was the ability to see through rock and define a fossil without having to destroy the rock (and potentially other fossils).
In 1992, Tim made the first scans of a fossil (Thrinaxodon) using an industrial high-resolution CT scanner. The next year, he published the first complete CT dataset on CD-ROM. Once scientists had a digital model of the fossil, he realized, they could share it widely.
The second breakthrough was the ability to put the digital data on the Internet. Tim points out that the earliest fossil CT scans were preserved on floppy disks. That seemingly cutting-edge effort is now totally obsolete: no one uses floppy disks anymore, and it is hard even to access the data.
As the reach of the Internet grew, the ability to store and transmit information grew with it.
Today the Thrinaxodon scans Tim used to mail on CD-ROMs for $1 a copy are simply in the cloud and downloadable at no cost.
The National Science Foundation has a long history of investing in basic research and the tools of basic research. This is one such area. Much of Tim's work would have been impossible without steady support from long term thinkers at NSF.
All that consistent effort led to the creation of digimorph.org, which went live in 2001. You can go to this site and find some four thousand digitized fossils. To date they have served more than 3 TB of data to 2.5 million unique visitors. In effect, Tim is pioneering a world museum of fossil information which will be available to first graders as well as graduate students and to interested citizens as well as the most advanced professionals.
The combination of digital CT scans and the Internet were two of the three key ingredients to this revolution in learning and research.
The third was the development of 3D printing.
Now that we have relatively inexpensive devices for printing out three dimensional models (including devices sold at Staples and Amazon), there is a wide range of new possibilities.
I stood at the University of Chicago with Dr. Zhe-Xi Luo, a leading expert in early mammals. Early mammals are very tiny. Many of their jaws are smaller than one half the size of your little finger. Some are almost microscopic. They are frustrating to study and analyze.
Suddenly you can scan these tiny fossils and then print out an exact replica ten times (or more) their real size. It is now possible to hold the jaw and analyze the teeth with an ease and an accuracy that was unthinkable 20 years ago.
This combination of digital data, the Internet, and 3D printing will lead to a revolution in teaching science.
First and second graders will be able to hold their own personal copy of a T-Rex tooth. They will be able to study and analyze fossils that would once have been available only to a handful of very advanced researchers.
Tim notes that the power of 3D printing combined with digital information is going to spread far beyond science. He suggests, for example, that the next phase of hip replacements will involve getting an exact digital record of your hip socket and then printing out a unique hip replacement designed specifically for you. (The next stage after that will be growing new hips through regenerative medicine, a topic I will write about in the near future.)
Tim is a great pioneer of the future of science and technology.
He has opened up the world for future students of fossils and for the potential to create a digital library of all living organisms.
He is also a good example of how entrepreneurial breakthroughs occur in unplanned but energetic and persistent ways.
It would have been impossible to imagine at the beginning of Tim's career that these technologies would evolve as they have. It would have been impossible to forecast how they would complement one another.
If you get a chance to watch his interview, I think you will agree that his is a remarkable story. He has helped to open up science for millions of people.
Reply #11 on: August 06, 2013, 11:48:29 AM
My friend comments:
"Interesting article and who stands to benefit - one new name (for me, at least) is Silver Spring Networks SSNI."
Trillions of Smart Sensors Will Change Life as Apps Have
By Olga Kharif - Aug 5, 2013 12:01 AM ET
(Photo caption: A sprinkler system sprays crops with water from an irrigation canal in the Imperial Valley, California, an agricultural area that traditionally uses water from the Colorado River distributed through a series of canals and irrigation channels.)
In February, six students on snowshoes battled as much as 12 feet of snow to reach the heart of the American River basin. Moving through dense forests and meadows, they mounted 90 iPhone-sized machines, designed to measure everything from soil moisture to temperature and relative humidity, onto 16-foot poles that beam data to researchers like Steven Glaser, a professor at the University of California, Berkeley.

With additional trips this summer, Glaser hopes to create the world's largest sensor network, comprising 7,500 devices that will inform researchers and government agencies for the first time in detail how much water California has in its coffers -- critical data for farmers and state planners. The network will be among the largest tests of a new kind of sensor: one that feels as well as thinks, while using very little power -- a D-cell battery can last years.
Glaser’s gadgets come equipped with silicon from Linear Technology Corp. (LLTC) and Cypress Semiconductor Corp. (CY) that turns them into mini-computers. They’re part of a generation of intelligent sensors whose sales may rise about 10 percent a year to reach $6.9 billion in 2018, according to Transparency Market Research. Unlike dumb predecessors that gathered data and passed it to a central server to analyze, these devices monitor the information’s quality and perform advanced calculations.
“It’s smart cities, smart buildings, smart water,” said Susan Eustis, president of WinterGreen Research Inc. “It’s enabling a world of things. It’s going to grow unbelievably fast.”
The market for sensors integrated with processors will reach 2.8 trillion devices in 2019, up from 65 million this year, according to WinterGreen. Some of these sensors could be no larger than a pinhead.
Linear went into full production with its new system for smart sensors, complete with a 32-bit processor based on ARM Holdings Plc (ARM) technology, this month. International Business Machines Corp. (IBM), Freescale Semiconductor Ltd., Qualcomm Inc. (QCOM), Silver Spring Networks Inc. (SSNI), Sensus USA Inc. and Streetline Inc. are designing more powerful and capable processors or sensors as well. Smart-sensor equipment maker Silver Spring held an initial public offering in March, and has seen its shares surge 88 percent since.
One early example of the industry’s potential is Nest, the thermostat that can be adjusted with a mobile application and learns your temperature preferences. Startup Colorado Micro Devices Inc. has built a prototype sensor that notifies owners if a door is locked via a messaging service.
“I think it’s going to be huge, the way all sensors are going to operate,” said Linear Chief Technology Officer Kris Pister. “A good analogy is the phone and apps. Ten years ago, no one knew what an app was.”
One reason for the surge is that it’s becoming important to have sensors quickly separate important data from the chaff before sending it to a central server.
“We are swimming in sensors, and drowning in data,” said Dharmendra Modha, a principal investigator at IBM. “Sensory data is growing at such a rate that our ability to make sense of it is highly constrained.”
One gas turbine, for instance, can have 100 sensors that generate 1,000 pieces of data every second, according to General Electric Co. (GE). Smarter sensors may only alert the central computer of something out of the ordinary, indicating overheating or another potential failure.
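The filtering idea described above can be sketched in a few lines. The class below is hypothetical (it is not any vendor's firmware or API): it keeps a rolling baseline of recent readings and forwards only values that deviate sharply from it, the way a smart turbine sensor might report overheating instead of streaming every sample to a central server.

```python
# Hypothetical sketch of edge filtering on a smart sensor: forward a
# reading only when it deviates from the recent baseline by more than
# `threshold` standard deviations.
from collections import deque

class SmartSensor:
    def __init__(self, window: int = 50, threshold: float = 3.0):
        self.history = deque(maxlen=window)  # rolling baseline of readings
        self.threshold = threshold

    def observe(self, value: float):
        """Return the value if it looks anomalous (worth uplinking), else None."""
        alert = None
        if len(self.history) >= 10:  # wait for a minimal baseline
            mean = sum(self.history) / len(self.history)
            var = sum((x - mean) ** 2 for x in self.history) / len(self.history)
            stdev = var ** 0.5
            if stdev > 0 and abs(value - mean) > self.threshold * stdev:
                alert = value
        self.history.append(value)
        return alert

sensor = SmartSensor()
readings = [70.0, 71.0, 69.5, 70.5, 70.0, 69.0, 71.5, 70.0, 69.5, 70.5,
            70.0, 95.0]  # the last reading simulates overheating
alerts = [r for r in readings if sensor.observe(r) is not None]
print(f"{len(readings)} samples observed, {len(alerts)} forwarded: {alerts}")
```

Only the spike gets forwarded; the steady-state readings stay on the device, which is exactly the bandwidth and power saving the article describes.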
“The more you can rely on sensor networks to control some aspects, the more efficient it can be,” said Jonathan Collins, an analyst at ABI Research.
One drawback of many smart sensors is cost. In Glaser’s water project, the sensors may seem pricey at $500 a pop -- yet “if you are talking about the billions of dollars value for water, $500 per node isn’t expensive,” Glaser said.
As the number of installed sensors grows, average selling prices will drop to less than $1 each from about $50, Eustis said.
As part of a research collaboration, a year ago IBM announced a special low-power processor for sensors. Its design is more akin to how the brain functions than today’s computer processors. Applications could include a grocer’s glove that smells produce to find contaminated products, and a tiny, jellyfish-like device that floats on the ocean’s surface and collects data on oil spills and tsunamis, Modha said. Yet another use: smart glasses that guide the visually impaired.
“Because of its low power and small size, the sensor becomes the computer,” Modha said. IBM and its partners have more than 50 people working on the project.
Sensus, another sensor-products maker, is planning to introduce more powerful, ARM-based microcontrollers for sensors, said Randolph Wheatley, executive vice president of corporate marketing at the company.
“What we see is a need for more processing power and more memory to store more data, and a need to do it in a more power-efficient way,” he said. A sensor that notices a building using more water than usual may suspect a leak and shut off the main.
Streetline is building smart sensor systems using microcontrollers from Texas Instruments Inc. (TXN) that help manage parking in 35 cities. Related mobile apps tell consumers where parking spaces become available, and let cities adjust parking fees based on demand.
“We are still at the beginning stages of this,” said Geoff Mulligan, chairman of industry group IPSO Alliance, which focuses on enabling communication among smart objects. Its 60 members include Oracle Corp. (ORCL), Nokia Oyj (NOK1V) and Cisco Systems Inc. (CSCO), which is working with Streetline to install a sensor network to monitor parking spaces on its campus.
“We’ve not even conceived the possibilities of applications for this,” Mulligan said. “We are at a tipping point for it to begin to explode.”
WSJ: Andy Kessler: Tech Revolution will create jobs, growth
Reply #12 on: August 08, 2013, 08:54:27 AM
Robots, 3-D Printers and Other Looming Innovations
These new technologies will create far more high-paying jobs than they destroy.
By ANDY KESSLER
Is technology to blame for our stubborn unemployment? President Obama scored ATMs and airport kiosks in a 2011 "Today Show" interview, blaming them for "structural issues with our economy." Other technophobes have piled on. Former Labor Secretary Robert Reich wondered late last year: "What if we're stuck at a new normal of high unemployment and low job growth? It's possible because technology might just have gotten the best of us." In March, Salon.com declared "Your iPhone kills jobs." And in May, the Economist suggested "technology may destroy more jobs than it creates."
These Luddites are wrong.
The road to wealth does indeed pass through the graveyard of today's jobs. But history shows that better, higher paying jobs are always created by technology—even if no one seems to remember this during periods of creative destruction.
The trick is to lower the cost of new machines and inventions that can do things never before possible, making them available for wide use. Here are a few recent examples that could be economic game-changers and job-creators:
• Three-dimensional printing. By now, you've probably heard of 3-D printers, the gadget du jour for geeks that creates objects from computer models, building them layer by layer. Right now, they are priced between $1,000 and $10,000, with the most expensive version able to print multiple materials.
Some 50,000 have been sold so far, mostly to designers in order to create expensive prototypes. But by next year, the expiration of a key patent on "laser sintering," a method that fuses materials with lasers, will mean cheaper 3-D printers and the growth of an industry of producers of hard-to-find auto parts, industrial designs and even jewelry. Some are even talking about the possibility of printing human tissue.
(Photo caption: Robotic painting arms are tested in the underbody sealing and coating area of the Chrysler Group LLC assembly plant in Sterling Heights, Michigan, in July.)
This reminds me of laser printers, which now cost $200, but cost $17,000 when the Xerox Star 8010 hit the market in 1981. Today, most magazines and office jobs couldn't exist without them.
• Blood markers. As ObamaCare inevitably drives health-care costs higher, it will be in the best interest of insurers to detect disease early to prevent the high cost of treatment. This is particularly true with cancer. Each cancer tumor expresses unique proteins that end up in our blood streams. Once found, these markers can be used to identify the disease before it spreads.
Postgraduate biochemists have been working with $100,000 mass spectrometers to analyze results in studies. This spring, blood markers were even discovered for Alzheimer's. As cheaper machines come online in the next few years, the process will go mainstream. In turn, researchers will invest in more machines and will need to hire thousands of scientists to man them.
• Gene therapy. If cancer is detected, it can be cut out, radiated or drugged. But often the underlying gene mutations that cause cancer and other diseases persist.
The era of gene therapy may be upon us. By hacking genes—in effect, genetically engineering a retrovirus programmed with DNA to overwrite a particular mutation—the body then does the work of reversing disease. Dogs have been cured of Type 1 diabetes with gene therapy, and Glybera, a gene therapy treatment, has been approved by the European Union for treating people with lipoprotein deficiency.
The equipment to do this, fast protein liquid chromatography and polymerase chain reaction machines, already exists and I suspect will get cheaper with volume. This requires smart scientists at every hospital to identify mutations, harvest stem cells and then reprogram and re-inject them.
This is much like gene splicing, which Genentech used 30 years ago to create insulin, and the ever-cheaper DNA sequencing of the last decade. Gene therapy will be a boon for biologists and chemists.
• Funding platforms. Last year in the U.S., venture capitalists, who have traditionally funded our entrepreneurs, invested just $27 billion in 3,800 companies. It's inefficient.
Fortunately, new funding platforms like Kickstarter and Indiegogo allow entrepreneurs and creative types to appeal directly to users to fund their projects, much as eBay and Craigslist did for individual merchants.
Kickstarter has helped raise $619 million for more than 45,000 projects. I expect this to grow 100-fold over the next decade, and each funded project will create new jobs.
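A quick sanity check on those figures (the $619 million and 45,000 projects are from the article; the 100-fold multiplier is the author's projection, not a forecast of mine):

```python
# Average raised per funded Kickstarter project, using the article's figures.
total_raised = 619_000_000   # dollars (from the article)
projects = 45_000            # funded projects (from the article)

avg_per_project = total_raised / projects
print(f"Average per project: ${avg_per_project:,.0f}")  # roughly $13,756

# The author's projected 100-fold growth over a decade would imply:
projected_total = total_raised * 100
print(f"Projected total: ${projected_total / 1e9:.1f} billion")  # $61.9 billion
```

For scale, that projected $61.9 billion would dwarf the $27 billion the article says venture capitalists invested last year.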
• Robots. Robots have already been welding car parts, making the chips in your iPhone and moving things around warehouses for many years. But now they are getting personal. Not so much R2-D2, but programmable for repetitive tasks.
At $22,000, Baxter from Boston's Rethink Robotics is easy to train and human-friendly (it stops if you stick your hand near it). Yes, it's cheaper than a human worker, but like laser printers and soon 3-D printers, it will augment rather than replace people.
Robots might bring back some mass manufacturing from offshore, but the real upside is that they can do things that haven't been done before. Like programmable machine tools, these more personal robots will change the way small- and medium-size manufacturers operate, lowering costs and creating a hiring binge for those savvy enough to use them.
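The "cheaper than a human worker" claim is easy to sketch as a back-of-the-envelope amortization. The $22,000 price is from the article; the lifetime, operating hours and upkeep figures below are illustrative assumptions, not vendor data:

```python
# Rough amortized hourly cost of a $22,000 robot such as Baxter.
# Price is from the article; everything else is an assumption for illustration.
ROBOT_PRICE = 22_000.00      # purchase price (from the article)
LIFETIME_YEARS = 3           # assumed useful life
HOURS_PER_YEAR = 2_000       # assumed single-shift operation
UPKEEP_PER_YEAR = 1_000.00   # assumed maintenance and power

def robot_hourly_cost(price, years, hours_per_year, upkeep_per_year):
    """Total cost of ownership divided by total operating hours."""
    total_cost = price + upkeep_per_year * years
    total_hours = hours_per_year * years
    return total_cost / total_hours

cost = robot_hourly_cost(ROBOT_PRICE, LIFETIME_YEARS, HOURS_PER_YEAR, UPKEEP_PER_YEAR)
print(f"Effective robot cost: ${cost:.2f}/hour")  # about $4.17/hour under these assumptions
```

Even under these rough assumptions the hourly figure lands well below any plausible wage, which is the economic pressure the column describes.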
Add the above to the manufacturing renaissance driven by fracking, and tens of millions of new jobs will be created by technology. Most of these great, higher-paying jobs will require smart college graduates or retrained workers. But, as always, each great job creates a handful of good jobs, such as lab assistants and human resources professionals, and another handful of fine jobs in food service and retail.
Just as politicians and policy makers are currently blaming these new technologies for destroying jobs, once the technologies start creating jobs, those in Washington will inevitably take credit. These types are best advised to clear a path and stay out of the way.
Mr. Kessler, a former hedge-fund manager, is the author most recently of "Eat People" (Portfolio, 2011).
Reply #13 on:
August 26, 2013, 07:35:38 AM »
Wonder Material Ignites Scientific Gold Rush
Atom-Thin Graphene Beats Steel, Silicon; A Patent "Land Rush"
Graphene is an extremely thin, strong and flexible material derived from the graphite found in everyday pencils. Scientists are racing to exploit those attributes for an array of new applications. WSJ's Gautam Naik reports. Photo: Daniella Zalcman.
CAMBRIDGE, England—A substance 200 times stronger than steel yet as thin as an atom has ignited a global scientific gold rush, sending companies and universities racing to understand, patent and profit from the skinnier, more glamorous cousin of ordinary pencil lead.
The material is graphene, and to demonstrate its potential, Andrea Ferrari recently picked up a sheet of clear plastic, flexed it and then tapped invisible keys, triggering tinkly musical notes.
The keyboard made at Dr. Ferrari's University of Cambridge lab was printed with a circuit of graphene, which is so pliable that scientists predict it will fulfill dreams of flexible phones and electronic newspapers that can fold into a pocket.
Dr. Andrea Ferrari, head of graphene research at the University of Cambridge, inspects equipment used for experiments on the atom-thick material.
It is the thinnest material known. But it is exceedingly strong, light and flexible. It is exceptional at conducting electricity and heat, and at absorbing and emitting light.
Scientists isolated graphene just a decade ago, but some companies are already building it into products: Head NV introduced a graphene-infused tennis racket this year. Apple Inc., Saab AB and Lockheed Martin Corp. have recently sought or received patents to use graphene.
"Graphene is the same sort of material, like steel or plastic or silicon, that can really change society," says Dr. Ferrari, who leads a band of about 40 graphene researchers at Cambridge.
Graphene faces hurdles. It is still far too expensive for mass markets, it doesn't lend itself to use in some computer-chip circuitry and scientists are still trying to find better ways to turn it into usable form. "Graphene is a complicated technology to deliver," says Quentin Tannock, chairman of Cambridge Intellectual Property, a U.K. research firm. "The race to find value is more of a marathon than a sprint."
Interest in graphene has exploded since 2010, when two researchers won a Nobel Prize for isolating it. Corporate and academic scientists are now rushing to patent a broad range of potential uses.
"As soon as I find something, boom! I file a patent for it," says James Tour, a graphene expert at Rice University in Houston.
Apple has filed to patent graphene "heat dissipators" for mobile devices. Saab has filed to patent graphene heating circuits for deicing airplane wings. Lockheed Martin this year was granted a U.S. patent on a graphene membrane that filters salt from seawater using microscopic pores.
Others have applied for patents on graphene used in computer chips, batteries, flexible touch screens, anti-rust coatings, DNA-sequencing devices and tires. A group of scientists in Britain has used a graphene membrane to distill vodka.
There were 9,218 published graphene patents and patent applications filed cumulatively as of May around the world, up 19% from a year earlier, says Cambridge Intellectual. Over the past five years, it says, the cumulative number of graphene patent filings has more than quintupled.
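Those growth figures can be translated into an implied annual rate. "More than quintupled" over five years is from the article; treating quintupling as a lower bound:

```python
# Implied average annual growth if cumulative graphene patent filings
# "more than quintupled" over five years (figure from the article).
growth_factor = 5   # quintupling, a lower bound per the article
years = 5

cagr = growth_factor ** (1 / years) - 1
print(f"Implied annual growth: {cagr:.1%}")  # about 38% per year
```

That roughly 38% compound rate is consistent with the separately reported 19% single-year rise only if filing growth was much faster earlier in the five-year window, i.e. the land grab may already be decelerating.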
"It's a land grab," says Mr. Tannock of Cambridge Intellectual. By trying to patent just about every finding, "you have the option for suing your competitors later and stopping them." Many graphene patent filings appear legitimate, but some seem speculative and others may be decoys to mislead rivals, he says.
'As soon as I find something, boom! I file a patent,' says an expert in graphene, which coats the disc, pictured.
Graphene's biggest short-term promise is in high-speed electronics and in flexible circuitry such as that in Dr. Ferrari's keyboard, because of expected demand for use in pliant electronic displays. Companies such as South Korea's Samsung Electronics Co. and Finland's Nokia Corp. have filed for patents covering various graphene uses in mobile devices.
One of the hottest areas is graphene ink used to lay down circuitry, which a few companies have begun to sell. Dr. Ferrari's lab last year filed for a patent on a graphene ink that can be deposited by inkjet printers. BASF SE is experimenting with graphene ink to print flexible circuits into upholstery that can heat car seats, a technology it says could be in the market in a few years.
"Graphene combines various effects" that make it distinctive, says Matthias Schwab, a lab team leader in BASF's graphene-research operation. "I am seeing no other materials that can do it."
In effect, graphene has only two dimensions, in a microscopic structure that resembles chicken wire. In a study published five years ago, Columbia University researchers concluded it was the strongest material measured. They calculated it would take an elephant balanced on a pencil to puncture a graphene sheet the thickness of Saran Wrap.
It absorbs and emits light over the widest range of wavelengths known for any material. It conducts electricity far better than silicon. Unlike silicon, which is brittle, graphene is flexible and stretchable.
Graphene circuitry promises to eventually be cheaper than conductive materials such as copper and silver because it can be made from graphite—the plentiful stuff of an ordinary pencil lead—and can also be created by combining certain gases and metals, or synthesized from solid carbon sources.
Rice University's Dr. Tour demonstrated in 2011 that graphene can be synthesized using carbon from sources as diverse as grass, Girl Scout cookies and cockroach legs.
Dr. Tour's lab has filed for multiple graphene patents, including for ribbons to reinforce composites that he says are strong enough to use in high-pressure natural-gas tanks that can be molded into cars. Patenting quickly, he says, "gives us a foothold on the technology."
One factor holding graphene back is cost. Some U.S. vendors are selling a layer of graphene on copper foil for about $60 a square inch. "It needs to be around one dollar per square inch for high-end electronic applications such as fast transistors, and for less than 10 cents per square inch for touch-screen displays," estimates Kenneth Teo, a director at the Cambridge unit of Germany's Aixtron SE, which makes machines to produce graphene.
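Put differently, Mr. Teo's price targets imply the following cost reductions (all three dollar figures are from the article):

```python
# Cost reductions implied by the price targets quoted in the article.
current_price = 60.00        # $/sq inch, graphene on copper foil today
transistor_target = 1.00     # $/sq inch needed for high-end electronics
touchscreen_target = 0.10    # $/sq inch needed for touch-screen displays

print(f"Needed for transistors:   {current_price / transistor_target:.0f}x cheaper")   # 60x
print(f"Needed for touch screens: {current_price / touchscreen_target:.0f}x cheaper")  # 600x
```

A 60-fold to 600-fold cost reduction is why Mr. Tannock calls the race a marathon rather than a sprint.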
Graphene must often be combined with other materials to exploit its properties, and scientists are still trying to figure out how to do that effectively.
It also has a significant shortcoming: It can't easily be made into a switch. International Business Machines Corp. was initially optimistic about using graphene in computer chips but found electrons travel too fast in it to switch off easily, making it hard to turn current into the "ones" and "zeros" of digital code.
Labs around the world are trying to solve the problem. But for now, "we don't see graphene replacing silicon in microprocessors," says Supratik Guha, director of physical sciences at IBM's research unit, who says he remains a big proponent of graphene. IBM is a major graphene-patent filer.
Graphene could still meet the fate of other touted materials that failed to live up to their promise. The discovery of high-temperature superconductors garnered a Nobel Prize in 1987 and led to a flood of patents and predictions of technologies such as superfast magnetically levitated trains. The world is still waiting.
That still leaves plenty of scientific enthusiasm. In 2012, scientists published 45% more papers on graphene than in 2011, according to Thomson Reuters Web of Science, an index of journals.
It's a global race: Chinese entities had filed for the most graphene-patent applications cumulatively as of May, followed by U.S. and South Korean filers, says Cambridge Intellectual. Samsung accounted for the most filings, followed by IBM and South Korea's Sungkyunkwan University.
While labs work out graphene's kinks, some of the patents have found their way into products. Vorbeck Materials Corp., of Jessup, Md., makes a graphene ink it says is being used to print circuits in antitheft packaging in a few U.S. stores, which it declined to name.
Head's racket is reflected in an application it filed for a patent on graphene in a wide range of sports gear, from golf clubs to ski bindings. A Head representative referred inquiries to its website, which says graphene's strength lets it use less material in the racket, allowing the designer to redistribute the weight for more power.
Bluestone Global Tech Ltd., a Wappingers Falls, N.Y., startup, makes graphene sheets it says it ships to customers in the U.S., Singapore and China. "Within half a year, graphene will be used for touch screens in commercially available cellphones," predicts Chung-Ping Lai, its chief executive officer.
The graphene frenzy was unimaginable before 2003, when many scientists believed an atom-thick layer of anything couldn't keep from falling apart.
That year, Andre Geim stumbled upon graphene's wonders. A Russian-born scientist at the University of Manchester in Britain, he wanted thin graphite to study its electrical properties. A doctoral student suggested using cellophane tape.
Dr. Geim and his colleagues used the tape to peel off layers of graphite until they got to a layer so thin it was transparent. When they could peel no further, they had graphene. Not only did it not fall apart, it was strong, flexible and possessed astonishing electrical properties.
Other scientists were initially skeptical. "Not many people believed us," says Dr. Geim. But by March 2006, when he presented at a Baltimore conference, his session was packed, recalls Cambridge's Dr. Ferrari. "Finally, I understood how significant the material was going to be," he says.
In 2010, Dr. Geim and a colleague, Konstantin Novoselov, won the Nobel Prize in physics for their graphene work. By that time, corporate labs, universities like Rice and Harvard University, and academic institutions in China had begun to increase graphene research. In 2010, Japanese and South Korean scientists unveiled prototype graphene touch screens.
Labs at Samsung and Sungkyunkwan University, in particular, began to stand out for the volume of their research. "Although the basic research on graphene started in Europe and the U.S., the early research for commercial applications started in Korea," says Changgu Lee, a Sungkyunkwan graphene researcher. "We want to keep the lead."
A Samsung spokeswoman declined to comment on the company's graphene work.
Among those expressing enthusiasm for graphene is the U.S. military. In late 2011, the U.S. Army Research Laboratory in Adelphi, Md., signed an agreement to study graphene's properties with Northeastern University in Boston. The agreement is mainly funded by a $300,000 grant from the Defense Advanced Research Projects Agency, or Darpa.
The university plans to use graphene to design better night-vision goggles and other such detectors, says Srinivas Sridhar, a Northeastern physics professor. A Darpa representative, in an email, confirmed the project.
A walk through Dr. Ferrari's labs this summer gave a window into the research. One of his associates, Felice Torrisi, showed how tape could peel graphene from a graphite clump. "This is obviously not scalable" for industrial purposes, said Dr. Torrisi.
That speaks to a big goal in the graphene race: finding the best ways to manufacture it. A large number of patent filings describe methods of manufacturing graphene.
Dr. Torrisi next held up a vial of ink consisting of graphene in water. A nearby inkjet printer whizzed away, depositing the ink on a plastic sheet to form a near-invisible circuit. Ink printed on plastic was the trick behind the keyboard Dr. Ferrari tapped to trigger music from attached electronics.
In other Cambridge lab rooms, researchers showed off an early prototype of a graphene-based laser that can shoot out ultrafast pulses of light and graphene sensors that can detect any wavelength of light.
Graphene's heat-conducting properties appear to be at the heart of Apple's patent application, which includes drawings of a graphene "heat dissipator" behind components in a "portable electronic device." An Apple spokeswoman declined to comment.
Saab wants to take advantage of graphene's lightness and conductivity by embedding it in wings for deicing. The research is still in early stages, "but it is certainly part of our plan for introducing flying applications," says Mats Palmberg, who oversees future products at Saab's aeronautics unit.
Lockheed expects its graphene membrane to be "more effective at seawater desalination at a fraction of the cost" of current technologies, it says in a news release.
The discovery of graphene has also led scientists to hunt down scores of other two-dimensional materials with unusual properties, says Dr. Geim, the Nobel laureate. "Graphene opened up a material world we didn't even know existed."
Graphene, Carbyne? I need GG!!!
Reply #14 on:
September 02, 2013, 09:12:20 PM »
I have been getting some email alerts about graphene having some sort of incredible characteristics: being a single atom thin with more strength than any other material, having something like twice the conducting properties of copper, being flexible, and incredibly light. Leave it to the geniuses at MIT to find something that may be even better, carbyne. Could either of these replace silicon? GG where are you?
Jonah Goldberg on the Implications of Technology
Reply #15 on:
December 07, 2013, 12:22:51 PM »
The Goldberg File
By Jonah Goldberg
Dec. 6, 2013
Dear Reader (including our new robot overlords),
The above "Dear Reader" gag, while not technically funny ("Is it funny in some non-technical sense we cannot discern?" -- The Couch), is a reference to my column today, which makes the point that the minimum wage is a boon to robots. If you make human labor more expensive, non-human labor becomes more attractive. If you tell a car-wash owner that he has to pay his employees $100 an hour, he will most likely search his desk for that business card from that salesman from Acme Robots.
Robots have lots of things going for them. They don't steal from the cash register. They don't show up late with some sob story about how their dog ate their car keys. They don't spit in the customer's food or lick the tacos and post pictures of it on the Internet. Robots don't file sexual-harassment suits just because you got over-served at the Christmas party and thought it would be funny to hand out photocopies of your butt.
Thingamabobs Have Consequences
Longtime readers of mine might think of taking a speed-reading course so they don't have to take so long reading. They might also recall that I think technology is a greater challenge to conservatism than ideas are. I wrote about this at some length in a G-File last February.
My main point is that conservatism -- full-spectrum, traditional conservatism and not just a checklist of timeless principles, or a political agenda in Washington -- requires an appreciation, even love, for the way things are. And technology forces change more than ideas do (indeed, many of our ideas are simply the sparks that fly from the friction of technological change). Sure, Richard Weaver was right when he said, "Ideas have consequences." But you know what are really consequential? Thingamabobs, geegaws, doohickeys, and whoziwhatsits.
Technology is why Thulsa Doom goes looking for iron, Sauron wants his ring, and why everyone in Westeros wants a dragon (while technically not technology, dragons are a reasonable substitute for, say, an F-16). But I'm not going to get into a whole "power flows from the tip of a gun" argument. Instead, think about all of the things you associate with the traditional agrarian life -- the cocaine, the hookers, the creepy guy with sideburns wearing nothing but a Members Only jacket and speedo following everyone around with a video camera -- oh, wait those are the things we associate with the Arkansas governor's mansion in the 1980s when Hillary Clinton was out of town. I meant to say, the daily toil in the sun, the intimate relationship with the land, the reliance on the benevolence of God and Nature to provide sufficient rain and sun, etc.
Those physical necessities were intimately linked to emotional commitments and intellectual convictions. Now think about what modern technology did to all of that. The tractor, modern irrigation, pesticides, industrial fertilizers, biotechnology: These things did more to upend settled worldviews than any stupid French or German ideas ever could. But it's easy to argue with some French pinhead at a café; it's more difficult to argue with a tractor, and not just because tractors don't talk ("They might, buddy." -- The Couch). No, it's more difficult to argue with a tractor because a tractor is an obvious improvement. It speaks through results, nothing more.
As I've written before, accepting the role technology has in changing the political facts on the ground is what Whittaker Chambers called "the Beaconsfield position" (Edmund Burke was from Beaconsfield). In a letter to Bill Buckley he wrote:
Briefly, I remain a dialectician; and history tells me that the rock-core of the Conservative Position, or any fragment of it, can be held realistically only if conservatism will accommodate itself to the needs and hopes of the masses -- needs and hopes, which, like the masses themselves, are the product of machines. For, of course, our fight, as I think we said, is only incidentally with socialists or other heroes of that kidney. [Essentially], it is with machines. A conservatism that cannot face the facts of the machine and mass production, and its consequences in government and politics, is foredoomed to futility and petulance. A conservatism that allows for them has an eleventh-hour chance of rallying what is sound in the West. All else is a dream, and, as [Helmuth] von Moltke remarked about universal peace, "not a very sweet dream at that." This is, of course, the Beaconsfield position. Inevitably, it goads one's brothers to raise their knives against the man who holds it. Sadder yet, that man can never blame them, for he shares their feelings even when directed against himself, since he, no less than they, is also a Tory. Only, he is a Tory who means to live. And to live is not to hold the lost redoubt. To live is to maneuver.
I know many liberals agree with this sort of argument. They use it to advance the idea that the Constitution is outdated -- "the Founding Fathers didn't know about airplanes!!!!! So therefore guns for no one and abortions for everybody!" -- but I take from this the opposite conclusion. I believe in encouraging innovation, yet I also think the rapidity of technological change should make us revere enduring institutions more, not less. Normally, it'd be around here that I'd bring up Chesterton's fence again. But I've been getting my Burke on of late. I review Yuval Levin's wonderful book, The Great Debate, for the upcoming issue of Commentary, and it's had me rereading and renoodling a lot of stuff.
I couldn't get too deep into it in the review, but one of the things I find most interesting in Burke is his Hayekian side. He doesn't believe that the past is a repository of genius or perfection, far from it. He believes the present is better than the past -- at least his present -- and that society should move toward a perfect, albeit unattainable, ideal. But what makes improvement possible is continuity with the past, not breaking from it.
Yuval contrasts Burke's views with Thomas Paine's faith in the "Eternal NOW" (the all caps are Paine's). Paine believes every generation is the only game in town and it needs to align everything with its needs and principles. Burke believes that each generation inherits an already existing society from its parents and is obliged to try to leave it in slightly better shape for the next generation. If "the whole chain and continuity of the commonwealth would be broken and no one generation could link with the other," Burke writes, then "men would become little better than the flies of a summer."
Burke, of course, is right. The challenge for each new generation is figuring out what's worth keeping and what's worth tinkering with. The progressive attitude is that everything is eligible not just for tinkering, but wholesale replacement. The people who lived yesterday were idiots, but we are geniuses! The conservative attitude is to assume that our parents and grandparents weren't fools and that they did some things for good reasons. But -- and here is the Hayekian part -- it's also possible that some things our forebears bequeathed us are good for no "reason" at all. Friedrich Hayek argued that many of our institutions and customs emerged from "spontaneous order" -- that is, they weren't designed on a piece of paper; they emerged, authorless, to fulfill human needs through lived experience, just as our genetic "wisdom" is acquired through trial and error. Paths in the forest aren't necessarily carved out on purpose. Rather, they emerge over years of foot traffic.
This reminds me of a story Kevin Williamson tells in his book.
There is a lovely apocryphal story, generally told about Dwight D. Eisenhower during his time as president of Columbia University: The school was growing, necessitating an expansion of the campus, which produced a very hot dispute between two groups of planners and architects about where the sidewalks should go. One camp insisted that it was obvious -- self-evident! -- that the sidewalks had to be arranged thus, as any rational person could see, while the other camp argued for something very different, with the same appeals to obvious, self-evident, rational evidence. Legend has it that Eisenhower solved the problem by ordering that the sidewalks not be laid down at all for a year: The students would trample paths in the grass, and the builders would then pave over where the students were actually walking. Neither of the plans that had been advocated matched what the students actually did when left to their own devices. There are two radically different ways of looking at the world embedded in that story: Are our institutions here to tell us where to go, or are they here to help smooth the way for us as we pursue our own ends, going our own ways?
The paths were formally recognized by the planners only after the paths were created through human experience. In the parable of the fence, Chesterton says you must know why the fence was built before you can tear it down. But Burke and Hayek get at something even deeper: What if no one built the fence? Okay, that would be weird. But metaphorically, what if no one built it? Or what if everyone built the fence without realizing it? What if we are surrounded by fences that were never consciously built or planned but were instead the natural consequence of lived experience, like the footpath at Columbia?
My inner Hayek and Burke believes this to be the case. So much of what makes civilization civilized is intangible, spontaneous, and mysterious. An unknowable number of our greatest laws are hidden, our greatest wisdom is authorless, and our most valuable treasures are in our hearts. This should foster enormous humility about how to out-think humanity. The rules should follow the experience, whenever possible, not the other way around. Burke once told a friend that "every political question I have ever known has had so much of the pro and con in it that nothing but the success could decide which proposition was to have been adopted."
Thomas Paine, anticipating a lot of anarcho-libertarian types, believed that if you looked back to the founding of monarchies you'd find mere villains and brutes who established themselves as better than their fellow men by force of arms. And he was right! A point Burke sort of tacitly acknowledged. But his response to such claims was, in effect, so what? Of course there was evil in the past, of course mistakes were made. But as societies advance, they slowly -- sometimes too slowly (see "Slavery, U.S.") -- correct the mistakes or simply build successes on top of them. This is something I was getting at in The Tyranny of Clichés in the chapter on dogma.
It is true that many dogmas are built upon mistakes. But that doesn't mean the resulting edifice is not worthwhile. A ship may sink because of the blunder of the captain, but the resulting sunken wreckage beneath the waves may serve as a bountiful reef supporting a wealth of new life. So it is with humanity and her institutions. Columbus "discovered" America by mistake and the world is better for what was built upon that mistake. How many beloved children were born thanks to some capricious accident? We are told that the institution of monogamous marriage between a man and a woman was a mistake, unchartered by the laws of evolution and unlicensed by the conclusions of science. Maybe so. But what was built upon the rock of that "mistake" is not so easily or desirably undone even if we are willing to admit the existence of an error committed somewhere in the ancient recesses of prehistory. If tomorrow science tells us that it would make more sense to make stoplights green instead of red, the price of the resulting chaos would not be worth the gains in rational organization. Indeed, a reasonable man understands that the costs of ripping up the old and tried are often too expensive for the theoretical promises of the new and untried.
Where Was I?
Oh right, robots. I am open to the idea that our robot future will be super-terrific awesome. But I am far from convinced. Indeed, I'm downright nervous about it. Humans find happiness through finding meaning in their lives. For many of us that comes from faith, family, and friends. But it also comes from work -- both in the occupational sense, but also in the sense of struggling to accomplish something. I don't think there's nobility in poverty, but I do think there's nobility in work, even menial work. Indeed, as anyone who has had a menial job will attest, they can be the most rewarding, because they build good character and ingrain good habits.
Now, it's entirely possible that robots will free lots of people from drudgery and let them become full-time spoken-word poets or basset-hound wranglers. Maybe robots will make it easier for us to do complete re-enactments of the mall chase scene from the Blues Brothers in Lego. Note, I said "easier" not "possible" because it's already been done. I for one would be delighted to have a permanent robot slave rub my feet while I write this "news"letter. But when you look at what is already happening to men in our society as good-paying strong-back jobs vanish, I can't help but worry that robots won't just take menial and dangerous jobs, they will, for some people at least, also take many of the most redeeming habits of the human heart as well.
WSJ: Move to reform Patent Law
Reply #16 on:
December 16, 2013, 10:24:38 AM »
Jimmy Carter's Costly Patent Mistake
His 1979 proposal has led to ill-conceived protection for software ideas and a tidal wave of litigation.
L. Gordon Crovitz
Dec. 15, 2013 6:31 p.m. ET
Washington doesn't agree on much, but all three branches of government now have plans to reform the country's patent system. What's not widely understood is that this marks the failure of one of Washington's most ambitious experiments in industrial policy.
Today's patent mess can be traced to a miscalculation by Jimmy Carter, who thought granting more patents would help overcome economic stagnation. In 1979, his Domestic Policy Review on Industrial Innovation proposed a new Federal Circuit Court of Appeals, which Congress created in 1982. Its first judge explained: "The court was formed for one need, to recover the value of the patent system as an incentive to industry."
The country got more patents—at what has turned out to be a huge cost. The number of patents has quadrupled, to more than 275,000 a year. But the Federal Circuit approved patents for software, which now account for most of the patents granted in the U.S.—and for most of the litigation. Patent trolls buy up vague software patents and demand legal settlements from technology companies. Instead of encouraging innovation, patent law has become a burden on entrepreneurs, especially startups without teams of patent lawyers.
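The quadrupling claim implies a rough baseline for the pre-Federal Circuit era. Both numbers below come straight from the article; the division is just a sanity check:

```python
# Rough earlier baseline implied by "quadrupled, to more than 275,000 a year".
current_per_year = 275_000   # from the article (a floor, since it says "more than")
growth_factor = 4            # "quadrupled", from the article

baseline = current_per_year / growth_factor
print(f"Implied earlier rate: about {baseline:,.0f} patents a year")  # ~68,750
```

In other words, the system went from roughly 69,000 grants a year to over a quarter-million, with software patents accounting for most of the growth.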
Samsung and Apple attorneys battle over software patents, Nov. 15. Reuters
Until the court changed the rules, there hadn't been patents for algorithms and software. Ideas alone aren't supposed to be patentable. In a case last year involving medical tests, the U.S. Supreme Court observed that neither Archimedes nor Einstein could have patented their theories.
Many software patents simply describe ideas that happen to be carried out through digital technology. Amazon got a patent for the concept of "one-click checkout." Apple last year applied for a patent on the idea of offering author autographs for e-books. There are so many software patents that smartphones include some 250,000 purportedly patented processes, which is why Google, Samsung and Apple are suing one another around the world.
In software, innovations build on one another so seamlessly there is no way to follow them. There is no national registry of software. Developers and engineers can't track who claims patents to what processes. In contrast, drug researchers consult a publication called the Orange Book that lists all the patents for pharmaceuticals, enabling them to avoid infringements.
A system of property rights is flawed if no one can know what's protected. That's what happens when the government grants 20-year patents for vague software ideas in exchange for making the innovation public. In a recent academic paper, George Mason researchers Eli Dourado and Alex Tabarrok argued that the system of "broad and fuzzy" software patents "reduces the potency of search and defeats one of the key arguments for patents, the dissemination of information about innovation."
The Government Accountability Office agrees. "Many recent patent infringement lawsuits are related to the prevalence of low-quality patents; that is, patents with unclear property rights, overly broad claims, or both," it said in a recent report. "Claims in software-related patents are often overly broad, unclear or both." Boston University law professors Michael Meurer and James Bessen have estimated the direct and indirect costs of litigation against technology companies at $80 billion per year.
Instead of focusing on the problem with software patents, reforms backed by the White House and Congress would tweak patent litigation for all industries. The House this month passed a bill requiring more specificity in claims and limiting costly discovery, but doing nothing about dubious software patents.
The House rejected a proposal that would have expedited the process for the Patent Office to review questionable software patents. Lobbyists from companies like IBM and Microsoft, which make billions of dollars a year from licensing software patents, helped block this reform.
For now, the best prospect for real reform is in the Supreme Court, which earlier this month agreed to hear CLS Bank v. Alice Corp., a case about whether a bank's computerized process for settling transactions via an escrow can be patented. A judge on the appeals court noted this idea was "literally ancient," developed during the Roman Empire, and should not get a patent now just because a computer is involved.
The Supreme Court has invalidated software patents in earlier cases, but the justices need to draw a brighter line with clear limits for the lower courts, especially the Federal Circuit. Simply qualifying ideas or business processes with the phrase "and do it on a computer" shouldn't be enough.
The justices should also acknowledge that creating a special court to promote patents is an experiment gone awry. Far from helping the economy, software patents are a litigation tax on new technology. The Constitution calls for patents "to Promote the progress of Science," not for patents to undermine innovation.
Freakishly realistic telemarketing robots
Reply #17 on:
December 16, 2013, 11:10:02 AM »
How Robots will change the World
Reply #18 on:
December 16, 2013, 02:14:01 PM »
Google/GOOG has purchased Boston Dynamics, a developer of advanced robots and related software for the U.S. military. Boston has "gained an international reputation for machines that walk with an uncanny sense of balance and...run faster than the fastest humans," the NYT writes.
Boston's products include Atlas, a humanoid robot able to handle difficult terrain; and Cheetah, the fastest legged robot in the world with a top speed of over 29 mph.
Boston is Google's eighth robotics acquisition this year, with the Web giant looking at manufacturing and retail applications.
A video of Boston Dynamics' BigDog is scarily impressive, especially when you consider this technology is from 5+ years ago...
Last Edit: December 16, 2013, 02:21:33 PM by Crafty_Dog
Robots replacing many jobs
Reply #19 on:
December 19, 2013, 11:54:42 AM »
Reply #20 on:
January 01, 2014, 01:16:51 PM »
If accurate, this is mind-blowing:
3D robo hand
Reply #21 on:
April 16, 2014, 10:54:42 AM »
Kessler: Cheap smart phones
Reply #22 on:
May 13, 2014, 10:22:10 PM »
The Cheap-Smartphone Revolution
When the price hits $35 or less, it will have an astonishing global impact.
By Andy Kessler
May 12, 2014 6:48 p.m. ET
An iPhone costs $649. Other makers are talking about the $35 smartphone, or maybe even $25. That might explain why Apple has been so litigious over patents and why the company is spending big—about 30% more than last year—to develop new features. On May 2, Apple was awarded $119 million in damages after a court ruled that Samsung had infringed on patents by copying Apple's features, designs and technology. But damage payments won't stop what's coming for the industry: We're entering a revolutionary era of the cheap smartphone.
About 285 million smartphones were shipped in the first quarter of 2014, according to Strategy Analytics, and more than a billion will ship this year. Not cellphones—smartphones. Samsung and Apple accounted for almost half of them.
The business is staggeringly lucrative. The research firm iSuppli rips apart smartphones to figure out what the materials cost. iSuppli estimates that the materials in a 16-gigabyte iPhone 5S cost $191, though the product sells for $649 without a contract with AT&T or Verizon. The iPhone 5C materials come in at $166, selling for $549 without a contract. The Samsung Galaxy S5 contains $251 of materials.
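The markup implied by those teardown figures is easy to work out. A quick sketch (materials cost only; it ignores assembly, R&D and marketing, so the true profit margin is lower than these percentages suggest):

```python
# Gross-margin arithmetic implied by the iSuppli teardown figures above
# (materials cost vs. unsubsidized retail price only).
phones = {
    "iPhone 5S": (191, 649),
    "iPhone 5C": (166, 549),
}

for name, (materials, price) in phones.items():
    margin = (price - materials) / price * 100
    print(f"{name}: {margin:.0f}% of the price is not materials")
    # iPhone 5S: 71%, iPhone 5C: 70%
```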
But we're not buying chips and glass. What we pay for is the experience of the look, feel and touch—for the software, operating system, graphical user interface and apps. Samsung and most of the other smartphones in the non-Apple 85% of the market use Android, which Google provides free, making up for Android development costs by selling boatloads of search ads.
Apple thinks that its software is, unlike Android, worth more than free. The company filed the patent suit against Samsung and HTC to slow down Android's development, but also to try to maintain the value of Apple's code-writing that provides all the magical features. But it has been seven years since the iPhone was introduced. Commoditization, when consumers realize that your product is no different from what your competitor sells, is creeping up. When personal computers began, they sold for $5,000. Google now sells a laptop for $249. The same downward price pressure is about to happen with smartphones.
As Business Insider reported from the Mobile World Congress in Barcelona in February, the buzz was about a Chinese manufacturer showcasing a $35 smartphone and about Firefox's flirting with selling one for $25. My contacts in China report similar prices.
We're entering a new stage—call it the post-iPhone era. Cheap. Smart. Ubiquitous. Profits then flow to the best services that utilize smartphones. Facebook is cleaning up with its mobile-ad sales. Twitter, Snapchat and Instagram are all driven by smartphones. Uber and hundreds of new apps wouldn't exist without smartphones. It makes sense that Apple is interested in acquiring Beats Audio for a reported $3.2 billion. And all this is at a billion smartphones. What happens at three billion devices? Or four billion?
First, dirt-cheap smartphones will have astonishing implications for the global economy: Smartphones are a productivity platform for wealth creation. Americans may waste days playing Piano Tiles or Clash of Clans, but the developing world can build lives with a $35 smartphone. Roughly 20% of the world population earns less than $2 a day, and the cost of a smartphone just dropped from a year's earnings to three weeks' pay. Now that's an antipoverty program.
Google has launched Project Loon, bringing the Internet to rural and developing areas through high-altitude balloons. Facebook and Google are rushing to invest in drones to loop around and provide mobile connectivity. Poor villages and townships will finally have a platform to escape despair. Now we need applications to use $5-a-day workers to eyeball documents, photos, blueprints and anything that requires human cognitive skills, things that can't yet and may never be coded into artificial-intelligence algorithms. This is the greatest challenge for Silicon Valley that it doesn't even know about.
The other good news is for consumers in wealthy countries—those of us with "First World problems" like butt dialing. Apple and Samsung will struggle to maintain profits on high-end products, so get ready for a horse race of features. We've already seen Apple's Touch ID fingerprint scanner. Samsung has Smart Scroll Eye Tracking. Apple's Siri answers spoken questions. Samsung offers a Galaxy Gear watch that works with its phones. None yet justifies the $649 price tag.
A lot will be thrown at us to see what sticks. Larger displays? Curved or bendable displays? Fitness trackers? Google Glass? Bluetooth-connected rings to allow hand-waving gestures instead of requiring clicks? No-touch mobile payments? Medical sensors to track heart rate or glucose levels? Who knows? Let's try them all. Personally, I'm hoping someone comes up with a personal assistant to whisper in my ear throughout the day.
Mr. Kessler, a former hedge-fund manager, is the author, most recently, of "Eat People" (Portfolio, 2011).
Health \ Fitness related Technology
Reply #23 on:
June 01, 2014, 12:34:21 AM »
Two links that could possibly fit into two different threads.
I first heard of Heart Variability Training from Joel Jamieson of
but it was pretty darn expensive. Years later I found this article and there is a cheaper version; I doubt that it does everything the expensive version does, but what the heck.
This article is pretty darn cool to me.
"You see, it's not the blood you spill that gets you what you want, it's the blood you share. Your family, your friendships, your community, these are the most valuable things a man can have." Before Dishonor - Hatebreed
THREE REASONS WHY APPLE WILL BRING DIGITAL HEALTH MAINSTREAM
Reply #24 on:
June 01, 2014, 12:40:02 AM »
There has been much discussion that when a big player such as Apple, Facebook or Google fully commits to digital health, the industry will scale rapidly. Predictions say that when this happens, the sociological tipping point will create a paradigm shift in much the same way the iPhone did for apps and mobile computing, or Amazon did for publishing.
While we aren’t there yet, it seems we are moving in that direction, and if one large corporation is helping to steer us there more than any other, it would be Apple. Here are three reasons why.
The Future of Health Care?
Reply #25 on:
June 01, 2014, 12:43:58 AM »
Reply #26 on:
June 11, 2014, 07:30:06 AM »
I didn't realize paper was invented by the Chinese between 100 and 200 AD. Of course, it really took off after the invention of movable type and the printing press. Martin Luther was the first bestseller (a bit before the NYT):
Re: Technology, MIT Robot Cheetah Has Evolved and Can Now Run Free
Reply #27 on:
September 16, 2014, 09:50:22 AM »
"Our robot can be silent and as efficient as animals. The only things you hear are the feet hitting the ground," said Sangbae Kim, associate professor of mechanical engineering at MIT. "This is kind of a new paradigm where we're controlling force in a highly dynamic situation. Any legged robot should be able to do this in the future."
Reply #28 on:
September 21, 2014, 11:21:09 AM »
apple pay vs currentc; demise of visa and mastercard?
Reply #29 on:
October 26, 2014, 11:15:03 AM »
CVS Follows Rite Aid, Shuts Off Apple Pay
By Paul Ausick October 26, 2014 8:32 am EDT
Last Thursday drug store chain Rite Aid Inc. (NYSE: RAD) reportedly stopped accepting payments made through the just launched Apple Pay system from Apple Inc. (NASDAQ: AAPL). On Saturday, CVS Health Corp. (NYSE: CVS) was reported to have followed suit at its CVS pharmacy stores.
The issue appears to be a conflict between Apple Pay and a mobile payment system called CurrentC that is being developed by a retailer-owned mobile technology outfit called Merchant Customer Exchange (MCX). Unlike Apple Pay, CurrentC does not use an NFC chip, but instead generates a QR code that is displayed on the merchant’s checkout terminal. Customers who have already linked their bank accounts to the CurrentC system scan the QR code from the terminal and the transaction is completed.
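The QR-based flow described above can be sketched as a simple token handshake. Everything below is hypothetical (the function names, token format, and proof scheme are my own illustration; the real CurrentC protocol is proprietary and unpublished):

```python
# Hypothetical sketch of a QR-style checkout handshake: the terminal mints a
# one-time session token (shown on screen as a QR code), the customer's app
# scans it and returns an authorization bound to that token and its account.
import hashlib
import secrets

def terminal_create_session(amount_cents):
    """Checkout terminal mints a one-time token and displays it as a QR code."""
    return {"token": secrets.token_hex(8), "amount": amount_cents}

def customer_scan_and_authorize(session, account_id):
    """Customer's app scans the QR payload and approves the charge."""
    proof = hashlib.sha256(
        f"{session['token']}:{account_id}".encode()
    ).hexdigest()
    return {"token": session["token"], "account": account_id, "proof": proof}

session = terminal_create_session(1299)               # a $12.99 checkout
auth = customer_scan_and_authorize(session, "acct-42")
assert auth["token"] == session["token"]              # payment tied to this checkout
```

The design point the article is making: because the "channel" is a displayed code rather than an NFC radio exchange, no new terminal hardware is needed.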
When Apple announced Apple Pay in early September, both Wal-Mart Stores Inc. (NYSE: WMT) and Best Buy Co. Inc. (NYSE: BBY) said they had no plans to adopt the new system. Both are partners in MCX along with other major retailers like Target Corp. (NYSE: TGT), Darden Restaurants Inc. (NYSE: DRI), and Sears Holdings Corp. (NASDAQ: SHLD).
MCX has been working on a mobile payment solution since 2011, and the driving force behind the effort is to enable the merchants to avoid paying the 2% to 3% credit card transaction fees charged by the likes of Visa Inc. (NYSE: V) and MasterCard Inc. (NYSE: MA). How much do these big retailers dislike paying fees to Visa and MasterCard? Former Walmart CEO Lee Scott is reported to have said, “I don’t know that MCX will succeed, and I don’t care. As long as Visa suffers.”
That kind of attitude ought to help drive adoption of Apple Pay as well, but retailers have an investment in CurrentC and the system has begun real-world testing and is scheduled to go live early next year. The advantage of CurrentC is that it works with existing checkout terminals, while Apple Pay requires that most retailers purchase new equipment to communicate with the NFC chip in the iPhone 6 and 6 Plus. Among the retailers that do not need to buy new terminals are Best Buy, Rite Aid, and CVS, so their reason for shutting down access to Apple Pay is very likely contractual or an act of solidarity with their fellow CurrentC backers.
CVS and the other CurrentC companies will almost certainly use the system exclusively for a relatively short time. If, as most observers expect, customer demand for NFC-based systems like Apple Pay grows rapidly, these retailers are not going to adopt a "my way or the highway" attitude with their customers. They have learned that when it comes to technology, it's a consumer-driven world and they just live in it. And one other thing retailers have learned — or should have — is not to underestimate the power of Apple in the consumer world.
Read more: CVS Follows Rite Aid, Shuts Off Apple Pay - Apple Inc. (NASDAQ:AAPL) - 24/7 Wall St.
Nano tech detection of cancer
Reply #30 on:
October 29, 2014, 01:58:37 PM »
POTH: Robots taking more and more human jobs
Reply #31 on:
December 16, 2014, 07:58:54 AM »
Stratfor: Quantum Computing
Reply #32 on:
July 24, 2015, 07:06:41 AM »
Approaching a Quantum Leap in Computing
July 24, 2015 | 09:01 GMT
A D-Wave Systems chip designed to operate as a 128-qubit superconducting adiabatic quantum optimization processor. (D-Wave Systems, Inc.)
The widespread use of quantum computers in industry is likely only a decade or two away.
The United States will probably maintain its lead in the field, though China will be competitive.
The countries and companies that first access quantum computers will enjoy a powerful advantage over their peers in areas that stand to gain from the technology.
Quantum computers, or computers based on the principles of quantum mechanics, stand to exponentially increase computing power within the next two decades. Though the scientific community is still fiercely debating the very nature of quantum mechanics itself, and numerous technical obstacles stand in the way of applying the principles of quantum mechanics to machines, the field is rapidly developing.
Now, the widespread use of quantum computers in industry is likely only a decade or two away. Such devices will be far more powerful than even the most powerful supercomputers seen today, carrying significant implications for national security, cyberwarfare and intelligence operations, among many other things. Just how powerful quantum computers can be — and how their adoption could lead to another revolution in computer-related technologies — becomes clear when we consider their computing power. Using a quantum computer to solve a problem can loosely be thought of as trying all possible solutions at once, whereas using a classical solution would mean trying them in sequential order. The expansion in computing power gained by incorporating quantum mechanics principles into computing could prove to be as revolutionary to computer science as research in physics and electromagnetism has proved to modern electronics.
Quantum Mechanics: A Primer
The field of quantum mechanics arose from German physicist Max Planck's attempts to describe the spectrum of light emitted by hot bodies. Specifically, he wondered what accounted for the shift in color from red to yellow to blue as the temperature of a flame increased. Planck devised an equation explaining what he had observed, based on the assumption that matter behaved differently at the atomic and subatomic levels.
Though even the great German physicist questioned this assumption, his research kicked off 30 years of scientific inquiry that yielded the theories and discoveries that form the basis of today's understanding of physics and chemistry. Albert Einstein introduced one of quantum mechanics' most famous and perplexing concepts just five years or so after Planck devised his equation, extending the latter's assumption by asserting that a quantum of light, or a photon, behaves as both a wave and a particle. This duality, along with the many other dualities embedded in quantum mechanics, became the bedrock of the field.
Today, scientists still debate how to interpret quantum mechanics. Perhaps the most widely held approach is called the Copenhagen interpretation, which holds that a quantum system exists in all of its possible states at once until it is measured; only when it is observed does it settle into a single state. (Schrödinger's famous cat, both alive and dead until the box is opened, is the popular illustration.) This concept has become known as the principle of superposition.
The superposition principle is one of the fundamental features of "quantum bits" or "qubits," the quantum computer's equivalent to the bits of classical computers. Classical computing relies on data comprising numerous individual bits that can only exist in one of two states, 0 or 1. Computers process data composed of long ordered strings of 0s and 1s. Today's computer chips are made up of millions of transistors and capacitors that can only exist as a 0 or 1; while switching these states now takes a mere fraction of a millisecond (a period that is shrinking every day), there are still natural limits to how fast data can be processed and how small transistors and capacitors can be shrunk.
A qubit has the advantage of being able to be a 0, a 1, and a superposition of both 0 and 1 — that is, it can exist in all possible states. This allows quantum computers to exist simultaneously in all possible states, whereas a classical computer could only exist in them sequentially. This means that a quantum computer can perform vast numbers of calculations at the same time, and that the power of a quantum computer increases exponentially as the number of qubits increases.
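A toy state-vector calculation makes the exponential growth concrete: describing n qubits takes 2**n complex amplitudes. This is a sketch using numpy, not a real quantum simulator:

```python
# Toy state-vector view of superposition: n qubits need 2**n amplitudes,
# which is why simulating quantum computers classically gets hard fast.
import numpy as np

def uniform_superposition(n_qubits):
    """State with every basis state equally weighted (Hadamard on each qubit)."""
    dim = 2 ** n_qubits
    return np.full(dim, 1 / np.sqrt(dim), dtype=complex)

def measure_probabilities(state):
    """Born rule: the probability of each outcome is |amplitude|**2."""
    return np.abs(state) ** 2

state = uniform_superposition(3)      # 3 qubits -> 8 amplitudes at once
probs = measure_probabilities(state)
print(len(state))                     # 8
print(round(float(probs.sum()), 10))  # 1.0 (probabilities sum to one)
```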
An additional boost to the potential power of quantum computers comes from the concept of "quantum entanglement," which Einstein famously described as "spooky action at a distance." Quantum entanglement is the principle that some quantum systems' states cannot be described by the states of their individual elements alone because those elements may be "entangled;" in other words, different elements' states are related to one another in some way, meaning that what happens to one will affect the other, no matter how vast the distance separating the two. Among other things, quantum entanglement can be used to create "super-dense" coding in which two classical bits can be encoded and transmitted via one qubit.
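Superdense coding can be simulated with small matrices. The sketch below (numpy, illustrative only) encodes two classical bits by applying a single Pauli operation to one half of a shared Bell pair, which is the standard textbook protocol:

```python
# Superdense coding sketch: two classical bits ride on one transmitted qubit,
# given a pre-shared entangled pair.
import numpy as np

I = np.eye(2)
X = np.array([[0, 1], [1, 0]])   # bit flip
Z = np.array([[1, 0], [0, -1]])  # phase flip

# Shared entangled pair (Bell state): (|00> + |11>) / sqrt(2)
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)

# The four orthogonal Bell states, one per 2-bit message
bell_basis = {
    (0, 0): np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2),
    (0, 1): np.array([0.0, 1.0, 1.0, 0.0]) / np.sqrt(2),
    (1, 0): np.array([1.0, 0.0, 0.0, -1.0]) / np.sqrt(2),
    (1, 1): np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2),
}

def encode(bits):
    """Alice applies a Pauli to her half only, then sends that single qubit."""
    op = (Z if bits[0] else I) @ (X if bits[1] else I)
    return np.kron(op, I) @ bell

def decode(state):
    """Bob measures both qubits in the Bell basis to recover both bits."""
    for bits, basis_state in bell_basis.items():
        if abs(abs(np.dot(basis_state, state)) - 1) < 1e-9:
            return bits

for msg in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    assert decode(encode(msg)) == msg   # two bits recovered per qubit sent
```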
Though quantum computers will have a broad impact on society, the most obvious areas that stand to benefit are the ones that supercomputers dominate today: cryptography, research and military applications. The most well-known capability quantum computers could unlock would be the use of what is known as Shor's algorithm, something classical computers cannot do efficiently and a tool of significant interest to the National Security Agency, CIA and the Chinese government.
In short, Shor's algorithm would enable the breaking of complex codes by speeding up the search for a given number's prime factors, the backbone of modern-day encryption methods. The gains that would be made by using a quantum computer to break a code over a classical computer are gigantic: A quantum computer can do in minutes or hours what a classical computer would take years or much longer to do. Of course, the floodgates of stored data will not suddenly open once Shor's algorithm comes into play; quantum computers could also be used to encrypt information far more securely than is possible with classical computers, something already under intense study.
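The classical side of that asymmetry is easy to illustrate. Trial division, sketched below, is the naive way to find prime factors; its running time grows with the square root of the number, which is why sufficiently large RSA moduli resist classical attack but would fall to Shor's algorithm on a quantum computer:

```python
# Classical trial division: the naive route to a number's prime factors.
# Cost scales with sqrt(n); for the ~600-digit moduli used in real RSA,
# that is hopeless classically, but Shor's algorithm would make it fast.
def prime_factors(n):
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:          # whatever remains is itself prime
        factors.append(n)
    return factors

# A toy "RSA modulus": the product of two primes, recovered instantly
# at this size but infeasible at cryptographic sizes.
assert prime_factors(3233) == [53, 61]
```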
Outside of the military and intelligence spheres, quantum computers would greatly expand data processing and permit the simulation of almost every natural phenomenon. They would also lead to outcomes such as the faster development of new drugs and more accurate weather forecasting, as well as those as exotic as the search for extraterrestrial life and the development of artificial intelligence.
Quantum computing would have important implications for the development of artificial intelligence because it would expand machine-learning algorithms. Today's algorithms rely on pattern recognition; with quantum computing, machines could adapt to anomalous situations. A highly refined machine-learning algorithm would help automated systems handle non-routine tasks, an area that has been lacking in the automation and digitization of jobs and that would improve upon the current research on autonomous cars, robots and drones.
Developing Quantum Computers
Building a quantum computer is no easy task. Still, the past five years have seen significant progress toward the development of an economical quantum computing machine and its components, though the industry remains in its infancy. The problem of preserving and storing qubits lies at the heart of the challenge: A qubit in a superposition state is quite fragile. Its interaction with other particles (whether qubits or otherwise) essentially forces it to collapse into one state or the other (e.g., a 0 or a 1).
Physicists have tried to preserve qubits by supercooling their environment to temperatures just above absolute zero (-273.15 degrees Celsius) and using them in a vacuum. But for nearly all practical purposes outside of research environments and possibly a few government agencies, quantum computers would need to exist at ambient temperatures. The record for storing quantum data at room temperature, set in 2013, is a mere 39 minutes (an improvement upon the previous record of 2 seconds). Even with their prodigious computing power, quantum computers need more time than that to perform meaningful calculations. Of course, classical computers once faced similar challenges. Like today's quantum computers, the classical computers of the 1950s filled rooms, and the idea of shrinking them down to the size of the device you are using to read this article was a distant prospect.
All challenges aside, there has been no shortage of interest in researching technologies for quantum computers. Established technology firms, defense contractors, intelligence agencies and startups, among many others, are pursuing them. In fact, Canadian startup D-Wave Systems, Inc. has already begun selling the first commercial quantum computer, unveiling a 1,000-qubit version of the D-Wave Two in June. The company is collaborating with Google, Lockheed Martin and NASA to develop quantum computers further.
Naturally, D-Wave has found the technical challenges it faces daunting. Its computers have come under heavy criticism for their inflexibility: The D-Wave processor is designed to perform optimization tasks and little else. More fundamentally, some have questioned whether the D-Wave system actually relies on quantum mechanics. Some physicists and IBM have argued that classical computers are capable of performing the same functions and tasks that D-Wave's system does.
For its part, IBM has made its own recent breakthroughs in quantum computer development. In April, IBM researchers published a paper describing a method of simultaneously detecting both an error common to all computers and an error unique to quantum computers. The first is a "bit-flip error," where a 0 accidentally flips to a 1, or vice versa, while the second is a "sign-flip error," where the relationship between 0 and 1 flips. Previous research attempts could not detect both errors at the same time.
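The classical ancestor of such schemes is the 3-bit repetition code, which detects and corrects a single bit flip using two parity checks ("syndromes") that never read the logical value itself. The sketch below is only an analogy: IBM's quantum scheme applies the same syndrome idea in two conjugate bases at once, which is what lets it catch sign flips as well as bit flips.

```python
# Classical analogue of bit-flip detection: the 3-bit repetition code.
def encode(bit):
    """Protect one logical bit by tripling it."""
    return [bit, bit, bit]

def syndrome(codeword):
    """Parity of pairs (0,1) and (1,2); a nonzero parity flags an error
    without revealing the encoded value."""
    return (codeword[0] ^ codeword[1], codeword[1] ^ codeword[2])

def correct(codeword):
    """A single flip produces a unique syndrome that locates it."""
    flips = {(1, 0): 0, (1, 1): 1, (0, 1): 2}  # syndrome -> flipped position
    s = syndrome(codeword)
    if s in flips:
        codeword = list(codeword)
        codeword[flips[s]] ^= 1
    return codeword

word = encode(1)       # [1, 1, 1]
word[2] ^= 1           # a "bit-flip error" hits position 2 -> [1, 1, 0]
print(syndrome(word))  # (0, 1): error detected and located
print(correct(word))   # [1, 1, 1]: corrected
```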
Though other countries share the United States' keen interest in developing quantum computers, none appears able to supplant the United States as the global leader in the field. China is likely the only other country with the financial power as well as the military and national security motivations to explore quantum computers and their properties, and Beijing has dedicated significant funds to such research. The Chinese have already shown the ability to perform as well as their American counterparts in developing supercomputers, though China has lagged behind the United States in the commercialization of domestically developed and designed classical computers and their components. But with commercially available quantum computers still decades away, China's interest in the technology, at this point, is mainly strategic. With its successes in supercomputing, China could conceivably develop a fully functioning quantum computer before the United States, though whether it could develop a commercially viable model before U.S. companies do is more doubtful.
Despite the extensive interest quantum computers are generating worldwide, they will not replace classical computers anytime soon; even their adoption among niche customers remains at least a decade away, if not two. Instead, the spread of quantum computers will likely occur in the same sort of slow, methodical manner seen with the adoption of classical computers.
Today's quantum computers, the D-Wave One and the D-Wave Two, are highly refined machines that have been designed to perform one task only: optimization. The development of similarly specialized machines that focus on solving a single problem, whether factorization, simulations or moving traffic efficiently, will continue. These quantum computers will compete with the supercomputers that are currently being developed and optimized to perform similar specific tasks. Government agencies, as well as companies involved in relevant security-related applications such as cryptography, will be satisfied with quantum computers that can perform only one task, just as they are satisfied with supercomputers that are likewise specialized.
However, the development and possible commercialization of a more practical and universal quantum computer remains a distant goal, even though companies like Google are aiming for it now. The adoption of single-task (and, later, universal) quantum computers will be linear rather than exponential, and some industries will adopt them more quickly than others. But just as oil supermajors will use their powers of simulation to unlock more oil reserves, the countries and companies that are the first to access quantum computers will enjoy a powerful advantage over their competitors in areas that can make use of the technology.
Reply #33 on:
August 07, 2015, 08:55:46 AM »
Aug. 6, 2015 7:55 p.m. ET
The companies that make 3-D printers are struggling with an increasingly uncertain outlook as many potential buyers hold out for what they hope will be better, faster machines.
3D Systems Corp., one of the biggest 3-D printer producers, on Thursday reported second-quarter sales and profit that fell short of expectations.
The Rock Hill, S.C., company said it swung to a net loss of $13.7 million in the period, from $2.1 million in profit a year earlier. Overall revenue rose, but organic sales, which exclude revenue from acquisitions and exchange-rate changes, slid 5%.
The results came after rival Stratasys Ltd. last week said its loss widened in the latest quarter to $22.9 million from $173,000 a year earlier, while its sales of printers and production materials fell 13%.
Both Stratasys and 3D Systems, which together accounted for more than a third of industry sales last year, also have discarded their guidance for the year.
The moves reflect an unanticipated slump in an industry where sales had been growing around 34% annually for the past three years.
Both companies have acknowledged suffering quality and reliability problems with their printers as sales have increased rapidly and the companies absorbed acquisitions in recent years.
The expected household market for printers hasn’t caught on, despite widespread availability of printers at big-box retailers for under $5,000. Less than 10% of Stratasys’s annual sales come from home users, according to an analyst estimate.
“We felt all along there isn’t much of a consumer market for these machines,” said Terry Wohlers, a 3-D printer market consultant in Fort Collins, Colo. “They’re not easy to use. A lot of them in homes are sitting there collecting dust.”
Some analysts believe businesses, given tight budgets, are delaying purchases because of the limitations of the current models, and in anticipation that Hewlett-Packard Co. and other heavyweights from the two-dimensional printer industry could enter the 3-D market in the next year or two with faster-running, more reliable printers.
“We’ve gone through an early adopter phase where [companies] bought printers to convey innovation,” said Brian Drab, an analyst for William Blair & Co. “We’re going into more mainstream adoption where you’re going to look silly if you make a capital investment in a printer that runs at 5% of the speed that’s coming onto the market. Why not wait?”
Industry executives have acknowledged that they are in a sluggish phase. “Our industry is now transforming through a period of slower growth as users digest recent investments in 3-D printing,” Stratasys CEO David Reis told analysts last week.
The 3-D printing process slices a digital image of an object into thousands of layers, which the printers then recreate one at a time in plastic, metal, sand or other materials.
The process can be applied to myriad objects and, theoretically, reduce complicated manufacturing processes to some key strokes on a home computer.
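The slicing step described above can be sketched in a few lines of code. The sketch below is illustrative only, not taken from any real slicer: it cuts a triangle mesh with horizontal planes at a fixed layer height, and each plane/triangle intersection yields a line segment that a printer would trace. All function names and the layer-height choice here are assumptions for the example.

```python
# Minimal slicing sketch: cut a triangle mesh with horizontal planes,
# one plane per layer, collecting the line segments on each plane.

def slice_triangle(tri, z):
    """Return the 2-D segment where the plane at height z cuts triangle
    tri (three (x, y, z) vertices), or None if the plane misses it."""
    points = []
    for i in range(3):
        (x1, y1, z1), (x2, y2, z2) = tri[i], tri[(i + 1) % 3]
        if (z1 - z) * (z2 - z) < 0:           # edge crosses the plane
            t = (z - z1) / (z2 - z1)          # interpolation factor along edge
            points.append((x1 + t * (x2 - x1), y1 + t * (y2 - y1)))
    return tuple(points) if len(points) == 2 else None

def slice_mesh(triangles, layer_height):
    """Group the cut segments by layer, bottom to top."""
    zs = [v[2] for tri in triangles for v in tri]
    layers = []
    z = min(zs) + layer_height / 2            # cut mid-layer to dodge vertices
    while z < max(zs):
        segs = [s for tri in triangles if (s := slice_triangle(tri, z))]
        layers.append(segs)
        z += layer_height
    return layers

# A unit tetrahedron sliced into 0.25-high layers:
tet = [
    ((0, 0, 0), (1, 0, 0), (0, 1, 0)),   # base
    ((0, 0, 0), (1, 0, 0), (0, 0, 1)),
    ((0, 0, 0), (0, 1, 0), (0, 0, 1)),
    ((1, 0, 0), (0, 1, 0), (0, 0, 1)),
]
layers = slice_mesh(tet, 0.25)
print(len(layers))  # → 4 layers; each holds 3 segments, one per side face
```

A production slicer also chains the segments of each layer into closed loops and converts them to toolpaths (e.g. G-code), but the layer-by-layer decomposition is the core of the process the article describes.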
But while 3-D’s outsize potential stoked brisk demand for printers in recent years, the devices largely have settled into niche roles producing items like engineering prototypes, small volumes of hard-to-build components, and custom-made objects like hearing aids or dental appliances.
3-D printers remain too slow to supplant metal-cutting machines and plastic injection equipment widely used for high-volume manufacturing.
Theodore Ullrich, founder of Tomorrow Lab LLC, a product design consultancy in New York, said the firm’s two desktop 3-D printers are “always in use” producing plastic prototypes, but that parts break regularly and the printers require frequent adjustments to keep them in alignment.
“They’re fast and effective” for prototyping, he said. “It might be worth getting a new one, but we don’t know the reliability of the new ones yet.”
H-P has announced plans to begin selling a 3-D printer next year that applies powdered plastic with jet sprayers adapted from its current ink printers. The company predicts its multi-jet fusion printer will be 10 times faster than other printers on the market because the plastic feedstock would be delivered from multiple outlets.
3D Systems said it is working to correct quality and performance problems with its printers, particularly those that produce metal objects.
“We’re disappointed with our results,” said 3D Systems Chief Executive Avi Reichental during a conference call Thursday. “We haven’t been able to fully remediate everything on the large-frame metal printers. But we’re making good progress.”
3D Systems said overall revenue, including sales from its service business, rose 12.5% to $170.5 million. Excluding special items, the company reported earnings per share of 3 cents, while analysts were expecting 8 cents on revenue of $171.6 million.
The company’s shares jumped 16.2% to $13.60 Thursday, but that left them just slightly above their level a week earlier. So far this year they have fallen 59%, while shares in Stratasys are down 63%.
Write to Bob Tita at
A 3D printed car!
Reply #34 on:
August 15, 2015, 08:10:00 AM »