
Archives for: January 2014

Facing the Intelligence Explosion: There is Plenty of Room Above

See on Scoop.it – Amazing Science

Why are AIs in movies so often of roughly human-level intelligence? One reason is that we almost always fail to see non-humans as non-human. We anthropomorphize. That’s why aliens and robots in fiction are basically just humans with big eyes or green skin or some special power. Another reason is that it’s hard for a writer to write characters smarter than the writer. How exactly would a superintelligent machine solve problem X?


The human capacity for efficient cross-domain optimization is not a natural plateau for intelligence. It’s a narrow, accidental, temporary marker created by evolution due to things like the slow rate of neuronal firing and how large a skull can fit through a primate’s birth canal. Einstein may seem vastly more intelligent than a village idiot, but this difference is dwarfed by the difference between the village idiot and a mouse.


As Vernor Vinge put it: The best answer to the question, “Will computers ever be as smart as humans?” is probably “Yes, but only briefly.”[1] How could an AI surpass human abilities? Let us count the ways:


  • Speed. Our axons carry signals at seventy-five meters per second or slower. A machine can pass signals along about four million times more quickly.
  • Serial depth. The human brain can’t rapidly perform any computation that requires more than one hundred sequential steps; thus, it relies on massively parallel computation.[2] More is possible when both parallel and deep serial computations can be performed.
  • Computational resources. The brain’s size and neuron count are constrained by skull size, metabolism, and other factors. AIs could be built on the scale of buildings or cities or larger. When we can make circuits no smaller, we can just add more of them.
  • Rationality. As we explored earlier, human brains do nothing like optimal belief formation or goal achievement. Machines can be built from the ground up using (computable approximations of) optimal Bayesian decision networks, and indeed this is already a leading paradigm in artificial agent design.
  • Introspective access/editability. We humans have almost no introspective access to our cognitive algorithms, and cannot easily edit and improve them. Machines can already do this (read about EURISKO and metaheuristics). A limited hack like the method of loci greatly improves human memory; machines can do this kind of thing in spades.
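The speed claim in the first bullet is easy to sanity-check with back-of-the-envelope arithmetic; a sketch, assuming electronic signals propagate at roughly the speed of light as an upper bound:

```python
# Back-of-the-envelope check of the "Speed" bullet: axonal conduction
# (~75 m/s for fast myelinated axons) vs. electronic signals, which can
# propagate at an appreciable fraction of the speed of light.
axon_speed = 75.0      # m/s, from the text above
light_speed = 3.0e8    # m/s, upper bound for signals in hardware

ratio = light_speed / axon_speed
print(f"Electronic signals can be up to {ratio:.0f}x faster")
# prints "Electronic signals can be up to 4000000x faster"
```

The ratio comes out to four million, matching the figure quoted in the text.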

REFERENCES:

[1] Vernor Vinge, “Signs of the Singularity,” IEEE Spectrum, June 2008, http://spectrum.ieee.org/biomedical/ethics/signs-of-the-singularity.

[2] J. A. Feldman and Dana H. Ballard, “Connectionist Models and Their Properties,” Cognitive Science 6, no. 3 (1982): 205–254, doi:10.1207/s15516709cog0603_1.


See on intelligenceexplosion.com

Single-cell technologies for monitoring immune systems : Nature Immunology

See on Scoop.it – Virology and Bioinformatics from Virology.ca

The complex heterogeneity of cells, and their interconnectedness with each other, are major challenges to identifying clinically relevant measurements that reflect the state and capability of the immune system. Highly multiplexed, single-cell technologies may be critical for identifying correlates of disease or immunological interventions as well as for elucidating the underlying mechanisms of immunity. Here we review limitations of bulk measurements and explore advances in single-cell technologies that overcome these problems by expanding the depth and breadth of functional and phenotypic analysis in space and time. The geometric increases in complexity of data make formidable hurdles for exploring, analyzing and presenting results. We summarize recent approaches to making such computations tractable and discuss challenges for integrating heterogeneous data obtained using these single-cell technologies.


See on www.nature.com

Researchers achieve fastest real-world fiber speeds of 1.4Tb/s

See on Scoop.it – Physics

Alcatel-Lucent and BT have today announced trial speeds of up to 1.4Tb/s with a record spectral efficiency of 5.7 bits per second per Hertz (b/s/Hz) on an existing core fiber connection.
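The two headline figures together imply how much optical bandwidth the trial occupied, since spectral efficiency is just data rate divided by bandwidth; a rough sketch using the numbers above:

```python
# Spectral efficiency relates data rate to occupied bandwidth:
#   bandwidth (Hz) = data rate (b/s) / spectral efficiency (b/s/Hz)
data_rate = 1.4e12    # 1.4 Tb/s, from the announcement
efficiency = 5.7      # b/s/Hz

bandwidth_ghz = data_rate / efficiency / 1e9
print(f"Occupied bandwidth: ~{bandwidth_ghz:.0f} GHz")
# prints "Occupied bandwidth: ~246 GHz"
```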
See on phys.org

Drug Companies Could Use EHR Systems for Targeted Marketing

See on Scoop.it – healthcare technology

Pharmaceutical companies increasingly are using electronic health records to analyze patient data and market their products to consumers and physicians through advertisements and email campaigns.

Electronic health record systems could be used by pharmaceutical companies to market their products to physicians and consumers, Reuters reports.

Pharmaceutical companies historically have gathered patients’ de-identified data from insurers, pharmacies and public records to improve their marketing strategies.

However, drug companies can collect and analyze data through EHR systems and use that information to reach out to consumers and doctors.


See on www.ihealthbeat.org

Metastatic cancer cells implode on protein contact using E-selectin and TRAIL

See on Scoop.it – Amazing Science

By attaching a cancer-killer protein to white blood cells, Cornell biomedical engineers have demonstrated the annihilation of metastasizing cancer cells traveling throughout the bloodstream.

The study, “TRAIL-Coated Leukocytes that Kill Cancer Cells in the Circulation,” was published online the week of Jan. 6 in the journal Proceedings of the National Academy of Sciences.

“These circulating cancer cells are doomed,” said Michael King, Cornell professor of biomedical engineering and the study’s senior author. “About 90 percent of cancer deaths are related to metastases, but now we’ve found a way to dispatch an army of killer white blood cells that cause apoptosis – the cancer cell’s own death – obliterating them from the bloodstream. When surrounded by these guys, it becomes nearly impossible for the cancer cell to escape.”

King and his colleagues injected human blood samples, and later mice, with two proteins: E-selectin (an adhesive) and TRAIL (Tumor Necrosis Factor Related Apoptosis-Inducing Ligand). The TRAIL protein, joined with the E-selectin protein, sticks to leukocytes – white blood cells ubiquitous in the bloodstream. When a cancer cell comes into contact with TRAIL, which becomes unavoidable in the chaotic blood flow, the cancer cell essentially kills itself.

“The mechanism is surprising and unexpected in that this repurposing of white blood cells in flowing blood is more effective than directly targeting the cancer cells with liposomes or soluble protein,” say the authors.

In the laboratory, King and his colleagues tested this concept’s efficacy. When treating cancer cells with the proteins in saline, they found a 60 percent success rate in killing the cancer cells. In normal laboratory conditions, the saline lacks white blood cells to serve as a carrier for the adhesive and killer proteins. Once the proteins were added to flowing blood, which models forces, mixing and other human-body conditions, however, the success rate in killing the cancer cells jumped to nearly 100 percent.

In addition to King, the paper’s researchers include first author Michael Mitchell, a Cornell doctoral candidate in the field of biomedical engineering; Elizabeth C. Wayne, a Cornell doctoral student in the field of biomedical engineering; Kuldeepsinh Rana, a Cornell Ph.D. ’11; and Chris Schaffer, associate professor in biomedical engineering. The National Cancer Institute (Physical Sciences-Oncology program) of the National Institutes of Health, Bethesda, Md. funded the research through Cornell’s Center for the Microenvironment and Metastasis.

Metastasis is the spread of cancer cells to other parts of the body. Surgery and radiation are effective at treating primary tumors, but difficulty in detecting metastatic cancer cells has made treatment of the spreading cancer problematic, say the scientists.


See on www.news.cornell.edu

Genomics, Medicine, and Pseudoscience: The top 6 vitamins and supplements you shouldn’t take

See on Scoop.it – Virology and Bioinformatics from Virology.ca


See on genome.fieldofscience.com

Supercomputer models one second of human brain activity

See on Scoop.it – healthcare technology

The most accurate simulation of the human brain to date has been carried out on a Japanese supercomputer, with a single second’s worth of activity from just one per cent of the complex organ taking one of the world’s most powerful supercomputers 40 minutes to calculate.

Researchers used the K computer in Japan, currently the fourth most powerful in the world, to simulate human brain activity. The computer has 705,024 processor cores and 1.4 million GB of RAM, but still took 40 minutes to crunch the data for just one second of brain activity.

The project, a joint enterprise between Japanese research group RIKEN, the Okinawa Institute of Science and Technology Graduate University and Forschungszentrum Jülich, an interdisciplinary research center based in Germany, was the largest neuronal network simulation to date.

It used the open-source Neural Simulation Technology (NEST) tool to replicate a network consisting of 1.73 billion nerve cells connected by 10.4 trillion synapses.

While significant in size, the simulated network represented just one per cent of the neuronal network in the human brain. Rather than providing new insight into the organ, the project’s main goal was to test the limits of simulation technology and the capabilities of the K computer.
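The figures in the article imply how far from real time the simulation ran; a quick sketch from the numbers above:

```python
# Scaling the article's figures: one second of simulated activity took
# 40 minutes of wall-clock time, so the simulation ran 2,400x slower
# than real time, and that was for only 1% of the brain's network.
sim_seconds = 1
wall_seconds = 40 * 60
slowdown = wall_seconds / sim_seconds
print(f"Slowdown factor: {slowdown:.0f}x")   # prints "Slowdown factor: 2400x"

# Connectivity of the simulated network:
neurons = 1.73e9
synapses = 10.4e12
print(f"Synapses per neuron: ~{synapses / neurons:.0f}")
# prints "Synapses per neuron: ~6012"
```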


See on www.telegraph.co.uk

Characterization of influenza vaccine immunogenicity using influenza antigen microarrays. [PLoS One. 2013]

See on Scoop.it – Virology and Bioinformatics from Virology.ca
Abstract

BACKGROUND:

Existing methods to measure influenza vaccine immunogenicity prohibit detailed analysis of epitope determinants recognized by immunoglobulins. The development of highly multiplex proteomics platforms capable of capturing a high level of antibody binding information will enable researchers and clinicians to generate rapid and meaningful readouts of influenza-specific antibody reactivity.

METHODS:

We developed influenza hemagglutinin (HA) whole-protein and peptide microarrays and validated that the arrays allow detection of specific antibody reactivity across a broad dynamic range using commercially available antibodies targeted to linear and conformational HA epitopes. We derived serum from blood draws taken from 76 young and elderly subjects immediately before and 28±7 days post-vaccination with the 2008/2009 trivalent influenza vaccine and determined the antibody reactivity of these sera to influenza array antigens.

RESULTS:

Using linear regression and correcting for multiple hypothesis testing by the Benjamini and Hochberg method of permutations over 1000 resamplings, we identified antibody reactivity to influenza whole-protein and peptide array features that correlated significantly with age, H1N1, and B-strain post-vaccine titer as assessed through a standard microneutralization assay (p<0.05, q <0.2). Notably, we identified several peptide epitopes that were inversely correlated with regard to age and seasonal H1N1 and B-strain neutralization titer (p<0.05, q <0.2), implicating reactivity to these epitopes in age-related defects in response to H1N1 influenza. We also employed multivariate linear regression with cross-validation to build models based on age and pre-vaccine peptide reactivity that predicted vaccine-induced neutralization of seasonal H1N1 and H3N2 influenza strains with a high level of accuracy (84.7% and 74.0%, respectively).
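The Benjamini-Hochberg procedure named above controls the false discovery rate by comparing sorted p-values against a linearly growing threshold. A minimal sketch of the idea (not the authors’ code; the p-values are invented for illustration):

```python
def benjamini_hochberg(p_values, q=0.2):
    """Return indices of hypotheses rejected at FDR level q
    (q < 0.2 is the threshold reported in the study)."""
    n = len(p_values)
    # Sort hypothesis indices by ascending p-value.
    order = sorted(range(n), key=lambda i: p_values[i])
    # Find the largest rank k with p_(k) <= (k/n) * q,
    # then reject the k smallest p-values.
    k_max = 0
    for rank, i in enumerate(order, start=1):
        if p_values[i] <= rank / n * q:
            k_max = rank
    return sorted(order[:k_max])

print(benjamini_hochberg([0.001, 0.008, 0.039, 0.041, 0.6], q=0.2))
# prints "[0, 1, 2, 3]"
```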

CONCLUSION:

Our methods provide powerful tools for rapid and accurate measurement of broad antibody-based immune responses to influenza, and may be useful in measuring response to other vaccines and infectious agents.

  
See on www.ncbi.nlm.nih.gov

Materials Genomics: Using supercomputers in the hunt for novel alloys and compounds

See on Scoop.it – Amazing Science

In a new study, researchers from Duke University’s Pratt School of Engineering used computational methods to identify dozens of platinum-group alloys that were previously unknown to science but could prove beneficial in a wide range of applications.

Platinum is expensive, but it’s used to transform toxic fumes leaving a car’s engine into more benign gasses, to produce high octane gasoline, plastics and synthetic rubbers, and to fight the spread of cancerous tumors.

“We’re looking at the properties of ‘expensium’ and trying to develop ‘cheapium,’” said Stefano Curtarolo, director of Duke’s Center for Materials Genomics. “We’re trying to automate the discovery of new materials and use our system to go further faster.”

The research is part of the Materials Genome Initiative launched by President Barack Obama in 2011. The initiative’s goal is to support centers, groups and researchers in accelerating the pace of discovery and deployment of advanced material systems crucial to achieving global competitiveness in the 21st century.

The identification of the new platinum-group compounds hinges on databases and algorithms that Curtarolo and his group have spent years developing. Using theories about how atoms interact to model chemical structures from the ground up, Curtarolo and his group screened thousands of potential materials for high probabilities of stability.

After nearly 40,000 calculations, the results identified 37 new binary alloys in the platinum-group metals, which include osmium, iridium, ruthenium, rhodium, platinum and palladium.
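The screening idea can be illustrated with a toy filter: compute a formation energy for each candidate and keep only those predicted stable. This is a hypothetical sketch; the alloy names are real platinum-group pairings but the energies are invented placeholders, whereas the real screen ran ab initio calculations over thousands of structures:

```python
# Toy high-throughput screen: keep candidate alloys whose formation
# energy is negative (i.e., predicted stable relative to the pure
# elements). Energies below are invented for illustration only.
candidates = {
    "PtPd": -0.12,   # eV/atom, hypothetical
    "PtOs": +0.05,
    "IrRu": -0.03,
    "RhOs": +0.21,
}

stable = [name for name, e_form in candidates.items() if e_form < 0]
print(stable)  # prints "['PtPd', 'IrRu']"
```

The real pipeline works the same way in outline: enumerate candidate structures, compute a stability criterion for each, and pass only the survivors to experimentalists.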

These metals are prized for their catalytic properties, resistance to chemical corrosion and performance in high-temperature environments, among other properties. Commercial applications for the group include electrical components, corrosion-resistance apparatus, fuel cells, chemotherapy and dentistry. And because of their worldwide scarcity, each metal fetches a premium price.

Now it is up to experimentalists to produce these new materials and discover their physical properties. In addition to identifying unknown alloys, the study also provides detailed structural data on known materials. For example, there are indications that some may be structurally unstable at low temperatures. This isn’t readily apparent because creating such materials is difficult, requiring high temperatures or pressures and very long equilibration processes.


“We hope providing a list of targets will help identify new compounds much faster and more cheaply,” said Curtarolo. “Physically going through these potential combinations just to find the targets would take 200 to 300 graduate students five years. As it is, characterizing the targets we identified should keep the experimentalists busy for 20.” 


See on www.kurzweilai.net

Human Genome Shrinks To Only 19,000 Genes

See on Scoop.it – Fragments of Science

Genes are the fundamental units of inheritance in living organisms. Together, they hold all the information necessary to reproduce a given organism and to pass on genetic traits to its offspring.

 

Biologists have long debated what constitutes a gene in molecular terms, but one useful definition is a region of DNA that carries the code necessary to make a molecular chain called a polypeptide. These chains link together to form proteins and so are the bricks and mortar out of which all organisms are constructed.

 

Given this crucial role, it is no surprise that an ongoing goal in biology is to work out the total number of protein-coding genes necessary to construct a given organism. Biologists think the yeast genome contains about 5300 coding genes and a nematode worm genome contains about 20,470.

 


See on medium.com
