Posts Tagged ‘data’

Don’t Forget the Brain Is as Complex as All the World’s Digital Data

October 22, 2013

Twenty years ago, sequencing the human genome was one of the most ambitious science projects ever attempted. Today, compared to the collection of genomes of the microorganisms living in our bodies, the ocean, the soil and elsewhere, each human genome, which easily fits on a DVD, is comparatively simple. Its 3 billion DNA base pairs and about 20,000 genes seem paltry next to the roughly 100 billion bases and millions of genes that make up the microbes found in the human body.
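The "fits on a DVD" claim is easy to sanity-check with back-of-the-envelope arithmetic: each base is one of {A, C, G, T}, so a raw encoding needs only 2 bits per base. A minimal sketch (the capacity figure assumes a standard single-layer DVD):

```python
# Back-of-the-envelope check that one human genome fits on a DVD.
# Each DNA base is one of {A, C, G, T}, so 2 bits per base suffice.
bases = 3_000_000_000               # ~3 billion base pairs
megabytes = 2 * bases / 8 / 1_000_000
dvd_capacity_mb = 4_700             # single-layer DVD, ~4.7 GB

print(megabytes, "MB, DVD holds", dvd_capacity_mb, "MB")  # 750 MB vs 4,700 MB
```

Even uncompressed, 750 MB leaves most of the disc free; real genome files carry quality scores and metadata, but the raw sequence itself is modest.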

And a host of other variables accompanies that microbial DNA, including the age and health status of the microbial host, when and where the sample was collected, and how it was collected and processed. Take the mouth, populated by hundreds of species of microbes, with as many as tens of thousands of organisms living on each tooth. Beyond the challenges of analyzing all of these, scientists need to figure out how to reliably and reproducibly characterize the environment where they collect the data.

“There are the clinical measurements that periodontists use to describe the gum pocket, chemical measurements, the composition of fluid in the pocket, immunological measures,” said David Relman, a physician and microbiologist at Stanford University who studies the human microbiome. “It gets complex really fast.”

Excerpt from an article by Emily Singer at Quanta. Continue THERE

When Memorization Gets in the Way of Learning: A teacher’s quest to discourage his students from mindlessly reciting information.

September 14, 2013

Some things are worth memorizing – addresses, PINs, your parents’ birthdays. The sine of π/2 is not among them. It’s a fact that matters only insofar as it connects to other ideas. To learn it in isolation is like learning the sentence “Hamlet kills Claudius” without the faintest idea of who either gentleman is – or, for that matter, of what “kill” means. Memorization is a frontage road: It runs parallel to the best parts of learning, never intersecting. It’s a detour around all the action, a way of knowing without learning, of answering without understanding.

Memorization has enjoyed a surge of defenders recently. They argue that memorization exercises the brain and even fuels deep insights. They say our haste to purge old-school skills-driven teaching from our schools has stranded a generation of students upriver without a paddle. They recommend new apps aiming to make drills fun instead of tedious. Most of all, they complain that rote learning has become taboo, rather than accepted as a healthy part of a balanced scholastic diet.

Excerpt from an article written by BEN ORLIN at The Atlantic. Continue THERE

The 21st Century Will Be Defined By Games, a Manifesto.

September 10, 2013

Previous centuries have been defined by novels and cinema. In a bold manifesto we’re proud to debut here on Kotaku, game designer Eric Zimmerman states that this century will be defined by games.

Below is Zimmerman’s manifesto, which will also appear in the upcoming book The Gameful World from MIT Press. We invite you to read it, to think about it and even to annotate it. Zimmerman’s manifesto is followed by an exploration of the ideas behind it, in an essay by author and professor Heather Chaplin. In the days to come, we’ll be expanding the discussion even further with perspectives from other gamers and game-thinkers. But let’s start with the big ideas. Let’s start with a manifesto by gamers, about games, for the world we live in…

Games are ancient.

Digital technology has given games a new relevance.

The 20th Century was the century of information.

In our Ludic Century, information has been put at play.

In the 20th Century, the moving image was the dominant cultural form.

The Ludic Century is an era of games.

We live in a world of systems.

There is a need to be playful.

We should think like designers.

Games are a literacy.

Gaming literacy can address our problems.

In the Ludic Century, everyone will be a game designer.

Games are beautiful. They do not need to be justified.

Expand on each of these claims HERE

Medical Stereograms (Crossview)

April 11, 2013

Images based on CT and MRI data, to be viewed in crossview technique. See entire gallery HERE

Female Orgasm in Brodmann Brain Regions

July 23, 2012

The human brain can be separated into regions based on structure and function – vision, audition, body sensation, etc. – a scheme known as the Brodmann area map.

This animation shows functional magnetic resonance imaging (fMRI) data from a participant experiencing an orgasm, and the corresponding activity across these different regions, inferred from oxygen utilization in the blood. Twenty snapshots of the fMRI data are taken from a 7-minute sequence. Over the course of the 7 minutes, the participant approaches orgasm, reaches orgasm and then enters a quiet period.

Oxygen utilization levels are displayed on a spectrum from dark red (lowest activity) to yellow/white (highest). As can be observed, an orgasm leads to almost the entire brain illuminating yellow, indicating that most brain systems become active at orgasm.
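A color ramp like the one described – dark red for low activity climbing through orange and yellow to near-white at peak – is a standard heat-map trick. A minimal sketch of one way such a ramp might be implemented (the breakpoints and RGB endpoints here are illustrative assumptions, not the values The Visual MD used):

```python
def activity_to_rgb(a):
    """Map normalized activity a in [0, 1] onto a dark-red -> yellow/white ramp.

    Low values render as dark red, mid values as bright orange,
    and the highest values approach yellow/white.
    """
    a = max(0.0, min(1.0, a))  # clamp out-of-range activity values
    if a < 0.5:
        # dark red (0.3, 0, 0) toward bright orange (1.0, 0.5, 0)
        t = a / 0.5
        return (0.3 + 0.7 * t, 0.5 * t, 0.0)
    # bright orange toward yellow/white (1.0, 1.0, 1.0)
    t = (a - 0.5) / 0.5
    return (1.0, 0.5 + 0.5 * t, t)
```

Applied voxel by voxel, a ramp like this makes the "whole brain lights up yellow" moment visible at a glance: at peak activity nearly every region maps to the bright end of the scale.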

Text and Image by The Visual MD

Via The Guardian

Informed consent: A broken contract

June 24, 2012

Late in May, the direct-to-consumer gene-testing company 23andMe proudly announced the impending award of its first patent. The firm’s research on Parkinson’s disease, which used data from several thousand customers, had led to a patent on gene sequences that contribute to risk for the disease and might be used to predict its course. Anne Wojcicki, co-founder of the company, which is based in Mountain View, California, wrote in a blog post that the patent would help to move the work “from the realm of academic publishing to the world of impacting lives by preventing, treating or curing disease”.

Some customers were less than enthusiastic. Holly Dunsworth, for example, posted a comment two days later, asking: “When we agreed to the terms of service and then when some of us consented to participate in research, were we consenting to that research being used to patent genes? What’s the language that covers that use of our data? I can’t find it.”

The language is there, in both places. To be fair, the terms of service is a bear of a document — the kind one might quickly click past while installing software. But the consent form is compact and carefully worded, and approved by an independent review board to lay out clearly the risks and benefits of participating in research. “If 23andMe develops intellectual property and/or commercializes products or services, directly or indirectly, based on the results of this study, you will not receive any compensation,” the document reads.

Excerpt from an article written by Erika Check Hayden at Nature. Continue HERE

Turning Scientific Perplexity into Ordinary Statistical Uncertainty

June 4, 2012

To show how statistical models are built, the authors of Principles of Applied Statistics use a study on the relation between diabetes control and efforts to explain the disease to patients. The relevant variables (a) are baseline variables such as education, gender and duration of disease; attribution (how individuals conceive their responsibility in managing and treating the disease); knowledge about the disease; and the outcome, a measure of how successfully individuals control glucose levels. Defining the relation between each pair of variables creates a regression chain, a sequence in which the variables in a given box depend on all those in the preceding boxes. After analyzing the data, a simpler version (b) can be proposed: Control of glucose levels depends on knowledge about the disease, which depends on baseline variables; baseline variables also affect glucose control directly. Probability, the authors write, “is used to represent, possibly in highly idealized form, a phenomenon in the real world. As such it is not essentially different from concepts like mass, force and energy.”
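The regression chain described above can be sketched in a few lines: each variable is regressed only on the variables earlier in the chain. A minimal simulation (the coefficients, noise levels, and variable names are illustrative assumptions, not values from the diabetes study):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Hypothetical chain mirroring the simplified model (b):
# baseline -> knowledge -> glucose control, plus a direct
# baseline -> control path.
baseline = rng.normal(size=n)                    # e.g. education, disease duration
knowledge = 0.8 * baseline + rng.normal(scale=0.5, size=n)
control = 0.6 * knowledge + 0.3 * baseline + rng.normal(scale=0.5, size=n)

def ols(y, *xs):
    """Least-squares coefficients of y on the given regressors."""
    X = np.column_stack(xs)
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

# Fit the chain in order: each box depends only on preceding boxes.
b_know = ols(knowledge, baseline)           # knowledge ~ baseline
b_ctrl = ols(control, knowledge, baseline)  # control ~ knowledge + baseline

print("knowledge ~ baseline:", b_know)
print("control ~ knowledge + baseline:", b_ctrl)
```

With enough data the fitted coefficients recover the generating ones, which is the sense in which the chain of regressions represents, "possibly in highly idealized form, a phenomenon in the real world."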

D. R. Cox published his first major book, Planning of Experiments, in 1958; he has been making major contributions to the theory and practice of statistics for as long as most current statisticians have been alive. He is now in a reflective phase of his career, and this book, coauthored with the distinguished biostatistician Christl A. Donnelly, is a valuable distillation of his experience of applied work. It stands as a summary of an entire tradition of using statistics to address scientific problems.

Excerpt from a text by Cosma Shalizi at American Scientist. Continue HERE