# Day 0: Introduction: what is an image?

February 13 2015

Download day0 slides

## Homework

Install requirements on your machine – see Software to install for the class.

## An introduction to the course

• Replication crisis;
• The ubiquity of error;
• Scientists as coders;
• Best practice;
• The benefits of fluency;
• Simple first, then easy;
• Why Python?
• Why git?

• Around 30 minutes of homework;

• Homework ready by end of day Monday for the following Friday's class;

• Each class consists of:

  • 5-minute debrief of the previous class;
  • 30-minute talk and introduction to the problems;
  • 60 minutes working on the problems;
  • 10 minutes reviewing;

• Office hours.

## Introduction to the exercises

• Open a terminal (Mac, Linux) or Git Bash (Windows);
• Make a directory pna with `mkdir pna`;
• Change directory into pna with `cd pna`;
• `git clone git://github.com/practical-neuroimaging/pna2015.git`
• `cd pna2015`
• `ls`

You should see a directory listed called day0.

• `cd day0`
• Start the IPython notebook with `ipython notebook`;
• Select and open the “what_is_an_image” notebook.
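The steps above can be collected into a single shell session. This is a sketch, assuming `git` and the IPython notebook are already installed (see the homework) and that the repository URL from the instructions is still valid:

```shell
# Make a working directory for the class and enter it
mkdir pna
cd pna

# Clone the class repository (read-only git:// URL from the instructions above)
git clone git://github.com/practical-neuroimaging/pna2015.git
cd pna2015

# List the contents; you should see a directory called "day0"
ls

# Enter day0 and start the IPython notebook server,
# then open the "what_is_an_image" notebook in the browser tab that appears
cd day0
ipython notebook
```

If `git clone` fails with the `git://` protocol (some networks block it), the `https://` form of the same URL is a common fallback.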

## Quotes from David Donoho

Donoho quote 0:

“The scientific method’s central motivation is the ubiquity of error – the awareness that mistakes and self-delusion can creep in absolutely anywhere and that the scientist’s effort is primarily expended in recognizing and rooting out error.” [donoho2009reproducible]

Donoho quote 1:

“In stark contrast to the sciences relying on deduction or empiricism, computational science is far less visibly concerned with the ubiquity of error. At conferences and in publications, it’s now completely acceptable for a researcher to simply say, ‘here is what I did, and here are my results.’ Presenters devote almost no time to explaining why the audience should believe that they found and corrected errors in their computations. The presentation’s core isn’t about the struggle to root out error — as it would be in mature fields — but is instead a sales pitch: an enthusiastic presentation of ideas and a breezy demo of an implementation. Computational science has nothing like the elaborate mechanisms of formal proof in mathematics or meta-analysis in empirical science. Many users of scientific computing aren’t even trying to follow a systematic, rigorous discipline that would in principle allow others to verify the claims they make. How dare we imagine that computational science, as routinely practiced, is reliable!” [donoho2009reproducible]

Donoho quote 2:

“In my own experience, error is ubiquitous in scientific computing, and one needs to work very diligently and energetically to eliminate it. One needs a very clear idea of what has been done in order to know where to look for likely sources of error. I often cannot really be sure what a student or colleague has done from his/her own presentation, and in fact often his/her description does not agree with my own understanding of what has been done, once I look carefully at the scripts. Actually, I find that researchers quite generally forget what they have done and misrepresent their computations.

Computing results are now being presented in a very loose, “breezy” way—in journal articles, in conferences, and in books. All too often one simply takes computations at face value. This is spectacularly against the evidence of my own experience. I would much rather that at talks and in referee reports, the possibility of such error were seriously examined.” [donoho2010invitation]

[donoho2009reproducible] Donoho, David L., et al. 2009. Reproducible research in computational harmonic analysis. Computing in Science & Engineering 11, 8–18. http://www.stanford.edu/~vcs/papers/ReproducibleResearch20080811.pdf

[donoho2010invitation] Donoho, David L. 2010. An invitation to reproducible computational research. Biostatistics 11, 385–388. http://biostatistics.oxfordjournals.org/content/11/3/385.full