
Monday, March 1, 2010


General: Mathematical equations

You can easily insert mathematical equations into your documents. Here's how:

1. Click the Insert drop-down menu and select Equation. The equation editor dialog box appears.

[Image: the equation editor dialog]
2. Select the mathematical symbol you want to add from one of these menus:
* Greek letters
* General operators
* Comparison (and inclusion) operators
* Operators with variables
* Arrows
3. Click the symbol you'd like to include, and add numbers or substitute variables in the editor box. You'll also see a preview of the equation below the text area.
4. Click Insert equation to add the equation to your document.

If you'd like to edit the equation afterwards, simply click the equation within the document and click the edit link.

[Image: the edit equation link]

Fwd: Engines of Creation 2.0

---------- Forwarded message ----------
From: technologiclee <technologic...@gmail.com>
Date: Feb 16, 6:26 am
Subject: Engines of Creation 2.0
To: Open Manufacturing


Engines of Creation 2.0 is available to read online for free, or as
a .pdf download for $0.99

http://www.wowio.com/users/product.asp?BookId=503

Fwd: Metamodern How to study for a career in nanotechnology



---------- Forwarded message ----------
From: Newsfeed to Email Gateway <emlynoregan@gmail.com>
Date: Tue, Feb 23, 2010 at 7:01 PM
Subject: Metamodern (1 new item)
To: technologiclee@gmail.com


Metamodern (1 new item)

Item 1 (02/24/10 00:02:19 UTC): How to study for a career in nanotechnology

Students often ask me for advice on how to study for a career in nanotechnology, and as you might imagine, providing a good answer is challenging. "Nanotechnology" refers to a notoriously broad range of areas of science and technology, and progress during a student's career will open new areas, some of them yet to be imagined. Choices within this complex and changing field should reflect a student's areas of interest and ability, current background, level of ambition, and willingness to accept risk — there is a trade-off between pioneering new directions and seeking a secure career path.

Here is an attempt to give a useful answer that takes account of these unknowns. My advice centers on fundamentals, outlining areas of knowledge that are universally important, and offering suggestions for how to approach both specialized choices and learning in general. It includes observations about the future of nanotechnology, the context for future careers.

Learn the fundamentals, and not just in science

The most basic requirement for competence in any physical technology is a broad and solid understanding of the underlying physical sciences. Mathematics is the foundation of this foundation, and basic physics is the next layer. Classical mechanics and electromagnetics are universally important, and the concerns of nanotechnology elevate the importance of thermodynamics, statistical mechanics, and molecular quantum mechanics. A flexible competence in nanotechnology also requires a sound understanding of chemistry and chemical synthesis, of biomolecular structure and function, of intermolecular forces, and of solids and surfaces.

These are important areas of science, but science is not technology. As I've discussed in "The Antiparallel Structures of Science and Engineering", science and engineering are in a deep sense opposites, and must not be confused. Nanotechnology today is a science-intensive area of engineering, largely because the problem of designing a nanostructure is often overshadowed by the problem of finding, by experiment, a way to make it.

This has implications for choosing a course of study.

Engineering and progress in nanotechnology

A measure of progress in nanotechnology is growth of the range of physical systems that can be designed and debugged without extensive experimentation. As a basis for implementing nanoscale digital systems, commercial semiconductor fabrication provides a predictable design domain of this sort, and some areas of structural DNA nanotechnology have become almost as predictable as carpentry.

Computational tools are in a class of their own, an area of immaterial technology that applies to every area of material technology. It's important to understand the capabilities and limitations of these tools, and extending them makes a strategic contribution to progress. Computational tools are often the key to transforming reproducible processes and stable structures into reliable operations and building blocks for engineering. Today, better design tools are the key to unlocking the enormous potential of foldamers and self assembly as a basis for implementing complex nanosystems.

Competence in engineering — and understanding how science can support it — requires study of design principles and experience in solving design problems. As with physics, some lessons apply across many domains. Because nanotechnology relies on innovations in macro- and micro-scale equipment, engineering education has immediate and strong relevance. Looking forward, the growth of nanosystems engineering will open increasing opportunities for researchers with backgrounds that provide both the scientific knowledge necessary to understand new nanotechnologies and the engineering problem-solving abilities necessary to exploit them.

Students aiming to pioneer in directions that can open new worlds of nanotechnology should learn enough of both science and engineering to solve crucial problems at the interface between them. The most important of these is the problem of recognizing and developing the means for systematic engineering in new domains, extracting solid toolsets from the flood of novelty-oriented nanoscience.

In considering all of the above, keep in mind that the general direction of nanotechnology leads toward greater precision at the level of nanoscale components, making products of increasing complexity and size, implemented in an increasing range of materials. Molecular-level atomic precision has widespread applications in nanotechnology today, and already provides components with the ultimate precision at the smallest possible length scale. I expect that the road forward will increasingly focus on extending these atomically precise technologies toward greater scale, complexity, and materials quality. I recommend courses of study that prepare for this.

Choosing topics and ways to study them

In both science and engineering, a good methodology for selecting an ideal course of study would be to survey a course catalog and note which classes appear in lists of prerequisites for advanced classes in relevant areas of science and engineering. This indicates areas where it is important to study and master the content.
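As a rough sketch of this prerequisite-counting idea, here is a small script over an invented course catalog (all course names and prerequisite links below are hypothetical); it ranks each course by how many later courses depend on it, directly or indirectly:

```python
# Minimal sketch of the prerequisite-counting methodology described above,
# using a small, invented course catalog for illustration only.

catalog = {
    # course: list of prerequisites
    "Calculus":                 [],
    "Classical Mechanics":      ["Calculus"],
    "Electromagnetics":         ["Calculus", "Classical Mechanics"],
    "Statistical Mechanics":    ["Classical Mechanics"],
    "Quantum Mechanics":        ["Classical Mechanics", "Electromagnetics"],
    "Organic Chemistry":        [],
    "Biomolecular Structure":   ["Organic Chemistry"],
    "Surface Science":          ["Statistical Mechanics", "Quantum Mechanics"],
    "Molecular Nanotechnology": ["Statistical Mechanics", "Quantum Mechanics",
                                 "Biomolecular Structure", "Surface Science"],
}

def prerequisite_closure(course, catalog):
    """All courses reachable by following prerequisite links from `course`."""
    seen = set()
    stack = list(catalog.get(course, []))
    while stack:
        c = stack.pop()
        if c not in seen:
            seen.add(c)
            stack.extend(catalog.get(c, []))
    return seen

# Count how many other courses (transitively) depend on each course.
dependents = {c: 0 for c in catalog}
for course in catalog:
    for prereq in prerequisite_closure(course, catalog):
        dependents[prereq] += 1

for course, n in sorted(dependents.items(), key=lambda kv: -kv[1]):
    print(f"{course:26s} required (directly or indirectly) by {n} courses")
```

Courses near the top of such a ranking are the ones worth studying for mastery; courses near the bottom are candidates for the survey-level mode of study discussed below.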

Courses toward the periphery of this network of prerequisites are good candidates for a different mode of study, a mode aimed at understanding the problems an area addresses, the methods used to solve them, and how those problems and methods fit in with the rest of science and technology. I discuss this mode of study in "How to Learn About Everything". It builds knowledge of a kind that can help a student choose topics that call for deeper, focused learning, and it can later help greatly in practical work — scientists and engineers with broader knowledge will see more opportunities and encounter fewer unanticipated problems. These advantages mean fewer days (months, years) lost and greater strides forward.

Choosing institutions

Beyond topics of study, I'm also asked to recommend universities and programs. It's difficult to give a specific answer, because a good choice depends on all of the above, and because for each of many areas of science and technology, there are many possible institutions, programs, and research groups. I can only advise that students facing this decision first consider their objectives, and then look for institutions and people able to help them get there. In particular, universities must either offer a degree program that fits, or provide the flexibility to make one. I found a home in MIT's Interdisciplinary Science Program (which I can't recommend, because it no longer exists).

In undergraduate studies, the general breadth, orientation, and quality of a school is more important than any focused undergraduate program that it is likely to have.

Early involvement in research of almost any kind has a special value: It can provide knowledge of kinds that can't be learned from reading, from classes, or even from lab courses. Pay special attention to research that studies atomically precise structures of significant size and complexity. If that research has an engineering component — designing and making things — so much the better.



Fwd: Metamodern Chemists deserve more credit: Atoms, Einstein, and the Matthew Effect



---------- Forwarded message ----------
From: Newsfeed to Email Gateway <emlynoregan@gmail.com>
Date: Wed, Feb 17, 2010 at 7:33 PM
Subject: Metamodern (1 new item)
To: technologiclee@gmail.com


Metamodern (1 new item)

Item 1 (02/17/10 23:41:52 UTC): Chemists deserve more credit: Atoms, Einstein, and the Matthew Effect

[Image: Johann Josef Loschmidt, chemist and atomic scientist]

Chemists understood the atomic structure of molecules in the 1800s, yet many say that Einstein established the existence of atoms in a paper on Brownian motion, "Die von der molekularkinetischen Theorie der Wärme geforderte Bewegung von in ruhenden Flüssigkeiten suspendierten Teilchen", published in 1905.

This is perverse, and has seemed strange to me ever since I began reading the history of organic chemistry. Chemists often don't get the credit they deserve, and this provides an outstanding example.

For years, I've read statements like this:

[Einstein] offered an experimental test for the theory of heat and proof of the existence of atoms….
["The Hundredth Anniversary of Einstein's Annus Mirabilis"]

Perhaps this was so for physicists in thrall (or opposition) to the philosophical ideas of another physicist, Ernst Mach; he had odd convictions about the relationship between primate eyes and physical reality, and denied the reality of invisible atoms.

Confusion among physicists, however, gives reason for more (not less!) respect for the chemists who had gotten the facts right long before, and in more detail: that matter consists of atoms of distinct chemical elements, that the atoms of different elements have specific ratios of mass, and that molecules consist not only of groups of atoms, but of atoms linked by bonds ("Verwandtschaftseinheiten") to form specific structures.

When I say "more detail", I mean a lot more detail than merely inferring that atoms exist. For example, organic chemists had deduced that carbon atoms form four bonds, typically (but not always) directed tetrahedrally, and that the resulting molecules can as a consequence have left- and right-handed forms.

The chemists' understanding of bonding had many non-trivial consequences. For example, it made the atomic structure of benzene a problem, and made a six-membered ring of atoms with alternating single and double bonds a solution to that problem. Data regarding chemical derivatives of benzene indicated a further problem, leading to the inference that the six bonds are equivalent. Decades later, quantum mechanics provided the explanation.

The evidence for these detailed and interwoven facts about atoms included a range of properties of gases, the compositions of compounds, the symmetric and asymmetric shapes of crystals, the rotation of polarized light, and the specific numbers of chemically distinct forms of molecules with related structures and identical numbers of atoms.

And chemists not only understood many facts about atoms, they understood how to make new molecular structures, pioneering the subtle methods of organic synthesis that are today an integral part of the leading edge of atomically precise nanotechnology.

All this atom-based knowledge and capability was in place, as I said, before 1900, courtesy of chemical research by scientists including Dalton, van 't Hoff, Kekulé, and Pasteur.

But was it really knowledge?

By "knowledge", I don't mean to imply that universal consensus had been achieved at the time, or that knowledge can ever be philosophically and absolutely certain, but I think the term fits:

A substantial community of scientists had a body of theory that explained a wide range of phenomena, including the many facets of the kinetic theory of gases, a host of chemical transformations, and more. That community of scientists grew and progressively elaborated this body of atom-based theory and technology up to the present day, and it was confirmed, explained, and extended by physics along the way.

Should we deny that this constituted knowledge, brush it all aside, and credit 20th century physics with establishing that atoms even exist? As I said: perverse.

But what about quantitative knowledge?

There is a more modest claim for Einstein's 1905 paper:

…the bridge between the microscopic and macroscopic world was built by A. Einstein: his fundamental result expresses a macroscopic quantity — the coefficient of diffusion — in terms of microscopic data (elementary jumps of atoms or molecules).
["One and a Half Centuries of Diffusion: Fick, Einstein, Before and Beyond"]

This claim for the primacy of physics also seems dubious. An Austrian chemist, Johann Josef Loschmidt, had already used macroscopic data to deduce the size of molecules in a gas. He built this quantitative bridge in a paper, "Zur Grösse der Luftmoleküle", published in 1865.


I had overlooked Loschmidt's accomplishment before today. I knew of Einstein's, though, and of a phenomenon that the sociologists of science call the Matthew Effect.

Fwd: Metamodern Cell-free biology



---------- Forwarded message ----------
From: Newsfeed to Email Gateway <emlynoregan@gmail.com>
Date: Thu, Feb 11, 2010 at 9:07 PM
Subject: Metamodern (1 new item)
To: technologiclee@gmail.com


Metamodern (1 new item)

Item 1 (02/12/10 00:09:31 UTC): Cell-free biology

[Image: cork cells, from Hooke's Micrographia; the original "cells" (courtesy of Robert Hooke)]

Synthetic biology doesn't require cells, and in several ways, cells are liabilities.

Cells can make engineering difficult. Cell membranes and bacterial walls stand between new genes and the machinery needed to transcribe and translate them. They are barriers to liberating gene products. They contain systems that are complex products of eons of evolutionary history, not systems streamlined to simplify engineering. They are easily poisoned by what would be, to us, useful raw materials and products.

The state of the art in cell-free synthetic biology is already advanced, and moving forward rapidly:

Time and again, decreasing the dependence on cells has increased engineering flexibility with biopolymers and self-copying systems….

Current in vitro methods for synthesizing proteins and evolving protein, nucleic acid, and small-molecule ligands will be improved to accelerate production of new reagents, diagnostics, and drugs. New methods will be developed for synthesizing circular DNAs, modified RNAs, proteins containing unnatural amino acids, and liposomes.


Forster and Church, "Synthetic biology projects in vitro".

A glimpse of some recent developments:

Cell-free systems offer a unique platform for expanding the capabilities of natural biological systems for useful purposes, i.e. synthetic biology. They reduce complexity, remove structural barriers, and do not require the maintenance of cell viability. Cell-free systems, however, have been limited by their inability to co-activate multiple biochemical networks in a single integrated platform. Here, we report the assessment of biochemical reactions in an Escherichia coli cell-free platform designed to activate natural metabolism, the Cytomim system….


Jewett et al., "An integrated cell-free metabolic platform for protein production and synthetic biology".

Networks of productive molecular machine systems need not be packaged in discrete, self-replicating units — not even when they start out that way.

Fwd: Metamodern Exploiting strong, covalent bonds for self assembly of robust nanosystems



---------- Forwarded message ----------
From: Newsfeed to Email Gateway <emlynoregan@gmail.com>
Date: Sat, Feb 6, 2010 at 2:51 AM
Subject: Metamodern (1 new item)
To: technologiclee@gmail.com


Metamodern (1 new item)

Item 1 (02/06/10 07:37:14 UTC): Exploiting strong, covalent bonds for self assembly of robust nanosystems

[Image: a covalent organic framework, from "Porous, Crystalline, Covalent Organic Frameworks", Côté et al.]


Atomically precise self-assembly of complex structures can be engineered by providing for multiple binding interactions that

  1. Cooperate to stabilize the correct configuration, in a thermodynamic sense, and
  2. Do not stabilize any other configuration, in a kinetic sense

Roughly speaking, in the correct configuration, the parts fit together to allow all the binding interactions to operate simultaneously, and the system doesn't get stuck in other configurations. It's easy to see how weak interactions and cooperative binding can implement these conditions, but there are alternatives.
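A deliberately crude way to picture these two conditions is to assign free energies to candidate configurations and check that the target is the unique minimum by a wide margin, while no off-target configuration is deep enough to act as a trap. The energies and thresholds in this sketch are invented, illustrative numbers, not output from any real model:

```python
# Toy check of the two design conditions above: the target configuration
# should be the unique free-energy minimum by a comfortable margin
# (thermodynamic condition), and no off-target configuration should be deep
# enough to act as a kinetic trap. All numbers below are illustrative.

KT_300K = 0.596  # kcal/mol at ~300 K

def assembly_looks_designable(energies, target,
                              min_gap=10 * KT_300K,
                              trap_depth=5 * KT_300K):
    """energies: dict mapping configuration name -> free energy (kcal/mol),
    relative to the fully dissociated state at 0.0."""
    e_target = energies[target]
    off_target = {k: v for k, v in energies.items() if k != target}
    # Thermodynamic condition: target well below every other configuration.
    thermodynamically_ok = all(v - e_target >= min_gap for v in off_target.values())
    # Kinetic condition (crudely): no off-target state much below dissociation.
    kinetically_ok = all(v > -trap_depth for v in off_target.values())
    return thermodynamically_ok and kinetically_ok

configs = {"target": -25.0, "misfolded_A": -2.0, "misfolded_B": -1.0}
print(assembly_looks_designable(configs, "target"))  # True for these numbers
```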

As I've discussed elsewhere, recent advances in biomimetic self assembly based on peptide and nucleic acid polymers provide a platform for developing complex, functional self-assembled systems, and in the right environments, some of these structures can be surprisingly robust. However, most of their characteristic binding interactions (hydrogen bonds, hydrophobic interactions, van der Waals interactions in well-packed structures, etc.) are weak in terms of both binding energy and mechanical strength.

Protein structures, however, often include disulfide bonds (R1–S–S–R2), and these are covalent and strong. Their role in protein folding illustrates a key point:

Binding interactions in self-assembly must be labile,
but "labile" need not imply "weak."

Disulfide bonds can shuffle among different pairings through thiol/disulfide exchange,

R1–S⁻ + R2–S–S–R3  ⇌  R1–S–S–R2 + R3–S⁻,

a process that can be fast in the presence of R–S⁻ ions. A well-folded structure will strongly favor correct pairings by holding a momentarily displaced R–S⁻ in a position to reform the bond. In thermodynamic terms, this decreases the entropy cost of the bond-forming reaction, and in kinetic terms, it increases the effective concentration that drives the forward reaction, typically accelerating it by a large factor (> 10³). Exchange can be shut off by decreasing pH or removing free thiols from the folding environment.

The formation and hydrolysis of boronate esters can play a similar role in artificial self-assembling systems. A sample chapter from Boronic Acids (2005, posted by Wiley-VCH Verlag) provides an extensive discussion of the chemistry of boronic acid derivatives; it notes that boronic acids (at high pH, as hydroxyboronate anions) react with diols to form boronate esters with forward rate constants in the 10³–10⁴ M⁻¹s⁻¹ range. Hydrolysis is likewise fast. Boronate esters can be stabilized by reducing pH or removing water. They, and boronic acids, are generally biocompatible, and have even been developed as drugs, where they serve to bind carbohydrate moieties.
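Some back-of-envelope arithmetic ties these two paragraphs together; the rate constant comes from the quoted range, while the concentrations are illustrative assumptions rather than values from the sources:

```python
# Back-of-envelope kinetics for the exchange chemistries discussed above.
# The rate constant is from the quoted range; concentrations are assumed.
import math

# Boronate ester formation: pseudo-first-order rate at an assumed diol
# concentration, using a second-order rate constant in the quoted range.
k2 = 1.0e3           # M^-1 s^-1, low end of the quoted 10^3 - 10^4 range
diol = 1.0e-3        # M, assumed
k_obs = k2 * diol    # s^-1
print(f"ester formation half-life ~ {math.log(2) / k_obs:.2f} s")   # ~0.69 s

# Thiol/disulfide exchange: a tethered thiolate held next to its partner
# behaves like a high local ("effective") concentration, so intramolecular
# re-formation outruns intermolecular exchange with free thiol in solution.
c_effective = 1.0       # M, order of magnitude for a well-held group (assumed)
c_free_thiol = 1.0e-4   # M, assumed bulk concentration
print(f"acceleration factor ~ {c_effective / c_free_thiol:.0e}")    # ~1e4, i.e. > 10^3
```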

Here are some recent papers on self assembled systems that discuss boronic acid chemistry, along with other covalent chemistries of similar utility:

And a dissertation:


Self assembly need not be biomimetic.



Fwd: Metamodern (8 new items)



---------- Forwarded message ----------
From: Newsfeed to Email Gateway <emlynoregan@gmail.com>
Date: Thu, Jan 28, 2010 at 7:13 PM
Subject: Metamodern (8 new items)
To: technologiclee@gmail.com


Metamodern (8 new items)

Item 1 (01/28/10 20:50:48 UTC): Self assembly and nanomachines: Complexity, motion, and computational control

A commenter on the previous post raised several important issues, and my reply grew into this post. The comment is here, and my reply follows:


@ Eniac — Thanks, you raise several important questions.

Regarding readiness to build extended, self assembling structures, yes, I think that the existing fabrication abilities (that is, the range of molecular structures that can be synthesized) are now more than adequate. The bottleneck is design software, including the development of rules that adequately (not perfectly) predict whether a given design satisfies a range of constraints. These include synthesis, stability, solubility, and sufficiently strong net binding interactions.
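To make "rules that adequately predict whether a given design satisfies a range of constraints" concrete, here is a sketch of what such a rule-based screen could look like; the rule names, thresholds, and design fields are hypothetical placeholders, not the output of any real modeling package:

```python
# Sketch of rule-based screening for candidate self-assembly designs.
# Fields, rules, and thresholds are hypothetical; real design software
# would compute these quantities from sequence and structure.

from dataclasses import dataclass

@dataclass
class CandidateDesign:
    synthesizable_fraction: float    # fraction of monomers with known syntheses
    predicted_stability_kcal: float  # folding/assembly free energy, kcal/mol
    predicted_solubility_mg_ml: float
    net_binding_kcal: float          # inter-block binding free energy, kcal/mol

RULES = [
    ("synthesis",  lambda d: d.synthesizable_fraction >= 0.95),
    ("stability",  lambda d: d.predicted_stability_kcal <= -5.0),
    ("solubility", lambda d: d.predicted_solubility_mg_ml >= 1.0),
    ("binding",    lambda d: d.net_binding_kcal <= -10.0),
]

def screen(design):
    """Return the names of rules the design fails; an empty list means 'pass'."""
    return [name for name, ok in RULES if not ok(design)]

d = CandidateDesign(0.98, -7.2, 3.5, -12.0)
print(screen(d) or "passes all rules")
```

The point of such rules is exactly as stated above: they need to be adequate, not perfect, to narrow the design space before experiments.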

As for specifying face combinations that would result in unique binding, this becomes easier with increasing face size, and more difficult with the number of simultaneously exposed faces. Hierarchical assembly can address both of these, but the most practical schemes require the ability to convert reversible binding interactions into irreversible ones. One approach is to introduce covalent linkages after assembly of the intermediate blocks lower in the hierarchy of sizes. There are several ways to do this.
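A rough counting argument shows why unique binding gets easier with face size and harder with the number of simultaneously exposed faces. Model each face as a patch of n sites drawn from an alphabet of k distinguishable features; the numbers below are illustrative assumptions:

```python
# Rough counting argument for face uniqueness. Site counts, feature
# alphabet, and the mismatch threshold are illustrative assumptions.

from math import comb

k = 4    # distinguishable features per site (assumed)
n = 16   # sites per face (assumed)

distinct_faces = k ** n
print(f"possible face patterns: {distinct_faces:.2e}")   # grows rapidly with n

# Crude cross-talk estimate: chance that two random faces agree on at least
# m of n sites (enough partial matching could cause mis-binding).
def p_match_at_least(m, n, k):
    p = 1.0 / k
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(m, n + 1))

faces_in_system = 20   # simultaneously exposed face types (assumed)
pairs = comb(faces_in_system, 2)
p_bad_pair = p_match_at_least(12, n, k)          # >= 12/16 sites matching
print(f"expected spurious near-matches: {pairs * p_bad_pair:.2e}")
```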

The problem of enabling motion between self-assembled components can be addressed at the level of interactions between assemblies that are held together by (for example) a combination of large-scale complementary shapes and non-contact colloidal binding interactions.

Flexible hinges in self-assembled structures are also practical, as shown by natural systems. Protein engineers have successfully designed structures that undergo conformational switching.


Downstream, there's a continuum of assembly approaches that spans the range from free Brownian motion through constrained Brownian motion to more macro-machine-like devices (discussed in "From Self-Assembly to Mechanosynthesis" and "Motors, Brownian Motors, and Brownian Mechanosynthesis").


You are right that the relative sizes of machines for manipulating matter and for manipulating information become similar (or reversed) at the nanoscale, relative to what we are familiar with in today's macro-machine, micro-computer world. The resulting design constraints can be met by various combinations of several techniques, including

  • Offloading computation to conventional computers that direct what would typically be large numbers of nanosystems (a good early solution).
  • The same single-computer / multiple machine approach with nanosystems for both operations.
  • Extensive use of hard automation, in which repetitive operations require no computation at all.

Regarding the last point above, this is how high-throughput manufacturing works today. I've discussed this in posts with videos of machines in action: "High-Throughput Nanomanufacturing: Small Parts" and "High-Throughput Nanomanufacturing: Assembly," with a more quantitative discussion of "molecular mills" on E-drexler.com.


Item 2 (01/25/10 09:05:23 UTC): Self-assembling nanostructures: Building the building blocks

This post is prompted by a set of interrelated advances in chemistry that hold great promise for advancing the art of atomically precise fabrication. In this post, I'll describe an emerging class of modular synthesis methods for making a diverse set of small, complex molecular building blocks.

The road to complex self-assembled nanosystems starts with stable molecular building blocks, and the more choices, the better. Self-assembly and the folding of foldamers are similar processes: They work when parts fit together well, and in just one way. Having building blocks to choose from at the design stage will typically make possible a better fit, resulting in a denser, more stable structure.

Building blocks for building blocks for building blocks

I often think in terms of four levels of molecular assembly:

  • Specialized covalent chemistry to synthesize monomers
    (~1 nm)
  • Modular covalent chemistry to link monomers to make oligomers
    (~10 nm length)
  • Intramolecular self-assembly (folding) to make 3D objects
    (< 10 nm diameter)
  • Intermolecular self-assembly to make functional systems
    (~10–1000 nm)

Recent developments are blurring the first level into the second, however, because new modular chemistries can make complex structures that can serve as monomers at the next level of assembly. Perhaps the most outstanding example comes from Marty Burke's lab, which has pioneered a new, combinatorial methodology for piecing together small molecules of enormous diversity. From the lab website:

To most effectively harness the potential impact of complex small molecules on both science and medicine, it is critical to maximize the simplicity, efficiency, and flexibility with which these types of compounds can be synthesized in the laboratory.

…the process of peptide synthesis is routinely automated. As a result, this highly enabling methodology is accessible to a broad range of scientists. In sharp contrast, the laboratory synthesis of small molecules remains a relatively complex and non-systematized process. We are currently developing a simple and highly modular strategy for making small molecules which is analogous to peptide synthesis…

Our long term goal is to create a general and automated process for the simple and flexible construction of a broad range of complex small molecules, thereby making this powerful discovery engine widely accessible, even to the non-chemist.

In outline, the Burke group's method exploits iterative Suzuki-Miyaura coupling, a mild and increasingly general technique in which (in Burke's approach) carbon-carbon bond formation plays the role of amide bond formation in making peptides. In peptide synthesis, suitably-protected amino acids are iteratively coupled, deprotecting the terminal amine at each step. In Burke's method, suitably-protected boronic acids play the analogous role.

The key advance is the N-methyliminodiacetic acid (MIDA) protecting group, a trivalent ligand that rehybridizes the boron center from sp2 to sp3, thereby filling and blocking access to the open p orbital that makes trivalent boron compounds so wonderfully, gently reactive. The resulting complex is stable to a wide range of aggressive conditions, including powerful oxidants and strong acids. It can be removed, however, by an aqueous base (e.g., sodium bicarbonate in water).
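The couple/deprotect/repeat cycle can be pictured schematically as a loop; the building blocks and the two functions below are placeholders standing in for chemistry, not calls to any real cheminformatics library:

```python
# Schematic rendering of the iterative coupling cycle described above
# (couple, then deprotect, then repeat). Strings and functions are
# placeholders for chemistry, not a real chemistry API.

def suzuki_couple(chain, block):
    """Form a new C-C bond between the chain's free boronic acid and the
    incoming block's halide end; the block's own boronic acid stays masked
    as a MIDA boronate."""
    return chain + [block]

def remove_mida(chain):
    """Unmask the terminal MIDA boronate (in practice: mild aqueous base),
    exposing a reactive boronic acid for the next coupling."""
    chain[-1] = chain[-1].replace("-B(MIDA)", "-B(OH)2")
    return chain

blocks = ["A-B(MIDA)", "B-B(MIDA)", "C-B(MIDA)"]   # hypothetical building blocks

product = ["start-B(OH)2"]
for block in blocks:
    product = suzuki_couple(product, block)   # couple at the free boronic acid
    product = remove_mida(product)            # deprotect for the next cycle

print(" -> ".join(product))
```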

For more information, good places to start are the Burke lab's research overview page, and the MIDA boronate technology spotlight page at Sigma-Aldrich, which also provides off-the-shelf MIDA-protected building blocks. Sigma-Aldrich offers a larger universe of boronic acids and boronic esters, as does CombiPhos Catalysts. It's worth looking through one of these documents to get a gut sense of what's now available. Impressive diversity, compared to the 20 standard amino acid side chains.

(For a general perspective on this direction of development, see "Controlled Iterative Cross-Coupling: On the Way to the Automation of Organic Synthesis", Angew. Chem. Int. Ed. 2009.)

More than a protecting group

The MIDA boronate ester is an example of a broader class of structures that are important in their own right. The demands of organic synthesis have brought forth a vast range of commercially available boronate esters (see links above), and this investment gives a free ride to scientists aiming to exploit them as building blocks. As linkers for self-assembled structures, boronate esters are both extraordinary and underexploited.

Relying a little less on hydrogen bonds, and a little more on bonds that can hold a self-assembled solid together at 600°C — dull red heat — could increase the robustness of self-assembled products. A fast, reversible, aqueous, biocompatible boron chemistry is what opens the door.

More later.




Item 3 (01/24/10 11:16:31 UTC): Boronate esters, Suzuki coupling, self-assembly, design software, etc.

[Image: boronate + amine binding, … + 2 H2O, reversibly]

I've been exploring some recent developments in chemical synthesis and self-assembly that suggest attractive possibilities for engineering robust self-assembling molecular systems. Boronate esters are involved in two ways.

Two days ago, I sat down to write about this, but then I read further into the literature, and learned substantially more. Yesterday, another cycle of the same. There's entirely too much relevant information and progress. Maybe tomorrow.



Item 4 (01/20/10 02:56:41 UTC): Why fusion won't provide power

The greatest problem with fusion power is rarely mentioned and not on the research agenda. When I discussed it earlier, in "Fusion Power: A New Way to Boil Water", I hadn't seen this (quietly damning) report, which I think is worth quoting:

Issues and R&D needs
for commercial fusion energy

An interim report of the
ARIES technical working groups

July 2008

From the introduction:

The goal of this activity is to provide guidance to the fusion energy sciences community based on industry requirements…

Buried among the discussions of plasma physics, neutron fluxes, and a host of practical engineering concerns, there is a page that briefly notes the "Achilles' Heel" that makes the rest look like an academic exercise. There is no mention of the problem in the introduction or the conclusions:

From page 22:

Fusion fuel is cheap, but the capital costs are high. This may be the Achilles Heel of economic fusion power. The capital costs must be lowered by significant amounts — an order of magnitude of cost reduction would be highly desirable but probably not attainable. Traditional cost cutting efforts offer marginal improvements and will not be sufficiently effective. Innovative approaches that promise orders of magnitude cost reductions on major items must be aggressively pursued… [This will require] new fabrication and production technologies….

Emphasis added.

Translation: There is no known way to build a remotely economical fusion power plant, even if the fuel is free and the plasma physics works perfectly.

The report speaks of potential, unspecified, orders-of-magnitude reductions in fabrication cost, but what would other technologies look like if evaluated by the same rules?

Advances that would drop the cost of future fusion power machines into a range competitive with current photovoltaic devices are on a scale that would drop the cost of future photovoltaic devices to almost nothing.


As I showed before, here's the planned ITER reactor, including the high-vacuum chamber and its surrounding high-field superconducting magnets, together with the requisite particle accelerators, power systems, etc. Ordinary nuclear reactors are mostly plumbing; this is a fancy physics apparatus, more nearly comparable to the Large Hadron Collider.

For scale, note the person in the blue coat standing at the bottom:


The plasma physics problems are a fascinating distraction from the physics of advanced fabrication. (This would, admittedly, solve the cost problem.)



Item 5 (01/17/10 20:44:35 UTC): The importance of seeing what isn't there

The Edge Annual Question — 2010 asks "How is the Internet changing the way you think?", with answers by (to borrow from the Edge description) "an array of world-class scientists, artists, and creative thinkers" that includes technology analyst Nicholas Carr, social software guru Clay Shirky, science historian George Dyson, and Web 2.0 pioneer Tim O'Reilly, among many others (Richard Dawkins, Nassim Nicholas Taleb, Martin Rees, Sean Carroll…). The landscape of social cognition is changing, and the authors offer many views and maps.

In my answer I discuss how the Internet boosts the growth of human knowledge in a way that is powerful and yet — by nature — almost invisible: It helps us see what's missing:


THE WEB HELPS US SEE WHAT ISN'T THERE

As the Web becomes more comprehensive and searchable, it helps us see what's missing in the world. The emergence of more effective ways to detect the absence of a piece of knowledge is a subtle and slowly emerging contribution of the Web, yet important to the growth of human knowledge. I think we all use absence-detection when we try to squeeze information out of the Web. I think it's worth considering both how it works and how it could be made more reliable and user-friendly.

The contributions of absence-detection to the growth of shared knowledge are relatively subtle. Absences themselves are invisible, and when they are recognized (often tentatively), they usually operate indirectly, by influencing the thinking of people who create and evaluate knowledge. Nonetheless, the potential benefits of better absence-detection can be measured on the same scale as the most important questions of our time, because improved absence-detection could help societies blunder toward somewhat better decisions about those questions.

Absence-detection boosts the growth of shared human knowledge in at least three ways:

Development of knowledge: Generally, for shared knowledge to grow, someone must invest effort to develop a novel idea into something more substantial (resulting in a blog post, a doctoral dissertation, or whatever). A potential knowledge-creator may need some degree of confidence that the expected result doesn't already exist. Better absence-detection can help build that confidence — or drop it to zero and abort a costly duplication.

Validation of knowledge: For shared knowledge to grow, something that looks like knowledge must gain enough credibility to be treated as knowledge. Some knowledge is born with credibility, inherited from a credible source, yet new knowledge, supported by evidence, can be discredited by arguments backed by nothing but noise. A crucial form of evidence for a proposition is sometimes the absence of credible evidence against it.

Destruction of anti-knowledge: Shared knowledge can also grow through removal of anti-knowledge, for example, by discrediting false ideas that had displaced or discredited true ones. Mirroring validation, a crucial form of evidence against the credibility of a proposition is sometimes the absence of credible evidence for it.

Identifying what is absent by observation is inherently more difficult than identifying what is present, and conclusions about absences are usually substantially less certain. The very idea runs counter to the adage, being based on the principle that absence of evidence sometimes is evidence of absence. This can be obvious: What makes you think there's no elephant in your room? Of course, good intellectual housekeeping demands that reasoning of this sort be used with care. Perceptible evidence must be comprehensive enough that a particular absence, in a particular place, is significant: I'm not at all sure that there's no gnat in my room, and can't be entirely sure that there's no elephant in my neighbor's yard.
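The elephant/gnat contrast can be put as a one-line Bayes calculation: if evidence would almost certainly be visible when the thing is present, not seeing it strongly lowers the probability that it is there; if the evidence would usually be missed anyway, absence tells you little. The probabilities below are invented for illustration:

```python
# Bayes calculation for "absence of evidence as evidence of absence".
# All probabilities here are invented, illustrative numbers.

def posterior_given_no_evidence(prior, p_detect_if_present):
    """P(present | no evidence seen), ignoring false positives."""
    p_no_evidence = prior * (1 - p_detect_if_present) + (1 - prior)
    return prior * (1 - p_detect_if_present) / p_no_evidence

# "No elephant in my room": detection would be near-certain if present.
print(posterior_given_no_evidence(prior=0.01, p_detect_if_present=0.9999))  # ~1e-6

# "No gnat in my room": I would probably miss it even if present.
print(posterior_given_no_evidence(prior=0.01, p_detect_if_present=0.05))    # ~0.0095
```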

Reasonably reliable absence-detection through the Web requires both good search and dense information, and this is one reason why the Web becomes effective for the task only slowly, unevenly, and almost imperceptibly. Early on, an absence in the Web shows a gap in the Web; only later does an absence begin to suggest a gap in the world itself.

I think there's a better way to detect absences, one that bypasses ad hoc search by creating a public place where knowledge comes into focus:

We could benefit immensely from a medium that is as good at representing factual controversies as Wikipedia is at representing factual consensus.

What I mean by this is a social software system and community much like Wikipedia — perhaps an organic offshoot — that would operate to draw forth and present what is, roughly speaking, the best evidence on each side of a factual controversy. To function well, it would require a core community that shares many of the Wikipedia norms, but it would invite advocates to present a far-from-neutral point of view. In an effective system of this sort, competitive pressures would drive competent advocates to participate, and incentives and constraints inherent in the dynamics and structure of the medium would drive advocates to pit their best arguments head-to-head and point-by-point against the other side's best arguments. Ignoring or caricaturing opposing arguments simply wouldn't work, and unsupported arguments would become more recognizable.

Success in such an innovation would provide a single place to look for the best arguments that support a point in a debate, and with these, the best counter-arguments — a single place where the absence of a good argument would be good reason to think that none exists.
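One possible (entirely invented) data shape for such a medium: each factual question carries competing positions, each argument is paired point-by-point with rebuttals, and an empty rebuttal list is itself a visible, searchable absence:

```python
# Invented data model for a controversy-mapping medium of the kind
# described above; names, fields, and the example URL are illustrative.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Argument:
    summary: str
    evidence_links: List[str] = field(default_factory=list)
    rebuttals: List["Argument"] = field(default_factory=list)

@dataclass
class Controversy:
    question: str
    position_a: str
    position_b: str
    arguments_a: List[Argument] = field(default_factory=list)
    arguments_b: List[Argument] = field(default_factory=list)

    def unanswered(self):
        """Arguments on either side with no rebuttal at all --
        the visible absences the post argues are so valuable."""
        return [a for a in self.arguments_a + self.arguments_b if not a.rebuttals]

c = Controversy(
    question="Does X cause Y?",
    position_a="X causes Y",
    position_b="X does not cause Y",
    arguments_a=[Argument("Study Z found a correlation", ["http://example.org/z"])],
)
print(len(c.unanswered()))   # 1: the correlation argument has no rebuttal yet
```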

The most important debates could be expected to gain traction early. The science of climate change comes to mind, but there are many others. The benefits of more effective absence-detection could be immense and concrete.




Item 6 (01/13/10 08:37:46 UTC): Templates for atomically precise metal-oxide nanostructures

[Image: polyoxometalate nanostructure on the cover of Science; the center templates the ring. From "Unveiling the Transient Template in the Self-Assembly of a Molecular Oxide Nanowheel", H. N. Miras et al., Science 327:72–74 (2010).]

The cover of Science features atomically precise inorganic nanostructures, polyoxometalates (POMs), that form by means of atomically precise templates. The outer rings of these structures contain 150 molybdenum atoms.

POMs are a diverse class of nanoscale metal-oxide structures with characteristics that make them remarkably attractive as potential components for self-assembled composite nanosystems.

These characteristics include:

  • Atomically precise structures
  • Diverse sizes, shapes, properties, and functions
  • Good mechanical stiffness
  • Facile aqueous synthesis (see below)
  • Biomolecular compatibility

I discussed some of their properties and potential applications in an earlier post, "Polyoxometalate Nanostructures".

The latest paper from Lee Cronin's lab points to new strategies for making POMs. The authors first discuss the power of templating strategies in organic synthesis, then observe that

The discovery of a similar templating strategy for the reliable fabrication of 2- to 10-nm molecular nanoparticles would revolutionize the synthesis and applications of molecular materials in the same way that templated synthesis has revolutionized the field of organic macrocyclic synthesis over the past 40 years.
[...]
Our results illustrate how a bottom-up assembly process can be used to rapidly obtain gram quantities of a nanomaterial with well-defined size, shape, and composition.

By "well-defined size, shape, and composition", they mean atomically precise.

I'd like to see experiments that explore possibilities for synthesizing POMs on protein templates (there's been work on POM synthesis in protein cavities). I'd expect that screening combinations of proteins and POM-forming solutions would yield new structures, and perhaps show the way to rational engineering of POMs through rational engineering of proteins. This would open another bridge between biomolecular and inorganic nanotechnologies.



[Image: apparatus for polyoxometalate synthesis]

Synthesis:

The following solutions prepared in distilled and degassed H2O as follows: Solution A: 60 mL (3 M) solution of Na2MoO4·2H2O; Solution B: 60 mL (5M) HCl; Solution C: 60 mL (0.29 M) of Na2S2O4; Solution D: 250 mL (0.2 M) K2MoO4; Solution E: 250 mL (0.4 M) of HNO3. Solutions A, B and C reacted in the mixing chamber 1 using a flow rate of 4 mL / h. Solutions D and E reacted in the mixing chamber 2 using the same flow rate. The outputs of the mixing chamber 1 and 2 reacted in the mixing chamber 3 giving the desirable product in crystalline form within 24 h and under flow conditions in the collection tank.
[from the online supporting material]
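For a concrete sense of the mixing stoichiometry, the quoted concentrations and the 4 mL/h flow rate translate directly into molar delivery rates; this is just arithmetic on the numbers quoted above, not a substitute for the paper's supporting material:

```python
# Arithmetic on the quoted flow-synthesis protocol: each solution delivers
# (concentration x volumetric flow) moles per hour to its mixing chamber.

flow_ml_per_h = 4.0
solutions = {            # name: concentration in mol/L, from the quote
    "A (Na2MoO4)": 3.0,
    "B (HCl)":     5.0,
    "C (Na2S2O4)": 0.29,
    "D (K2MoO4)":  0.2,
    "E (HNO3)":    0.4,
}

for name, molar in solutions.items():
    mmol_per_h = molar * flow_ml_per_h   # (mol/L) * (mL/h) = mmol/h
    print(f"Solution {name:13s}: {mmol_per_h:5.2f} mmol/h")
```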



Item 7 (01/09/10 22:27:48 UTC): The Wall Street Journal on Feynman, Drexler, History, and the Future

The Wall Street Journal published an article yesterday, "Feynman and the Futurists", about Feynman's ideas, mine, how the nanotechnology bandwagon got rolling, and how the band got thrown off the wagon — and then, out of the shadows, the NRC report and why the U.S. government should implement the NRC's recommendations.

The author, Adam Keiper, is editor of The New Atlantis and a fellow at the Ethics and Public Policy Center. Toward the end of the article, he notes that the National Research Council has recommended initiating Federal research directed toward molecular manufacturing (the subject of my previous post) and laments that none of the federal nanotechnology R&D funding has gone toward "the basic exploratory experiments that the National Research Council called for in 2006". In closing, he says:

If Drexler's revolutionary vision of nanotechnology is feasible, we should pursue it for its potential for good, while mindful of the dangers it may pose to human beings and society. And if Drexler's ideas are fundamentally flawed, we should find out—and establish just how much room there is at the bottom after all.

Mr. Keiper wrote this in commemoration of the recent 50th anniversary of Feynman's talk, "There's Plenty of Room at the Bottom".

He's followed the ugly science-funding politics around advanced nanotechnology for many years now. What he says about this is on target, and he says more than I've been willing to say here.

In fact, the whole article is uncommonly accurate. Writers usually add several ladles of bilge-water to the soup, but in this article, my main wish would have been for more meat and spices:

  • More about the scientific basis for the concept of molecular manufacturing (in scientific publications, doctoral work), to balance the talk about implausible prospective wonders,
  • Mention of the enormous progress on the research agenda that I've advocated from 1981 forward (new fields of science, tens of thousands of papers), to correct the mistaken impression that no Federal R&D funding has gone toward "the kind of nanotechnology that Drexler proposed",
  • In connection with the science-funding politics that Mr. Keiper describes, it would be pertinent to mention the post-2000 redefinition of what "nanotechnology" is, and the reversal of position regarding what it can do; this is on the record in public statements* and official documents.

These sins of omission, though, are overshadowed by Mr. Keiper's service to the public in highlighting the National Research Council report and its findings.


Full disclosure: I usually call Mr. Keiper "Adam".

Here's a particularly clear example of double-talk:
  • In 1999, Person A testified to a Senate committee about the wonders of "what will be possible when we learn to build things at the ultimate level of control, one atom at a time….putting atoms where you want them to go."
  • In 2001, Person B told readers of Scientific American, that, contrary to Person A, Feynman, and me, this atom-level control was absurd: "To put every atom in its place — the vision articulated by some nanotechnologists — would require magic fingers….'There's plenty of room at the bottom' But there's not that much room."

Remarkably, Person A = Person B. Official documents redefine the scope and objectives of "nanotechnology" across the same time interval: before funding, it's all about atomically precise fabrication; once funded, it's not.



[Misc. revisions, 11 January]

Item 8 (01/07/10 23:00:59 UTC): Molecular Manufacturing: The NRC study and its recommendations

Part 6 of a series prompted by the recent 50th anniversary of Feynman's historic talk, "There's Plenty of Room at the Bottom". This is arguably the most important post of the series, or of this blog to date.

Topics:
— The most credible study of molecular manufacturing to date
— The study's recommendations for Federal research support
— The current state of progress toward implementation
— The critical problem: not science, but institutions and focus


A Matter of Size: Triennial Review of the National Nanotechnology Initiative
Committee to Review the National Nanotechnology Initiative, National Research Council
(full document [pdf])

A formal, Federal-level study has examined the physical principles of high-throughput atomically precise manufacturing (aka molecular manufacturing), assessing its feasibility and closing with a call for experimental research.

Surprisingly, this recommendation smacks of heresy in some circles, and the very idea of examining the subject met strong opposition.

The process in outline: Congress voted to direct the U.S. National Research Council, the working arm of the U.S. National Academies, to conduct, as part of the lengthy Triennial Review of the National Nanotechnology Initiative, what in the House version had been described as a "Study on molecular manufacturing…to determine the technical feasibility of the manufacture of materials and devices at the molecular scale". In response, the NRC convened a study committee that organized a workshop, examined the literature, deliberated, and reported its conclusions, recommending appropriate research directions for moving the field forward, including experimental research directed toward development of molecular manufacturing.

NRC studies are not haphazard processes, and the National Academies website describes its procedures in substantial detail. Because the NRC often advises the Federal government on politically charged questions, "Checks and balances are applied at every step in the study process to protect the integrity of the reports and to maintain public confidence in them." These include independent scientific review of reports that are themselves the product of independent experts assembled with attention to potential conflicts of interest.

It's worth taking a moment to compare the NRC to the three previous leading sources of information on molecular manufacturing: committed advocates, committed critics, and self-propagating mythologies. None of these is remotely comparable. Unless one has studied the topic closely and in technical detail, it seems reasonable to adopt the committee's conclusions as a rough-draft version of reality, and to proceed from there.

Here are some excerpts that I think deserve special emphasis, followed by the concluding paragraph of the report:



Technical Feasibility of Site-Specific Chemistry
for Large-Scale Manufacturing

The proposed manufacturing systems can be viewed as highly miniaturized, highly articulated versions of today's scanning probe systems, or perhaps as engineered ribosome-like systems…
[...]
…The technical arguments make use of accepted scientific knowledge but constitute a "theoretical analysis demonstrating the possibility of a class of as-yet unrealizable devices."22
[...]
Construction of extended structures with three-dimensional covalent bonding may be easy to conceive and might be readily accomplished, but only by using tools that do not yet exist.25 In other words, the tool structures and other components cannot yet be built, but they can be computationally modeled.
[ ... concluding paragraph:]
Although theoretical calculations can be made today, the eventually attainable range of chemical reaction cycles, error rates, speed of operation, and thermodynamic efficiencies of such bottom-up manufacturing systems cannot be reliably predicted at this time. Thus, the eventually attainable perfection and complexity of manufactured products, while they can be calculated in theory, cannot be predicted with confidence. Finally, the optimum research paths [to advanced systems] cannot be reliably predicted at this time. Research funding that is based on the ability of investigators to produce experimental demonstrations that link to abstract models and guide long-term vision is most appropriate to achieve this goal.


22. K.E. Drexler. 1992. Nanosystems, Molecular Machinery, Manufacturing and Computation. New York: Wiley & Sons.
[see page on my website]

25. M. Rieth and W. Schommers, eds. 2005. Handbook of Computational and Theoretical Nanotechnology. American Scientific Publishers.
[see chapter pdf]

Source pages, NRC report [pdf ]

My summary in a nutshell:

The committee examined the concept of advanced molecular manufacturing, and found that the analysis of its physical principles is based on accepted scientific knowledge, and that it addresses the major technical questions. However, in the committee's view, theoretical calculations are insufficient: Only experimental research can reliably answer the critical questions and move the technology toward implementation. Research in this direction deserves support.

I should note that the tone of the report is skeptical, emphasizing what the committee [correctly] sees as the unusual approach and the [resulting, methodologically inherent] incompleteness of the results. A quick skim could easily suggest a negative assessment. A closer reading, however, shows that the points raised are in the end presented, not as errors, nor even as specific, concrete weaknesses in the analysis, but instead as work not yet done, motivating the development of a research program directed toward validating and achieving the proposed technological objectives.


The call for research

The report closes with a call for research on pathways toward molecular manufacturing, quoted above, and an earlier section outlines some appropriate objectives:

To bring this field forward, meaningful connections are needed between the relevant scientific communities. Examples include:

  • Delineating desirable research directions not already being pursued by the biochemistry community;
  • Defining and focusing on some basic experimental steps that are critical to advancing long-term goals; and
  • Outlining some "proof-of-principle" studies that, if successful, would provide knowledge or engineering demonstrations of key principles or components with immediate value.

The response and progress

The technology roadmap

Research directions toward molecular manufacturing have been charted in the subsequent Technology Roadmap for Productive Nanosystems, the result of a project led by the Battelle Memorial Institute, the manager of research at U.S. National Laboratories that include Pacific Northwest, Oak Ridge, and Brookhaven. These labs hosted several Roadmap workshops and provided many of the participating scientists and engineers; I served as the lead technical consultant for the project.

The Roadmap is responsive to the NRC request above, and recommends research that includes work along the lines I describe below.

Molecular engineering methodologies

The crucial research objective is the development of systematic experimental and design methodologies that enable the fabrication of large, multicomponent, atomically precise nanostructures by means of self-assembly. This research direction fits the NRC committee's criteria: it is, by nature, strongly experimental, and in mimicking macromolecular structures and processes in biology, it holds promise for near-term biomedical applications.

Structural DNA nanotechnology

In the year the NRC report reached print, a Nature paper reported a breakthrough-level development, "DNA origami". This technology opened the door to systematic, atomically precise engineering on a scale of hundreds of nanometers and millions of atoms.

Since then, we've seen rapid progress in structural DNA nanotechnology. I discussed recent landmark achievements here and here.

Polypeptide foldamer nanotechnology

There's also been rapid progress in design methodologies for complex, atomically precise nanoscale structures made from polypeptide foldamers (aka proteins). In recent years, protein engineering has achieved a functional milestone: systematically engineering devices that perform controlled molecular transformations (see "Computational tools for designing and engineering biocatalysts").

Framework-directed assembly of composite systems

Looking forward, promising next steps involve integrating structural DNA frameworks with polypeptide foldamers, other foldamers, and other organic and inorganic materials. These classes of components have complementary properties (as discussed in my comments on "Modular Molecular Composite Nanosystems").

Here, too, progress has been extensive. For DNA-centered perspectives, see "DNA origami as a nanoscale template for protein assembly", "Assembling Materials with DNA as the Guide", and "DNA-templated nanofabrication". For a review of polypeptide-centered perspectives, see "Molecular biomimetics: nanotechnology and bionanotechnology using genetically engineered peptides".

Why these developments are important

As is now well recognized, "existing biological systems for protein fabrication could be harnessed to produce nanoscale molecular machines with designed functions" ("Computational protein design promises to revolutionize protein engineering"). Further, as biological systems demonstrate, programmable molecular machine systems can be harnessed to build programmable molecular machine systems.

As I've discussed, this capability could be exploited to pursue a spiral of improvement in materials, components, and molecular machine systems.

The path ahead

This spiral of development, in which molecular tools are used to construct more capable next-generation molecular tools, could be exploited to develop products with expanding applications, falling cost, and increasing value.

As I discussed in "Making vs. Modeling: A paradox of progress in nanotechnology", each generation of tools can be expected to enable fabrication processes and products that are more robust, more susceptible to computational simulation, and better suited to established systems engineering design methodologies. This indicates the potential for an accelerating pace of development toward a technology platform that can support the implementation of high-throughput atomically precise fabrication.

This path is being followed today, yet the level of support and organization, of mission and urgency, does not come close to matching its potential for solving long-term yet urgent problems.

Appropriate and inappropriate responses
to the NRC report on molecular manufacturing

The evaluation of the feasibility of molecular manufacturing and recommendations for research form the concluding section of the body of the NRC's Triennial Review of the National Nanotechnology Initiative. In the three years since the publication of the NRC report, I have seen no document from a Federal-level source that acknowledges these conclusions, and, of course, none that offers a substantive response.

This is of concern, because the NRC report calls for a sharp break with past thinking. To put it bluntly, much of the opinion in general circulation about molecular manufacturing (both pro and con) is rubbish because it is based on mythology, and not on the scientific literature. The NRC report can be criticized on several points, but it isn't rubbish.

Fulfilling the initial promise of nanotechnology

Atomically precise fabrication technologies exist today, and as I have noted, advanced atomically precise fabrication is the promise that initially defined the field of nanotechnology. I believe the record shows that advanced atomically precise fabrication is also the promise that got it funded.

Building on recent advances, strategically targeted research in atomically precise fabrication could draw on and contribute to fields across the spectrum of modern nanotechnologies, from materials to devices, and could bring them together to elevate the technology platform for further advances. Ultimately, as the NRC report suggests, those advances could potentially deliver what was promised at the inception of the field.

Make no mistake: the path to high-throughput atomically precise manufacturing will not be short, and it will not be direct. It will be a multi-stage development process, and as I have discussed, the early steps differ greatly from the ultimate results in both their form and their potential applications.

Growing urgency

Today, the potential promise of high-throughput atomically precise manufacturing must be regarded as credible. As a consequence of its inherent productive capacity, it offers a credible potential solution to problems of energy production and climate change. The National Research Council of the U.S. National Academies of Science, Engineering, and Medicine has called for the support of research explicitly directed toward the development of this technology. This has become urgent.

The strength and limitations of current research support

It is both laudable and problematic that the research I've reported above is chiefly funded by programs in biology and medicine. This support has enabled great progress, and I know from long discussion that researchers in these areas have ambitious visions for the future. There are, however, limits to what can be achieved while developing molecular engineering within the framework of biotechnology, much as there would have been if aeronautical engineering research had been developed as a field of ornithology.

The critical need today is not for new scientific results, but for an integrative approach to molecular systems engineering, directed toward strategic technology objectives. The science is ready. The institutions are not.



A word to readers:

The implications of the NRC report call for reconsidering views that have shaped policy in the research disciplines critical to progress toward molecular manufacturing, yet like many other NRC reports, it is virtually unknown. Directing other readers to what I have written here could help to remedy this problem.

(And a further note to readers who are bursting with frustration: Please don't. It is counterproductive, and generates far more heat than light.)


Note: I say in the first paragraph that Congress voted for "…what in the House version had been described as a 'Study on molecular manufacturing…to determine the technical feasibility of the manufacture of materials and devices at the molecular scale'" to reflect an oddity of the legislative history behind the study: After the House transmitted the bill to the Senate, a nanotechnology business association successfully lobbied to replace "molecular manufacturing", thereby calling for a (puzzling) "Study on molecular self-assembly". An uproar followed. In the end, the NRC did a study of molecular self-assembly, as directed in the final bill, but also responded to the request by the House for a study of molecular manufacturing. In the end, molecular manufacturing dominated the agenda of the workshop. [I corrected the main text and this description after reviewing the GPO documents, several hours after the initial posting.]

In a later section, I note that "I have seen no document from a Federal-level source that acknowledges these conclusions". There is, in fact, a document that quotes from the conclusions, but the quoted material is edited in a way that wrongly indicates that the recommendations regarding molecular manufacturing are, instead, recommendations regarding molecular self-assembly (see "The National Nanotechnology Initiative: Second Assessment and Recommendations of the National Nanotechnology Advisory Panel", p.43).


[Dec 8: Updated to add the paragraph beginning "I should note that the tone of the report is skeptical..." I would expect this tone to strongly influence the impression left on casual readers, blunting the impact of what, in substance, amounts to a sharp rebuke to the conventional wisdom.]
An open comment thread for this post can be found here.

Fwd: Metamodern Ribo-Q1: Genetic manufacturing expanded



---------- Forwarded message ----------
From: Newsfeed to Email Gateway <emlynoregan@gmail.com>
Date: Mon, Mar 1, 2010 at 4:30 AM
Subject: Metamodern (1 new item)
To: technologiclee@gmail.com


Metamodern (1 new item)

Item 1 (03/01/10 06:17:59 UTC): Ribo-Q1: Genetic manufacturing expanded

[Image: unnatural amino acids compatible with ribosomes (circled: azide, alkyne, and biotin derivative). From Neumann et al., 2010, and Dougherty, 2000.]

All ribosomes read genetic data as three-letter words that encode 20 standard amino acids (give or take a few anomalies). This is equally true of the ribosomes in deep-sea bacteria living at 120°C, and the ones in your thumb. This universal code has been a wall that bounds the scope of biosynthetic polypeptide engineering — until now.

Recent developments have cracked the wall by tweaking the code, but Jason Chin's group in the UK has blasted a wide hole by expanding the address space.

From the abstract of a paper soon to be published in Nature:

[E]very triplet codon in the universal genetic code is used in encoding the synthesis of the proteome….Here we synthetically evolve an orthogonal ribosome (ribo-Q1) that efficiently decodes a series of quadruplet codons…. By creating mutually orthogonal aminoacyl-tRNA synthetase–tRNA pairs and combining them with ribo-Q1 we direct the incorporation of distinct unnatural amino acids…. it will be possible to encode more than 200 unnatural amino acid combinations using this approach.


H Neumann et al., "Encoding multiple unnatural amino acids via evolution of a quadruplet-decoding ribosome", Nature (early online publication).
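The "address space" arithmetic behind this is simple: four bases read in triplets give 64 codons, while quadruplet codons read by an orthogonal ribosome give 256. The pairing count at the end is one illustrative reading of the "more than 200 combinations" figure, not a number taken from the paper:

```python
# Codon address-space arithmetic for the passage above. The "20+" figure
# used in the pairing estimate is an illustrative assumption.

from math import comb

bases = 4
triplet_codons = bases ** 3     # 64, encoding the ~20 standard amino acids
quadruplet_codons = bases ** 4  # 256, readable by an orthogonal ribosome
print(triplet_codons, quadruplet_codons)

# With (say) 20+ distinct unnatural amino acids assignable to quadruplet
# codons, the number of ways to use two different ones together in one
# protein already exceeds 200.
print(comb(21, 2))   # 210
```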

I've selected some examples (image above) to illustrate the scope of these methods. Each of these amino acids (some highly unnatural) has already been used as a building block in ribosomal polypeptide synthesis. Together, they provide a glimpse of the vast new world now opening to molecular engineers. Polypeptides (of the sort usually called "proteins") are already a family of versatile, high-performance engineering polymers, and an expanded set of building blocks can be exploited to increase thermodynamic stability, extend useful functionality, facilitate self assembly, and enable more systematic design.

Realizing this potential for expanding the scope of protein engineering will require extensive development of new tools, including new aminoacyl-tRNA synthetase–tRNA pairs. Because these are themselves proteins, there will be increasing opportunities for bootstrapping, using the new tools to facilitate development of those that follow. For example, could task-specific side chains (perhaps PNA oligomers) facilitate the development of new aminoacyl-tRNA synthetases?


By the way, even the amide bond in the backbone isn't sacred: ribosomes happily make esters, too. Unlike enzyme-like, substrate-specific catalysts, ribosomes are machines for positioning reactants bound to handles. Their substantial generality is characteristic of handle-based mechanosynthetic catalysis.