Saturday, November 19, 2016

Stanford CSP. Business 152. Innovation Timing. Session 1, Quiz 2.

On November 14, 2016, the New York Times published a story about Spark Capital, an 11-year-old technology venture firm.

The article opened with a description of one of Spark's recent deals:
Betting on an automated driving start-up in 2015 may not have been the most intuitive gamble at a time when Google and Uber had already declared that self-driving vehicles were among their top research priorities. 
But in the fall of 2015, Spark Capital was one of a few established venture capital firms to wade into the industry, helping lead a $12.5 million investment in Cruise Automation, a start-up based in San Francisco whose software helps cars pilot themselves. One of Spark’s partners became the only outside board member of the firm. 
It was a bet that paid off quickly: Within six months, Cruise sold itself to General Motors for about $1 billion.

1. In your opinion, why did the timing of the deal turn out to be so good? Was it pure luck? If you were the analyst who "discovered" Cruise Automation back in 2015, how would you justify the $12.5M investment to your VC partners?
2. (optional) Consider the generic innovation diffusion S-curve, as described in Everett Rogers' "Diffusion of Innovations." What stage of the curve is self-driving car technology at now? Why?
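For readers who want a concrete picture of the S-curve before answering question 2, here is a minimal sketch (my own illustration, not part of the quiz): cumulative adoption in Rogers' model is commonly approximated by a logistic function, with steepness and midpoint as free, illustrative parameters.

```python
import math

def adoption(t, k=1.0, t_mid=0.0):
    """Logistic S-curve: fraction of the market that has adopted by time t.

    k     -- steepness of the curve (illustrative value)
    t_mid -- inflection point, where cumulative adoption crosses 50%
    """
    return 1.0 / (1.0 + math.exp(-k * (t - t_mid)))

# Rogers' adopter categories sit at different points on this curve:
# innovators and early adopters on the slow left tail, the early and late
# majority around the steep middle, laggards on the flattening right tail.
for t in range(-6, 7, 2):
    print(t, round(adoption(t), 3))
```

The quiz question amounts to asking where on this curve self-driving technology sits today: still on the slow left tail, or approaching the steep middle.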

tags: course, stanford, innovation, s-curve

Thursday, November 17, 2016

From junk food to junk news

Remarkably, when surrounded by an abundance of choices, people choose what feels good, not what is good. With junk food, it's a combination of fat, sugar, and salt that fools taste buds into craving more. With junk news, it's confirmation bias that fools brains into craving more news that conforms to their world view.

According to Buzzfeed, during the 2016 election cycle fake news outperformed real news.

Due to the difference in feedback mechanisms, the situation with junk news is worse than with junk food. After eating a junk food diet for an extended period of time, people can at least step on a scale and discover that their weight has gone up. By contrast, after a junk news brain diet, people only grow stronger in their opinions, because their social network keeps rewarding them for consuming and sharing the junk.

Can we solve the problem without resorting to censorship? One way to look at it is to borrow a point of view widely adopted in another domain: money and finance. Today's counterfeit money is easily detected and discarded, so society doesn't fall into the trap of Gresham's Law. Similarly, fake news can be detected by a variety of technologies, including a Bitcoin-like approach that verifies the authenticity of news items and news sources. Fake news, like fake coins, should be taken out of circulation. Otherwise, our brains get stupid consuming junk news, just like our bodies get fat consuming junk food.
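To make the "fake coin detection" analogy concrete, here is a toy Python sketch. All names and keys are hypothetical, and a real system would use public-key signatures and a shared ledger rather than a shared secret; the point is only that authenticity of a news item can be checked mechanically, the way a coin can be weighed.

```python
import hashlib
import hmac

def sign_article(key: bytes, article: str) -> str:
    """Publisher 'mints' an article by attaching an authentication tag."""
    return hmac.new(key, article.encode(), hashlib.sha256).hexdigest()

def verify_article(key: bytes, article: str, signature: str) -> bool:
    """Any holder of the key can check that the copy wasn't altered."""
    expected = sign_article(key, article)
    return hmac.compare_digest(expected, signature)

key = b"publisher-secret"           # hypothetical publisher key
article = "Candidate X wins county Y by 2 points."
sig = sign_article(key, article)

print(verify_article(key, article, sig))                # authentic copy
print(verify_article(key, article + " (edited)", sig))  # tampered copy
```

A tampered copy fails verification, so a platform could refuse to circulate it, without a human censor ever judging its content.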

Lunch Talk: Counterintuitive approach to building startups (Stanford University)

This is Lecture 3 from a Stanford University course "How to start a startup". The speaker is Paul Graham; his transcript is here:

tags: startup, stanford, entrepreneurship, innovation, lunchtalk

Wednesday, November 16, 2016

Stanford CSP. Business 152. Innovation Timing. Session 1, Quiz 1.


Timing is critical for innovation success. Sometimes companies introduce new products and services too late. For example, neither Google+ nor Microsoft's smartphones succeeded, despite their respective companies putting major resources behind them, both in money and in development effort.

At the other extreme, some innovations fail because they appear too early. The social networks Friendster and Livejournal started gaining traction in the early 2000s, but never reached the scale of Facebook. Similarly, WebTV launched in 1995 with the intent to deliver a broad range of content to users over the Internet. In the almost two decades that followed, the company burned through hundreds of millions of dollars and went through a major acquisition, but failed in the very same marketplace where Netflix and other video streaming services managed to succeed a few years later.

Finally, certain products and services had perfect timing. For example, Gmail and Youtube spread like a California wildfire. The Apple iPhone succeeded where the Apple Newton failed.

In preparation for the course, please answer the following questions:

1. List 2-3 novel products or services in each of the timing categories:
a) too late;
b) too early;
c) just perfect.

2. (Optional). Pick one example from the list and explain your reasoning with regard to innovation timing. Mention at least 3 factors that played a role in the success or failure of the innovation.

Wednesday, July 20, 2016

Stanford CSP BUS 74 [Principles of Invention and Innovation], Session 4 Quiz 1

On July 19, 2016, Bloomberg Technology News reported that Google used its DeepMind AI technology to reduce power consumption in the company's data centers:
In recent months, the Alphabet Inc. unit put a DeepMind AI system to work reducing power consumption by manipulating computer servers and related equipment like cooling systems. It uses a technique similar to the DeepMind software that taught itself to play Atari video games, Hassabis said in an interview at a recent AI conference in New York.

The system cut power usage in the data centers by several percentage points, "which is a huge saving in terms of cost but, also, great for the environment," he said.

The savings translate into a 15 percent improvement in power usage efficiency, or PUE, Google said in a statement. PUE measures how much electricity Google uses for its computers, versus the supporting infrastructure like cooling systems.
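For reference before the questions: PUE is simply total facility energy divided by the energy consumed by the IT equipment itself, with 1.0 as the unreachable ideal. A minimal sketch with illustrative numbers of my own (not Google's actual figures):

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total energy entering the data center
    divided by energy consumed by the IT equipment itself.
    Everything above 1.0 is overhead: cooling, power distribution, lighting.
    """
    return total_facility_kwh / it_equipment_kwh

# Illustrative only: a facility drawing 1.40 kWh in total per 1.00 kWh of
# compute has PUE 1.40, i.e. 0.40 kWh of overhead per unit of useful work.
before = pue(1.40, 1.00)

# One reading of a "15 percent improvement": trim the overhead portion by
# 15%, leaving 0.40 * 0.85 = 0.34 kWh of overhead per unit of compute.
after = pue(1.00 + 0.40 * 0.85, 1.00)

print(round(before, 2), round(after, 2))
```

At data-center scale, shaving a few hundredths off PUE translates into megawatt-hours, which is why an AI-driven controller for cooling is worth the effort.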

Question 1. Using the system model, name the functional element that DeepMind technology helps to improve directly.
Question 2. Based on what you know about the improved element, describe other functional elements within the same system.

A new way to map brains

Neuroscientists at Washington University Medical School created a method to build maps for individual brains:

(MIT Tech Review) Researcher Matthew Glasser says that unlike many previous studies, this map considers several features of the brain simultaneously to mark its boundaries. Some neuroscientists still define brain regions based on a historical map called Brodmann’s areas that was published in 1909. That map divided each half of the brain into 52 regions. Each hemisphere on the new map has 180 regions.

Glasser defined these regions by looking for places where multiple traits—such as the thickness of the cortex, its function, or its connectivity to other regions—were changing together. After drawing the map onto one set of brains, the researchers developed an algorithm to recognize the regions in a new set of brains where the size and boundaries vary from person to person. “It’s not just a map that people can make reference to,” Glasser says. “You can actually find the areas in the individuals that somebody is studying.”

From an innovation perspective, mapping methods create opportunities to systematically explore and coordinate knowledge about a broad class of objects. This particular approach enables scientists and engineers to move back and forth from generalized information about the human brain to specific aspects of a particular brain. For example, we might be able to understand why 3D VR can replace painkillers in some medical applications.

Wednesday, July 13, 2016

Stanford CSP, BUS 74. Session 3, Quiz 1.

In a recent NYT article titled "Ads Evolve Into New Forms as Media Landscape Shifts", Sydney Ember mentioned an emerging trend in the advertisement industry:
Consumption habits have become increasingly fragmented, with more people watching programming, including television shows and live sports, on different online platforms. As a result, traditional television, with its 30-second commercials, is losing its commanding share of advertising dollars. Digital media is expected to pass TV as the biggest advertising category in the United States this year, with roughly $68 billion in ad sales compared with $66 billion for TV, according to the Interpublic Group’s Magna Global.

With online ad spending growing, finding ways to stand out among the onslaught of other online ads has become more important for advertisers. And therein lies a possible conundrum: Advertisers want their ads to look less like ads even as they are fighting harder for attention.

Question 1.
Based on our brief class discussion (see slide 33 Lecture Notes from July 11, 2016) and an earlier post on this blog, use the 10X Change diagram to map ad-related business models mentioned in class. Briefly explain parameters for each model.
(a ppt version of the 10X Change diagram is available for download here).

Question 2 (bonus).
What major technology developments enabled key ("disruptive") business model transitions?

Question 3 (bonus). Use the 10X Change diagram to map potential ad-related business models that are now available with augmented reality games like Pokémon Go. What technologies (existing or new) can further improve such models?

Thursday, June 30, 2016

Lunch Talk: Moral Tribes (Joshua Greene gives a talk at Google)

Note how his experiments show the relationship between physical distance and psychological distance. A similar effect happens when inventors try to explain their ideas to investors. I also like his analogy between Kahneman's System 1 vs System 2 on the one hand, and point-and-shoot and SLR cameras on the other: the former is set on automatic, the latter on manual.

Tuesday, June 28, 2016

Stanford CSP 74 Principles of Invention and Innovation (BUS 74). Session 2 Quiz 1

In a recent MIT Technology Review article, Antonio Regalado describes a new genetic engineering approach that promises to eliminate malaria:
Malaria kills half a million people each year, mostly children in tropical Africa. The price tag for eradicating the disease is estimated at more than $100 billion over 15 years. To do it, you’d need bed nets for everyone, tens of thousands of crates of antimalaria drugs, and millions of gallons of insecticides.
A gene drive is an artificial “selfish” gene capable of forcing itself into 99 percent of an organism’s offspring instead of the usual half. And because this particular gene causes female mosquitoes to become sterile, within about 11 generations—or in about one year—its spread would doom any population of mosquitoes. If released into the field, the technology could bring about the extinction of malaria mosquitoes and, possibly, cease transmission of the disease.
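To see why roughly 11 generations is plausible, here is a deliberately crude, deterministic toy model (my own back-of-the-envelope sketch, not from the article): assume random mating, 99 percent transmission from a carrier parent, and ignore fitness costs, resistance alleles, and population structure.

```python
def next_carrier_fraction(p: float, drive: float = 0.99) -> float:
    """Toy model: fraction of next-generation offspring carrying the drive.

    An offspring can inherit the drive only if at least one parent carries
    it -- probability 1 - (1 - p)**2 under random mating -- and a carrier
    parent transmits it with probability `drive` instead of the Mendelian
    one half. This only illustrates the compounding, nothing more.
    """
    return drive * (1.0 - (1.0 - p) ** 2)

p = 0.01  # release carriers into 1% of the mosquito population
for generation in range(11):
    p = next_carrier_fraction(p)

# After ~11 generations the drive is near its ceiling of roughly 99%.
print(round(p, 3))
```

The carrier fraction roughly doubles each early generation (0.99 transmission vs. 0.5 means nearly every mixed mating produces carriers), which is why the spread saturates in about a year of mosquito generations.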

Question 1: Using the "Divergent-Exploratory-Convergent" thinking technique,
a) list lots of benefits and problems that the new approach creates;
b) create explicit criteria for selecting the top benefits and problems;
c) according to your criteria, what are the most important short- and long-term benefits/problems (at least one of each)?

Question 2 (optional): What dilemma did the researchers solve, while trying to create their genetically modified mosquito?

Question 3 (optional): What's the difference between system levels that the existing and the new malaria solutions target?

Thursday, June 23, 2016

Stanford CSP 74 Principles of Invention and Innovation (BUS 74). Session 1 Quiz 1

Research shows that online privacy remains a controversial topic. For example, a review article in Science magazine states*:

If this is the age of information, then privacy is the issue of our times. Activities that were once private or shared with the few now leave trails of data that expose our interests, traits, beliefs, and intentions.
Both firms and individuals can benefit from the sharing of once hidden data and from the application of increasingly sophisticated analytics to larger and more interconnected databases (3). So too can society as a whole—for instance, when electronic medical records are combined to observe novel drug interactions (4). On the other hand, the potential for personal data to be abused—for economic and social discrimination, hidden influence and manipulation, coercion, or censorship—is alarming. The erosion of privacy can threaten our autonomy, not merely as consumers but as citizens (5). Sharing more personal data does not necessarily always translate into more progress, efficiency, or equality (6).

Question: How would an IDEAL privacy system change the situation?

*Science 30 Jan 2015:
Vol. 347, Issue 6221, pp. 509-514
DOI: 10.1126/science.aaa1465

Direct link to the article (pdf) on

Tuesday, May 10, 2016

Facebook patents recommendations from contact lists

The USPTO awarded Facebook US Patent 9,338,250, titled "Associating received contact information with user profiles stored by a social networking system" (inventors: Michael Hudack, Christopher Turitzin, Edward Baker, Hao Xu). The patent covers the now-standard feature in many social networks, both consumer and professional, where the system finds potential connections in your imported contact list and recommends adding a person who is currently not in your network.

From an innovation methodology perspective, the invention solves a typical problem that arises when users need to be migrated from an old technology space into a new one. In the System model, an effective solution improves scalability by dramatically reducing the costs of adding Sources and Tools during the synthesis phase.
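Here is a hypothetical sketch of the matching idea (function and field names are mine, not from the patent text): normalize identifiers from the imported address book, look them up against registered profiles, and suggest anyone not already connected.

```python
def normalize(identifier: str) -> str:
    """Crude normalization: lowercase emails, strip phone punctuation."""
    s = identifier.strip().lower()
    if "@" in s:
        return s
    return "".join(ch for ch in s if ch.isdigit())

def suggest_connections(imported_contacts, profiles_by_id, current_friends):
    """profiles_by_id maps a normalized email/phone to a user id;
    return the ids of matched users the importer isn't connected to yet."""
    suggestions = []
    for contact in imported_contacts:
        user_id = profiles_by_id.get(normalize(contact))
        if user_id is not None and user_id not in current_friends:
            suggestions.append(user_id)
    return suggestions

profiles = {"ada@example.com": "u1", "14155550123": "u2"}
print(suggest_connections(
    ["Ada@Example.com", "+1 (415) 555-0123", "nobody@example.com"],
    profiles,
    current_friends={"u1"},
))  # only u2: u1 is already a friend, the last contact has no profile
```

The normalization step is where the scalability claim lives: once identifiers are canonical, each lookup is a cheap dictionary hit, so onboarding a user with thousands of contacts costs almost nothing.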

tags: facebook, innovation, invention, patent, social, networking, synthesis

Monday, May 09, 2016

Trade-off of the Day: Warmth vs Competence

In Scalable Innovation, we show how breaking, instead of making, trade-offs allows innovators to create breakthrough technology and business solutions. It turns out that successful solutions to trade-offs in human psychology can also be beneficial in one's personal or professional life.

For example, here's how people typically perceive others in two psychologically important dimensions - Warmth and Competence*:

Figure 1 Each quadrant represents a unique combination of warmth and competence. The Partner, combining warmth and competence, inspires admiration. Its opposite, the Parasite, inspires contempt or disgust. The Predator and Pet inspire ambivalent feelings: the cold and competent Predator breeds resentment, while the warm and incompetent Pet inspires pity.

As you can see from the diagram, the ideal situation puts one into the upper right corner labeled "Partner," which combines high Warmth with high Competence. But research shows that in real life, people typically judge others on just one dimension and infer the other through an implicit trade-off:

Theoretically, warmth and competence judgments vary independently, but in practice they are often negatively correlated, so that groups are stereotyped ambivalently as warm but incompetent, or competent but cold — an effect termed social compensation. For example, older people are perceived as warm but incompetent, and regarded with pity, whereas rich people are perceived as competent but cold, and regarded with envy. 
These ambivalent stereotypes are so ingrained that accentuating only one positive dimension about a person actually implies negativity on the omitted dimension — a secret language of stereotypes perpetuated by communicators and listeners. Indeed, the tendency to focus on the positive dimension of an ambivalent stereotype while implying the negative dimension has increased as social norms against expressing prejudice have developed.**

As we can see, even being perceived in a positive light can lead to negative personal and professional consequences. Therefore, instead of succumbing to the trade-off, a psychologically aware problem-solver would have to use one of the separation techniques to break the trade-off and demonstrate both warmth and competence.

I think I'll turn this real-life problem into a quiz for one of Stanford CSP invention/innovation courses.

* source: The Middleman Economy, by Marina Krakovsky
** source: doi:10.1016/j.jesp.2016.01.004. Promote up, ingratiate down: Status comparisons drive warmth-competence tradeoffs in impression management. Swencionis & Fiske, 2016.

Saturday, February 06, 2016

Stanford CSP Scalable Innovation (BUS 134) Session 3, Quiz 1

Autonomous vehicles (formerly known as self-driving cars) can drive safely at high speeds and maintain short distances between cars, reducing road congestion. Furthermore, electric autonomous vehicles can accelerate and maintain high speeds without dramatically increasing pollution.

On the other hand, human drivers are required to drive under the speed limit and maintain a certain, relatively large, distance between cars, e.g. the Two-Second Rule. Arguably, the introduction of modern braking technologies doesn't reduce the rate of accidents significantly.*

As a result, large-scale deployment of autonomous vehicles creates a situation that involves multiple trade-offs.


1. List trade-offs relevant to the situation (use divergent thinking). Select one (use convergent thinking) that you anticipate to become the most important in the future. What selection criteria did you apply?
2. Propose solutions that can break the trade-off: realistic, futuristic, fantastic, etc.
3. (Bonus 1 - optional). What technology and business opportunities can you create by breaking the trade-off?
4. (Bonus 2 - optional) Using analogical thinking, what solutions from the history of the automobile can you re-use to solve the current situation?

* See, for example, Foolproof: How Safety Can Be Dangerous and How Danger Makes Us Safe, by Greg Ip, 2015.

Thursday, January 28, 2016

Lunch Talk: In 2003 Elon Musk gave a talk at Stanford about PayPal and SpaceX

From the Youtube blurb:

"Elon Musk, co-founder, CEO, and chairman of PayPal, shares his background: He was accepted into Stanford but deferred his admission to start an internet company in 1995. His company was zip2 which helped the media industry convert their content to electronic medium. Then, he sold the company for over $300 million and never came back to Stanford."

tags: youtube, lunchtalk, innovation, media, space

Scalability: from Neanderthals to Twitter

A quote from "Sapiens: A Brief History of Humankind",

Twitter is having trouble competing for users against Facebook and Youtube because it has failed to scale human relationships beyond the threshold of 150 individuals. That is, the social networking niche of "fewer than 150" is already occupied by Facebook, and for Twitter to become successful, the company has to make it easy for each user to organize and curate information dynamically from thousands of people who are not in their immediate network. Moreover, since connections and information on Twitter are more (10X!) dynamic than on Facebook, the organization of information streams has to be at least 10X more sophisticated as well.

Youtube has met its content scalability challenge by enabling users to create and share playlists, channels, and subscriptions. Every user on Youtube is a developer who produces new ways to access content at the collection or stream level, rather than at the single-video level. In Scalable Innovation we call it scaling at the "aboutness" layer. So far, Twitter can't find a way to enable its users to become developers. All they can do is propagate gossip, which worsens the information overload problem for everybody who gets over the "150 individuals" threshold.

To summarize, Twitter needs to find a way to help people become better Information Sapiens because the Information Neanderthal niche is already occupied by Facebook and Youtube.

tags: scale, innovation, control, aboutness, twitter, social

Wednesday, January 27, 2016

Stanford CSP. Scalable Innovation (BUS 134) Session 2 Quiz 1

Go is a board game invented 2,500 years ago in China. According to a recent MIT Technology Review (MTR) article, "Mastering Go ... requires endless practice, as well as a finely tuned knack of recognizing subtle patterns in the arrangement of the pieces spread across the board."

Experts have long considered Go one of the most complex and intuitive human games ever created, much more complex than, e.g., chess or poker. Nevertheless, Google AI researchers have developed software that "beat the European Go champion, Fan Hui, five games to zero. And this March it will take on one of the world’s best players, Lee Sedol, in a tournament to be held in Seoul, South Korea."

Read the MTR article mentioned above and consider/answer the following questions:

1. Does Alpha Go represent a major technology innovation? Explain your reasoning.

2. If combining two or more deep learning networks, as described in the article, is the wave of the future, what industries, new or existing, would benefit from the technology the most? Why?

3. Using the System Model (Scalable Innovation, Part I), hypothesize what system elements and interfaces still need to be invented to complement or take advantage of Alpha Go-like software.

tags: innovation, course, stanford, quiz

Monday, January 18, 2016

Pragmatic creativity among Chimps, Orangutans, and Bonobos

Unlike us humans, who are still confused about what "healthy" food is, many primates know that "healthy" means developing a habit that separates nutritious food from harmful food. For example, chimpanzees, orangutans, and bonobos know how to turn dirty apples into clean ones, while gorillas don't.

Chimps, Orangutans, and Bonobos are pragmatically creative because they've developed a consistent process for dramatically improving health outcomes of a recurring situation.

Source: Matthias Allritz, Claudio Tennie, Josep Call. Food washing and placer mining in captive great apes, 2012. DOI 10.1007/s10329-013-0355-5.

Friday, January 15, 2016

Lunch Talk: (Authors at Google) How New Ideas Emerge

Matt Ridley’s brilliant and ambitious new book in which he explores his considered belief that evolution—in biology, business, technology, and nearly every area of human culture—trumps deliberate and intelligent design.

tags: lunchtalk, creativity, innovation, evolution, scale

Thursday, January 14, 2016

Stanford CSP, Scalable Innovation (BUS 134) - Session 1, Quiz 1

1. Identify at least three trends mentioned in this Bitcoin-related NYT article: A Bitcoin Believer's Crisis of Faith

2. Trends: headwinds and tailwinds
2.1. Name at least one trend that significantly increases Bitcoin's chances for success.
2.2. Name at least one trend that significantly decreases Bitcoin's chances for success.

3. Name major technology innovations that power trends positive for Bitcoin.

Examples of trend categories:

- Business
- Technology
- Science
- Finance
- Demographics
- Social
- Market
- Regulatory

tags: stanford, bus134, quiz, innovation, trends, bitcoin

Tuesday, January 12, 2016

Lunch Talk: (Authors at Google) Learning how to learn math and science

In A Mind for Numbers, Dr. Oakley lets us in on the secrets to effectively learning math and science—secrets that even dedicated and successful students wish they’d known earlier. Contrary to popular belief, math requires creative, as well as analytical, thinking. Most people think that there’s only one way to do a problem, when in actuality, there are often a number of different solutions—you just need the creativity to see them. For example, there are more than three hundred different known proofs of the Pythagorean Theorem. In short, studying a problem in a laser-focused way until you reach a solution is not an effective way to learn math. Rather, it involves taking the time to step away from a problem and allow the more relaxed and creative part of the brain to take over.

The creative aspect of learning math and science is somewhat similar to elements of creativity necessary for developing user scenarios in hard-core technology solutions.

Sunday, January 10, 2016

The paradox of "healthy food"

The "Lunch Talk" video I posted earlier today dispels a popular misconception that healthy food is expensive. The healthy food confusion is a version of a common human perception that expensive things or experiences are inherently better than inexpensive ones. For example, in experiments with differently labeled wines, people report the "expensive" ones as being of higher quality. In experiments with painkillers, people report that large, colorful, "expensive" pills work better than plain pills. The trade-off between quality and price seems fundamental to our understanding of how things work in the world.

Remarkably, there's nothing fundamental in either nature or technology that dictates good stuff should cost more than bad stuff. Moreover, major business breakthroughs happen when inventors deliver high-quality products and services at dramatically lower prices. For example, Henry Ford created a technology revolution when he introduced the Model T and the assembly line to manufacture the most reliable and most affordable automobile of its era. Before him, people believed that reliable automobiles had to be expensive. Similarly, Amazon introduced a business model in which a company can inexpensively provide a great shopping experience with lots of choices, knowledgeable explanations, quality ratings, and fast, convenient delivery. Before Amazon, retailers believed that a high-quality shopper experience was only possible in high-end stores managed by highly compensated staff. They were proven wrong, with dire consequences for their shareholders.

Today, businesses like Whole Foods and Sprouts are built on the assumption that healthy food must be expensive. Leanne Brown's book shows that this trade-off can be broken. As a result, we might see a revolution in many health-related areas, from retail food outlets to obesity prevention apps to government welfare services.

tags: health, trade-off, quality, innovation

Lunch Talk: (Authors at Google - Leanne Brown) Eat Well on $4/Day

Good and Cheap is an NYT-bestselling cookbook [by Leanne Brown] for people with very tight budgets, particularly those on SNAP/Food Stamp benefits. The free PDF has been downloaded more than 800,000 times, and a Kickstarter campaign for an initial print run brought in over $144,000 (it remains the #1 cookbook ever on Kickstarter).

tags: lunchtalk, health, culture

Thursday, January 07, 2016

Will Samsung write you a prescription and deliver your medicine?

At CES 2016 Samsung showed a number of wellness-related products, including the WELT:
The WELT communicates with your phone to tell you how many steps you've taken, how long you've been sitting, eating habits and your waistline size. It then sends the data to a specially-designed app for analysis, to tell you things like -- if you keep eating like you did today, you're going to gain 2 pounds this month. Samsung expects the WELT to go on sale this year.
If the product becomes a commercial success, it's easy to imagine how much historical data the company is going to collect across a broad range of demographic categories. Even if this particular product flops in the market, similar ones, e.g. made by FitBit or Apple, will emerge over time. The key difference between Samsung and others is that Samsung is now getting into pharmaceuticals. Here's a quote from a 2014 Bloomberg article:
South Korea’s biggest company is investing at least $2 billion in biopharmaceuticals, including the growing segment of biosimilars, which are cheaper versions of brand-name biotechnology drugs that have lost patent protection.

“We are in an infancy still,” Christopher Hansung Ko, chief executive officer at the Samsung Bioepis unit, said in an interview. “We are a Samsung company. Our mandate is to become No. 1 in everything we enter into, so our long-term goal is to become a leading pharmaceutical company in the world.”

Remarkably, Samsung has a chance to become the only company in the world capable of gathering real-time biological data, diagnosing diseases and delivering appropriate treatments to an individual at the right time, in the right place and at the right price.

tags: innovation, samsung, health, detection, tool, mobile

Wednesday, January 06, 2016

3D Printing - the new Clay Age

Consider a recent MIT Technology Review article about the latest 3D printing lab experiments. What is its importance to inventors, and what can we use to predict the evolution of this technology?

When we study history, especially the history of innovation, people conventionally mention the Stone Age, the Bronze Age, the Iron Age, etc. At the core of such descriptions lies a wonder material — stone, bronze, iron, steel, silicon — something that enables a huge range of applications, which power technology developments for decades or even hundreds or thousands of years.

Paradoxically, there's no Clay Age (see fig below).

This is really unfortunate, because clay turned out to be the ultimate material, one that served us humans for thousands of years and enabled us to produce an amazing range of objects and technologies: from bricks to construction and architecture, from jars to storage and shipping, from ceramics to chemistry and modern waterworks, from concrete to skyscrapers and highway transportation systems. From an inventor's perspective, I see clay-based technologies as the first example of what we today call additive manufacturing.

Let's go back a few thousand years and compare stone (Before) and clay (After) as manufacturing materials. If you live in a cave and use stone to make your tools, you have to chip away, blow by blow, the parts of the original piece of rock that don't fit your design.

Even if we consider "raw" rocks cheap and disregard the waste of material itself, our ability to shape the rock or change its internal physical structure is severely limited by what we can find in nature. By contrast, clay is extremely malleable: you can shape it, add filaments, make it hollow, make it solid, make it hard, glaze it, and much more. If you are a hunter, by combining clay and fire you can create all kinds of sharp weapons that your Stone Age competition can't even imagine. If you are a gatherer, you can create jars and jugs, using one of the cornerstone inventions of human civilization: the Potter's Wheel.

If you are a house builder, even a primitive one, you can use mud bricks and reinforce them with straw. As you master fire and masonry, you learn how to create bricks and construct buildings that last decades and centuries, instead of years. You can even print money tokens with appropriate clay technologies! Furthermore, with advanced firing techniques, you discover how to melt and shape metals and discover important alloys, such as Bronze. Ultimately, you develop communities of innovation and economies of scale unheard of in the Stone Age.

Why is thinking about the Clay Age important today, when we are well beyond using mud to build cities? The main goal is to gain insight into what additive manufacturing can do for us in the years to come. Just like clay, 3D printing represents a technology approach with promising long-term potential. That is, when working with both clay and 3D printing, instead of removing and wasting extra material, we add material and shape surfaces to achieve the desired designs. Luckily, for 3D printing we can leverage the learnings from clay.

Over thousands of years, humans learned to work with clay by combining six key modifying methods:
1. Shape - change the outer geometry (e.g. brick).
2. Thin or thicken - change the inner geometry (e.g. thin jar).
3. Fill - change the inner structure (e.g. reinforced concrete).
4. Fire - modify inner and/or outer hardness or other material properties (e.g. hardened stove brick).
5. Slip - modify or create an outer layer with specific properties (e.g. ceramic glazes).
6. Decorate - paint or apply other exterior designs to make things aesthetically appealing.

With 3D printing we are still working on items 1 and 2, and barely touching 3. Some research labs are approaching item 4, firing, or its equivalents. For example, the MIT article I mentioned at the beginning of the post uses the ancient sequence of clay-based technology: shape the piece from a soft material with special additives, then fire it in a kiln to achieve the desired hardness and durability. Remarkably, modern 3D printing combines an ancient material, ceramics, with modern design techniques: computer modeling and manufacturing.

In the short term, 3D printing has gone through a lot of hype that has now fizzled somewhat. In the long term, the age of 3D printing, just like the Clay Age, is going to lay a strong foundation for a broad range of human technologies. Essentially, we are in the hunter-gatherer stage of our 3D printing evolution curve.

tags: technology, innovation, history, invention, creativity

Lunch Talk: Nanotechnology at work

A 2015 Nova documentary shows science and tech advances that power applications of nanotechnology in electronics, healthcare, optics, energy, and other fields.

tags: lunchtalk, technology, materials

Tuesday, January 05, 2016

Life sciences vs computer science - a challenge for the 21st century

Investor Peter Thiel captures the core difference between bio and computer tech in a recent interview with MTR:
This goes back to that famous Bill Gates line, where he said he liked programming computers as a kid because they always did what he told them to. They would never do anything different. A big difference between biology and software is that software does what it is told, and biology doesn’t.

One of the challenges with biotechnology generally is that biology feels too complicated and too random. It feels like there are too many things that can go wrong. You do this one little experiment and you can get a good result. But then there are five other contingencies that have to work the right way as well. I think that creates a world where the researchers, the scientists, and the entrepreneurs that start companies don’t really feel that they have agency.
Unlike computer science, biology has no equivalent of the Church-Turing thesis, which essentially guarantees the implementability of any valid algorithm. The success of Silicon Valley is built on top of this important 20th-century discovery. That is, once a "computation" entrepreneur, whether in software or hardware, finds a way to express a useful idea algorithmically, he or she can be sure that it will work, provided that computational power, storage, and networking capacity keep growing exponentially. Most famously, Larry Ellison created his relational database business in the late 1970s, when people did not yet understand the implications of Moore's Law.

Biology is different. Vernor Vinge, the science fiction writer, aptly calls our future successes in medicine "A Minefield Made in Heaven," because it is hard to predict the specific locations of the magical "mines" we are going to discover that will cure various diseases.

Peter Thiel uses the word "random" to describe biology, but from a practical perspective it is actually worse than that. If biology were merely random, we could use known randomization techniques from computer science and make new biological discoveries almost by brute force. We can't. I would therefore use a different term, arbitrary, and there is no algorithm yet for generating useful arbitrariness: only human ingenuity.
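The random-versus-arbitrary distinction can be made concrete with a toy sketch (purely illustrative; the `noisy_fitness` function and all of its numbers are invented for the example). When an objective is merely noisy but has underlying structure, brute-force random sampling finds the optimum, and that is exactly the lever an arbitrary landscape denies us.

```python
import random

def random_search(score, candidates, trials=10_000, seed=0):
    """Brute-force randomized search: sample candidates at random, keep the best."""
    rng = random.Random(seed)
    best, best_score = None, float("-inf")
    for _ in range(trials):
        c = rng.choice(candidates)
        s = score(c)
        if s > best_score:
            best, best_score = c, s
    return best

# A merely *random* objective: noisy, but with an underlying structure
# (a hidden peak at x = 42). Sampling can exploit that structure.
noise = random.Random(1)

def noisy_fitness(x):
    return -(x - 42) ** 2 + noise.gauss(0, 5)

best = random_search(noisy_fitness, list(range(100)))
# 'best' should land near the hidden optimum at 42; an "arbitrary"
# objective, with no structure to exploit, offers no such guarantee.
```

The point of the sketch is the contrast: randomness with structure is tractable by sampling, whereas arbitrariness, in the sense used above, is not.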

The good news is that some fields of the life sciences are compatible with computation. We are going to make a lot of progress in areas where we can hook up analog biological experiments to exponentially growing computing platforms. Diagnostics and pattern matching for known problems seem to be the most promising fields.

tags: biology, innovation, science, technology, silicon valley

Monday, January 04, 2016

Lunch Talk: (at Stanford) What they don't teach you about entrepreneurship

Part of 2010 Conference on Entrepreneurship at Stanford Graduate School of Business.

Description: A group of entrepreneurs talk about what they learned in the trenches that they never could have learned in a classroom. The panelists will also share the courses that were most helpful to them in their entrepreneurial ventures, the courses that they wished they had taken, and the topics that business schools should be teaching to aspiring entrepreneurs.

Sunday, January 03, 2016

Discipline and Punish, 21st century style

My morning twitter feed brought together two seemingly unrelated articles:

1. The MIT Review: overview of Robotics Trends for 2016.
2. The Economist: an article on the disappearance of middle managers in 2016 and beyond.

To get an insight into the long-term implications of these trends, first consider a quote from each:

The Economist,
Existing systems will be replaced by new ones built on more fashionable qualities: speed and transparency. Companies will stop fussing about inputs (how people do things) and focus only on outputs (what they produce). They will be obsessed with data, losing all interest in anything that can’t be measured. Every employee will be monitored every second; every keystroke and click will be tracked and analysed. Some companies will go further and get white-collar workers to wear sensors that track all movements and measure their tone of voice and the number of steps they take.

The MIT Review,
Another trend to look out for this year is robots sharing the knowledge they have acquired with other robots. This could accelerate the learning process, instantly allowing a robot to benefit from the efforts of others (see “Robots Quickly Teach Each Other to Grasp New Objects”). What’s more, thanks to clever approaches for adapting information to different systems, even two completely different robots could teach each other how to recognize a particular object or perform a new task (see “Robots Can Now Teach Each Other New Tricks”).

The Economist talks about tracking and analysing employee performance data, including its physical aspects; the MIT Review describes a scenario where robots teach robots. Now consider a case where we mix and match the two. That is, data obtained from monitoring humans (The Economist) is used to teach robots (the MIT Review). The combination would enable an easy transition from lab prototypes and small-scale production created by humans to large-scale production in robotic factories. Ultimately, it will speed up innovation but make lots of workers redundant.

Saturday, January 02, 2016

The new Digital Divide

The New York Times shows how mobile app designers devise new ways to get teenagers' attention during the day,
Push notifications — those incessant reminders that make your phone light up and ding — are the infantry of app warfare, cracking the attention span to remind users that someone on the Internet might be talking about them. All summer Wishbone had been sending out alerts four times a day, but the three men were thinking about adding more and, now that students were back in class, trying to recalibrate around the school day. 

“Can we have a friends feed at noon?” Mr. Jones asked Mr. Vatere. “It would be great to do ‘Your friends have updated.’ ”

“And you talk about it while you’re at school,” Mr. Pham added.

What are the implications, not for the businesses and advertisers the NYT article discusses, but for the kids, their families, and society at large?

We already know that frequent interruptions worsen kids' learning performance. We also know that pre-teens and teens are becoming addicted to their mobiles. Given that well-funded and market-savvy mobile app developers create new ways to target kids during school hours, we can predict that there will be a learning gap between kids who can manage their mobile distractions and those who cannot.

The old Digital Divide existed between people who had online access and those who did not. The underlying assumption was that the former were better off because they had access to all the information needed to learn effectively.

I believe the assumption is no longer valid. Having access to the internet all the time is becoming detrimental to learning. Arguably, it is worse than television, because kids get bombarded with distractions and advertising all the time, rather than only during leisure hours.

The new Digital Divide is going to emerge between those who can manage their online time and those who cannot. Online learning may even broaden this divide, because it will provide the motivated with greater opportunities to excel. Most likely, we are already seeing signs of things to come in the low completion rates at virtual universities, around 3-5%: a few get huge benefits, while the majority does not. Paradoxically, online learning has become a natural selection environment for the next generation of schoolchildren addicted to their ubiquitous social interactions.

tags: psychology, mobile, learning, virtual, media, advertisement