Approaching Innovation
“Innovation” is a word that gets thrown around a fair amount in today’s economy. Companies dedicate entire positions to managing innovative processes, cities hold large conferences to discuss how best to foster innovation in their regions, and theorists study innovative practices to develop frameworks for product development. While many people believe innovation is born purely from moonshot ideas and elbow grease, the amount of work modern industry puts into formalizing the process suggests otherwise.
There’s no debate about the importance of innovation to any business. This is Clayton Christensen’s Innovator’s Dilemma: a business can do everything right, make sound managerial decisions, and still go under when an emerging disruption supplants its offerings’ place in the market. If a business wishes to stay relevant in any market subject to upheaval (read: any market), it must find a way to maintain agility and invest in technologies that may not provide immediate returns but represent the future of the industry. Historically, businesses incapable of this sort of adaptation go under, while their more capable competitors ride the wave of the new technology right to the top.

Because of this dilemma, a few frameworks have been developed to manage innovation. The most prevalent way people perceive innovation is referred to as the ideas-first approach, as coined by Anthony Ulwick in his paper, “What is Outcome-Driven Innovation?”. Under this approach, innovation happens by brainstorming a myriad of jumbled ideas and pursuing them to see if they are feasible products. Many companies embrace this mentality, holding formal brainstorming sessions to generate ideas and attempting to filter out the bad ones quickly and efficiently in hopes of happening upon a breakthrough by process of elimination. However, for established companies looking to stay relevant in the face of a volatile industry, this exercise is not targeted enough to provide consistently viable results.
Ulwick’s paper goes on to talk about the concept of needs-first innovation, and how the Outcome-Driven Innovation (ODI) model gives structure to this approach. In needs-first innovation, products are created by examining the needs of the customer, figuring out which of those needs have not been met by current market offerings, and developing a product to solve these problems. This sounds like a much more solid process to follow, but Ulwick still insists that this idea is structurally flawed on its own.
To put a more established process behind this idea, Ulwick developed the ODI framework, focusing on the concept of jobs-to-be-done. Under this school of thought, innovators must think of the job that a consumer needs done as the unit of analysis: in other words, the consumer hires the product for a certain job, and if that job is not done adequately, the consumer will not buy or use the product. This is a more targeted way of looking at customer needs than just asking them for their issues with current products, as simply harvesting these concerns can result in short-sighted improvements rather than an innovative solution.

By considering the jobs a customer needs done, teams can move beyond what is referred to as “scattershot brainstorming” and instead come up with targeted ideas for serving customers’ needs. By knowing more about what the customer explicitly needs to do, companies can focus their creative efforts on cultivating a novel solution to a core problem, while others without this focus may find themselves wasting resources and failing to adapt in a changing market.
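To make this concrete, here is a minimal sketch of what ODI-style outcome scoring might look like in code. The scoring formula is the opportunity algorithm Ulwick has published elsewhere (opportunity = importance + max(importance - satisfaction, 0)); the outcomes and 0-10 ratings below are hypothetical examples invented for illustration, not data from his paper.

```python
# A sketch of ODI-style opportunity scoring. The formula follows Ulwick's
# published opportunity algorithm; the outcomes and importance/satisfaction
# ratings are hypothetical, made up purely for illustration.

def opportunity_score(importance: float, satisfaction: float) -> float:
    """Important but poorly satisfied outcomes score highest."""
    return importance + max(importance - satisfaction, 0)

# Hypothetical desired outcomes, each rated (importance, satisfaction) 0-10.
outcomes = {
    "minimize time spent transferring files between devices": (9.1, 4.2),
    "minimize steps needed to set up a new device": (7.8, 6.9),
    "maximize battery life while traveling": (8.5, 8.1),
}

# Rank the jobs-to-be-done so creative effort targets the most
# underserved outcomes first.
for name, (imp, sat) in sorted(
    outcomes.items(), key=lambda kv: opportunity_score(*kv[1]), reverse=True
):
    print(f"{opportunity_score(imp, sat):5.1f}  {name}")
```

Ranked this way, the important-but-underserved jobs rise to the top of the list, which is exactly where targeted creative effort should go.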
With all of this being said, it’s important to realize that there is still value in the traditional ideas-first approach. In particular, startups have the agility and appetite for risk to pursue potentially revolutionary ideas that established businesses may not be able to afford investing in. I fully stand behind the belief that it’s always worth acting on an idea and creating/learning in the process instead of wasting valuable time internally debating why an idea would or wouldn’t work. Though it’s often impossible to invest this sort of effort in the professional world, those with the time to spend and the desire to grow should always be seeking to innovate and create in any way they possibly can.
What Can The Technology Industry Learn From Art?
Over the past few years, I’ve studied a field known as arts entrepreneurship. In practice, this means studying how people perceive and value art and learning how to start and maintain an effective business in the arts sphere. These practices reflect a different business mentality than what I’ve been used to working in technology, and learning how the arts economy works has been incredibly valuable. Over time, I’ve been thinking about how these practices could be applied to the tech industry to let innovative products succeed where traditional business practices would fail.
(A quick side note: When I mention to others that I want to merge the technology and art industries, many people think I’m just referring to industrial design. What I’m talking about isn’t the idea of just bringing more aesthetic value into how products look - it’s about the way you market the product, the way people use it, and the way the value of an item is perceived. It’s more about creative direction and properly manifesting a vision/idea than it is about just creating an item.)

One of the lessons I’ve learned about these two economic spheres is how a product’s value is perceived by the mass market. In both industries, there are two broad categories of target markets: creators and consumers. In art, the creators are the artists themselves, and the consumers are the people who purchase works of art. In technology, software developers largely fill the role of creators, and everybody else who utilizes technology as an end user is a consumer. The arts market generally produces entirely separate products for creators and consumers, while in technology, facets of the same product are presented to the two groups in different ways.
If we look at the creators in both categories, we can see how products are presented differently in each sphere. Products marketed to artists are always presented as a vector through which they can create their art, rather than simply something to play with and try out. This mentality is most obvious in visual art, as the packaging and advertising for products visual artists use are always populated by other pieces of visual art, with less of a focus on the tool itself. In some other arts, the distinction is less apparent (particularly with music, as the end result of a product can be harder to convey without audio), but if you look carefully you can still see these practices taking place in most artistic products.
In technology, a product’s specs are almost always presented front and center, and there’s a much clearer mentality of purchasing something to “tinker with” rather than having a clear end goal in mind. Rather than being positioned as something that serves a clear purpose, tech gadgets are often presented as a collection of cool features on an open platform for development, asking the community of creators to define an explicit use case for the product rather than putting one front and center.

Now, let's look at the end consumers for both economies. In the arts, the end product isn't often a utilitarian item: in other words, people normally don't go out looking for a painting with a very specific size, color balance, or brush technique (interior designers aside, of course). Instead, sales normally happen when a work's aesthetic value resonates with someone in the right way, and they decide that they want it. There are exceptions to this rule, of course, but there is a marked contrast in buying patterns between pieces of art and technology products.
In the tech world, every end product is bought to fill a need. Cell phones compete to be the fastest and have the longest battery life, laptops need to be compact and powerful workstations, and even watches now battle to provide the most relevant information at the best price and form factor. It's very rare for someone to see a product in passing that they don't need and instantly purchase it because it struck them in just the right way. That may be because technology often exists at a higher price point than (popular) art, but the fact remains that technological purchases are almost always heavily premeditated, need-based decisions predicated on a careful weighing of competing options.

Why does this distinction matter? Two words: disruptive technologies.
When innovative products emerge, there often isn't a predefined use case for them, however revolutionary they may be. Historically, disruptive technologies have taken hold incrementally because their manufacturers found a niche use case through which they could continue to grow and develop the product until it could meet the needs of a mass market. However, in consumer markets, it's very difficult to find these niche users and adequately meet their needs when they are set on using older, more proven technologies that may share much of the same functionality. For this reason, companies with these technologies can greatly benefit from an artistic/aesthetic method of marketing and positioning their products. Though I hate to fall back on Apple as an example, they have successfully leveraged this thinking multiple times, introducing disruptive products such as the iPod, iPhone, iPad, and pretty much anything else that begins with a lowercase "i."
This is just one of the many ways technology can learn from art. The economies associated with each product class are starkly different, and there are many more lessons the tech industry could take from the art world, particularly when it comes to introducing new and innovative products. By merging these two mentalities, we can help create a world where emerging technologies let people accomplish consistently greater feats.
Should You Be In Silicon Valley?
Recently, I returned home from an internship in Silicon Valley. Having spent the vast majority of my life in Raleigh, North Carolina, this was an entirely different experience from anything I had been through before, and it really opened my eyes to just how different the Valley is from the rest of the country. There are immense opportunities on the West Coast that don’t exist outside of the famed technological center of the United States, but there are also many aspects of the area that make me question whether or not I would like to live there. As such, I thought I’d share my experience for others who might be in the same boat.
Let’s start with the good: SV is rife with opportunity for technology startups and emerging businesses. An enthusiasm for entrepreneurship seems to have infected the entire populace, with so many new apps and ideas being thrown around that it seems like every engineer is busy working on (or simply thinking of) the next big thing. Feeding into this, investor opportunity is everywhere, with venture capitalists searching for promising new businesses and incubators looking for teams to help churn out revolutionary new ideas.

Beyond that, the Valley is an engineer’s paradise if for no other reason than the people who live there. In my experience, most (if not every) engineer who works in the area is truly passionate about what they do, and there are copious opportunities to pursue new technologies or play with new ideas. There seems to be a “hackathon” every week put on by some prominent company, and these events are well attended by like-minded people with broad passions and expertise. These, combined with frequent talks from industry leaders about new technologies and practices, form a networking utopia.
Even for non-engineers, the Valley offers a wide range of cultures, giving all kinds of people a place to thrive. San Francisco in particular offers a broad scope of things to do on the weekends, whether you’re a party animal or simply looking to explore different walks of life.
On top of all that, the whole “always 70 degrees and sunny” thing is pretty nice too. I will say that towards the end of my time there, I did miss being able to sit by an open window or on a screen porch to enjoy a heavy rainstorm.

Now, on to the negatives: Silicon Valley is, without a doubt, a bubble. The best way I’ve heard it described is as an “echo chamber”: once one idea gains traction, everyone and their brother is working on an imitator or the same exact technology, looking to make some money while the trend lasts. Many ideas proposed in the Valley don’t really solve a broad problem that people outside the core Valley demographic experience, and yet investors pour money into them like there’s no tomorrow. This furthers the notion that Silicon Valley doesn’t care about the outside world, and it feeds a sort of arrogance where residents believe that those outside the Valley are somehow beneath them, and that there is little value in doing things outside of the software space.

Some of the technology that has emerged from the Valley has been hurtful to the community at large. Some startups make a business out of reserving public goods, such as parking spots and restaurant reservations, and offering them at a premium through their applications. This boxes many people out of these goods and results in empty, unpurchased spots and reservations that hurt the owners of these establishments. The phenomenon has received a fair amount of media attention, but so far there hasn’t been much of a response.
There’s also the cost of living. Granted, the salaries software engineers make out there tend to make up for the difference fairly well, but those in pretty much any other profession are likely going to find themselves strapped. Apartments in the area are ludicrously expensive, with a one-bedroom apartment in San Francisco easily costing several times what it would in Raleigh (as an example). As such, most people living out there have to sacrifice apartment quality if they want to find a reasonably priced place to stay.
Transportation is also a hassle out there. While many companies have shuttles to take their employees to/from work at certain times, people living in unserviced parts of the Valley have to deal with an immense amount of traffic. This is true for any densely populated area, however, and is more of a minor concern. (Plus, the fact that there are intergalactic spaceboats of light and wonder everywhere lends some welcome eye candy to the commute.)
Basically, if you’re in the situation where you’re choosing whether or not to live in the Valley, ask yourself if the positives outweigh the negatives. It’s a fantastic place to be for developing engineers, and there are tons of like-minded people starting careers in the area. Living in Silicon Valley is an experience you aren’t going to find anywhere else in the world: if the area appeals to you, you won’t regret your decision.
Has Music Been Commoditized?
The digital revolution has done a number of great things for the music industry. In recent years, music has become increasingly accessible, both in the way we purchase and consume the art. The first incarnation of the iPod/iTunes ecosystem was a major breakthrough on both these fronts, with services like Napster, Rhapsody, and Grooveshark changing the game for music ownership and discovery. SoundCloud has provided a simple and straightforward way for people to share and discover independent artists, and Shazam has paved the way for music discovery/identification in the real world.
Still, for all the advances that have been made, the music industry is in a tough spot - for musicians, at least. A documentary called “The Distortion of Sound” highlights some of the troubles regarding how an artist’s work is received by their audiences. Though it misrepresents some of the technical details of the process (understandable, as it aims to demonstrate differences in sound quality using YouTube as a vector), it does portray the way art is filtered to fit new models of consumption: valuing storage space and pure volume over sound quality and dynamics. Many of the subtler nuances artists instill in their works are never heard by most audiences, whether due to subpar listening equipment or albums distributed in compressed and adulterated form.
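As a rough illustration of the volume-versus-dynamics tradeoff (my own toy example, not anything from the documentary), here is a minimal Python sketch: heavy limiting raises a track’s average level while collapsing its crest factor, the gap between peaks and average level that gives a performance its dynamics.

```python
# Toy demonstration of "loudness war" style mastering: boosting and
# clipping a signal raises its average level while flattening dynamics.
import numpy as np

def crest_factor_db(x: np.ndarray) -> float:
    """Peak-to-RMS ratio in dB; a rough proxy for dynamic range."""
    rms = np.sqrt(np.mean(x**2))
    return 20 * np.log10(np.max(np.abs(x)) / rms)

# A toy "performance": one second of a 440 Hz tone swelling from quiet to loud.
t = np.linspace(0, 1, 44_100)
swell = np.sin(2 * np.pi * 440 * t) * np.linspace(0.1, 1.0, t.size)

# Crude limiting: boost everything, clip the peaks. Average level (and
# perceived volume) goes up; the swell is flattened out.
limited = np.clip(swell * 4, -1.0, 1.0)

print(f"original master: crest factor {crest_factor_db(swell):.1f} dB")
print(f"limited master:  crest factor {crest_factor_db(limited):.1f} dB")
```

On this toy signal, the limited version prints a much smaller crest factor: louder on average, but with the quiet-to-loud arc of the performance largely erased.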

Even beyond sound quality, much of the experience built around an artist’s work has been lost to convenience. Consumers used to go to the record store and pick up a CD, reading the album notes and establishing a connection with the artist. We also used to listen to complete albums rather than cherry-picking specific tracks, if only because changing albums was a hassle.
The many factors working against experiencing recorded music in its purest form point to an inconvenient truth: recorded music has become a commodity. Even the newest music is abundantly available through a wide range of channels, including YouTube, Spotify, and Pandora. The concept of owning music (though that talk has always been bogged down in licensing technicalities) has increasingly lost relevance, and there is no longer any value in having access to one specific song over another. The economic value does not exist in the tracks themselves, but in the services provided on top of them - most notably, streaming.
If the music industry is going to evolve again, some other value will have to be built on top of recorded music. Whether that value is as simple as added insight into the production process or something more experiential, something new needs to be introduced to move the industry forward. Ideally, it would bring some value back to the artists as well, since the industry is currently by no means fair to those producing the art.