The Story Behind Smart Cities, Big Data, and Sustainability

Sustainability is hard to come by. Whether it’s a sustainable organization, a sustainable way of life, or a sustainable relationship, building something that lasts is tricky.

Take a sustainable city, for instance. The modern outlook is that a city is sustainable when it’s smart, and that it’s smart when it uses technology to address problems like traffic, security, pollution, and waste management.

Building a smart city takes work, funding, infrastructure, and technical expertise. Distressingly for us Filipinos, it also takes political will and progressive thinking to get the ball rolling and keep it rolling: two things we aren’t exactly swimming in at the moment.

Many people think that all it takes to achieve widespread sustainability are good politics, advanced technology, and big capital. At a glance, that makes sense: sustainability seems to be all about resource abundance.

But that would only be telling half the story.

On the road to sustainability, your starting conditions are a major factor. However, what you have at the outset isn’t nearly as impactful as what you do with it.

There’s a method behind sustainability, and that method depends on data.

The Problem: Really Shitty Foresight

In the 1880s, the world’s most advanced cities were covered in horseshit.

I mean this literally.

Horses were essential to the urban world, once upon a time. They were widely used to transport goods, and they played a central role in early versions of public mass transit. In fact, towards the end of the 19th century, New York was home to over 150,000 horses: horses that answered the call of nature frequently, and in great volume.

As you’d expect, the amount of urine and manure produced on a daily basis brought cities to the verge of a crisis. The air was rank, vermin were thriving, and public health was at risk; even food security was threatened by the agricultural strain of feeding enough horses to support the craze.

photograph of a brown horse's butt
Pictured: the price of progress, apparently. | (Image from Paul J Everett on Flickr)

Luckily, our cities managed to avoid the equine apocalypse thanks to urgent administrative action and the rise of the automobile. City leadership addressed the symptoms of the ill, and the problem was eliminated by the eventual shift in demand from one kind of horsepower to another.

For a miraculous moment, the world shifted away from an unsustainable and environmentally disastrous method of transporting goods and people.

The world did this by embracing the combustion engine.

Boy, did that go well.

Nobody at the time could have guessed that their four-wheeled saviors would end up contributing to another catastrophe at a wider scale. Likewise, the scientific community at the time had no reason to assess the risk of global environmental degradation.

The moral of the story is that we often find ourselves worse off than when we started because none of us can peer into the future, and so we commit to solutions that are inherently flawed.

Despite our best efforts, we’re just naturally inclined to suck at predicting the future, especially when it comes to predicting a future that works against us.

The struggle to solve problems with finality is universal, but some of today’s cities have managed to do better than replacing broken things and systems: they’ve come pretty damn close to sorting certain things out for good.

Bright Lights, Smart City

Like Taleb’s black swan, a handful of modern cities stand defiant of the fear that our answers are only ever bound to create more problems.

A few examples follow:

  • Smart grids allow cities like San Francisco to track energy consumption across time and geography, and make calculated adjustments for the sake of greater efficiency. SF residents can monitor their energy consumption in real time.
  • Traffic lights in cities like Pittsburgh independently calculate the optimal green time based on real-time traffic data, and coordinate with the rest of the city’s lights to keep commutes efficient.
  • A number of cities in Europe have outfitted their public parking spaces with sensors that send data on slot availability straight to motorists. The deployment of smart parking systems reduces traffic, saves time, and helps the environment all at once.
aerial photograph of san francisco in the evening
San Francisco gets it.

There are countless other examples of processes, systems, and devices that make urban living more bearable and more efficient. Upgrades to city life abound, and they range from the novel to the life-saving.

Needless to say, this is all a step up from drowning in excrement.

Smart city innovations present no visible drawbacks—but then again, neither did automobiles at the turn of the 20th century. Unlike their predecessors, however, the minds behind today’s cities have a distinct advantage: access to sprawling datasets and figures.

Data Matters (A Lot)

Any entity or organization with access to data and the expertise to use it is in a good position to keep their solutions from turning sour. There are various reasons for this, but for the sake of this article, we’ll focus on how data helps us see the future.

Data allows for the creation of predictive models and simulations. The world would be better with a little more foresight, and data can help us make smarter projections. This isn’t to say that predictive modeling is 100% accurate (you won’t find that kind of certainty anywhere in big data), but at the very least, it can indicate sharper decision-making than a shrug and a guess.

This matters all the more because many of tomorrow’s problems are unthinkable today.

Smart cities provide an example of what being data-driven looks like in practice; in the world’s most advanced urban centers, nearly every data point that could be logged is being logged. This means that there’s a good chance an analyst could piece together the causes of a negative externality and arrive at a sound, statistically-backed recommendation about where to go next.

black and gray data mining rig servers
Server server on the wall, what the hell is going on?

Let’s play with the example of smart parking. Say that one day, we discover that a component of the sensors used to gather parking slot availability data reacts with rainwater to produce a toxic gas.

(It’s a wild idea, sure, but it would have been just as hard to convince someone in 1890 that cars would one day push us closer to the end of the world. Just bear with me for a bit.)

If it turns out that our parking sensors could end the world, then we’re in luck. Since data collection is central to how these sensors operate, we can tell where the deadly units are, and which parts of our cities are most at risk of a toxic fiasco (think: occupancy rates).

Correlate that data with rainfall predictions, wind forecasts, and other numbers that state agencies are monitoring by default, and you end up with a very manageable disaster situation.
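To make the hypothetical concrete, here’s a minimal sketch of that correlation step. Everything in it is made up (districts, sensor counts, rain probabilities); the point is just that two small datasets and a multiplication get you a risk ranking:

```python
# Hypothetical sketch: rank city districts by toxic-gas risk by combining
# (made-up) counts of faulty sensors with (made-up) rainfall probabilities.
faulty_sensors = {"North": 120, "South": 45, "East": 300}
rain_prob = {"North": 0.9, "South": 0.2, "East": 0.6}  # chance of rain tomorrow

# A crude risk score: more faulty sensors x higher chance of rain = worse.
risk = {district: faulty_sensors[district] * rain_prob[district]
        for district in faulty_sensors}

ranked = sorted(risk, key=risk.get, reverse=True)
print(ranked)  # -> ['East', 'North', 'South']
```

A real deployment would pull these numbers from sensor registries and weather agency feeds, but the logic stays the same: join, score, rank, respond.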

Not a perfect example, but you get the principle: obsessive data collection and analysis can make headaches significantly easier to deal with.

At this point it also bears mentioning that even the most banal data project can be put to nefarious purposes, causing far worse problems than it was meant to solve.

An article published by The Guardian in 2014 argued that smart cities would be the death of democracy, and though I hesitate to agree outright, it raises valid concerns about privacy in an age where personal data is caught in a political and economic free-for-all.

Approach data with caution, and stay ethical.

Conclusion: Necessary But Not Sufficient

Today’s smart cities are far from perfect, but they follow a framework that brings us closer to real sustainability. They weave data into the very core of their solutions, allowing for more accurate prediction and faster responses to sudden complications.

Data in and of itself is not a cure, nor is it ever the only piece of a solution. On top of an obvious need for ethical guidance, you still need the effort, the funding, the expertise, the infrastructure, and the collective determination to correct society’s problems.

However, throwing data into the mix can drastically increase a solution’s chances of success and keep it from blowing up in people’s faces. A culture of data reliance lets us predict, assess, and improve faster than ever before in our history.

With data on our side, we stand a better chance of getting stuff right for the long-term.

Embrace it, apply it, and demand it. To settle for anything less nowadays is a load of horseshit.

black horse looking at camera, landscape in the background
Never again.

Survival of the Sh*ttest: 3 Reasons Corporate Mediocrity Endures

Business is a constant process of innovation and adaptation. With business technology changing every handful of years, staying competitive means keeping abreast of trends and best practices.

If you’ve been working for at least a handful of years, then you know that the sensible thing for a business to do is to stay competitive.

If you’ve been working in the Philippines for at least a handful of years, then you know that there’s always one crap reason or another for businesses to stick to the Old Ways.

People around a bonfire in a cave
“This is fine.” (Image from Blogspot)

There’s a very good chance that the business you own or work for sucks (even just a little bit) for reasons concerning innovation. It isn’t necessarily anyone’s fault; after all, history tells us that structural change isn’t a natural inclination for our species.

Yet, like my mother would always say, “We build homes to perfect them over time, anak. So please, for God’s sake, clean your f*cking room.”

Or something to that effect.

Let’s have a look at the common organizational roadblocks to innovation: things you’d likely hear when someone shuts down the idea of opening up a new department, or signing up for a helpful new training program.

Reason 1: “If it Ain’t Broke, Don’t Fix It”

The first roadblock to innovation is the “If it Ain’t Broke, Don’t Fix It” mentality.

You’ve heard it before: the excuse that your department or company has a perfectly fine way of doing things, so there’s never a need to shake things up.

Heck, you might even be making a lot of money sticking to the status quo, and besides: if growth is at a plateau, then it’s your own damned fault for not working harder.

To the Luddite, improvement isn’t worth the effort so long as the status quo gets the job done.

While things could be quicker and easier for the people who need to slog through an outdated way of doing business, the current model is here to stay for as long as it isn’t impossible to work with.

Example: File Storage

Here’s a history lesson for the kids reading: before Google Drive, people had to rely on local file storage—that is, keeping all your documents and media stored on one or more computers. If people needed to access their work files from home, then they had to carry around USB flash drives or email themselves before clocking out.

They were dark times indeed, but humanity trudged through to a beautiful, cloud-hosted future.

Or, most of us did, anyway.

Some of today’s businesses continue to rely solely on local file storage.

In a handful of truly surprising cases, businesses with the funds to shift to cloud storage refuse to digitize their files at all, opting instead for clunky filing cabinets full of critical data and information (they’d better pray there isn’t a fire hazard anywhere nearby).

One such business, pictured here. Probably.

If you’re dead set on introducing your workplace to a key innovation, find a way to get it into people’s hands without being insubordinate.

Find a compelling product review online, download a free trial on your personal laptop, or submit a copy of your next report made using whatever tools or formulae you’ve got your eyes on.

It’s like Prometheus bringing fire to mankind, minus the eagle eternally pecking at your liver.

Reason 2: Cost

The second possible reason why your company sucks is that it’s afraid of cost. People tend to be skeptical about the value of an investment, whether it’s in opening a new department or adopting a new practice or technology.

Not every business can afford to upgrade every quarter, and that’s well and fine. But companies and departments with the resources to make a much-needed change are a far too common sight, much to the dismay of professionals who wish they could do more on the job.

In many cases, vital business upgrades are seen as frivolous expenses or risky gambles. The reality is that many investments, like cloud hosting or a robust data infrastructure, are well known for their strong ROI.

Upon closer inspection, the cause of this line of thinking is closer to ignorance than it is to frugality.

We can understand being skeptical about the merits of today’s chatbots or even blockchain, but people continue to second-guess technology that’s been propelling businesses since 2005.

It’s rock-bashing traditionalism at its finest, and it’s holding our enterprises back.

Example: Working With Data

Every business can benefit from working with data.

That’s not a sales pitch, it’s a fact: when used properly, a functional data value chain can help an organization eliminate guesswork, predict business outcomes, and reach new heights of efficiency, to the tune of over $10 in ROI for every dollar spent.

Data analytics, however, isn’t a simple matter of plug-and-play: it takes time, money, and mental bandwidth to build an infrastructure and collect the kinds of data that can be used for significant improvements.

Plenty of businesses are content with letting their data rot in filing cabinets, and any extra spending is taken as a deal-breaker. Apparently, they missed the memo that it takes money to make money.

“Like out of five stars??”

Communicating the benefits of an investment can be difficult, especially when trying to convince someone who clearly wants to remain skeptical. Our best advice is to be patient, be systematic in making the case for adopting a key innovation, and be persistent.

One way or another, you’ll either find yourself delivering the right pitch to get it through your superior’s thick skull, or find yourself killing time as you wait for your next career shift to greener pastures.

Reason 3: The Learning Curve

By its nature, innovation comes with a learning curve. For many adopters, it’s a necessary speed bump on the road to better business. For others, it’s a brick wall.

Organizations that reject innovation on account of the learning curve fall into one of three categories.

First, you have groups that can manage the challenge, but can’t be bothered to throw in any effort (the lazy).

Second, you have groups that can’t manage the challenge because they’ve hired people who simply aren’t up to par (the inept).

Finally, there are people who could learn but lack the funds or skills to hold training sessions (the unfortunate).

Example: Automation

It’s possible to automate common business tasks like managing files, organizing schedules, and even taking phone calls. If you name it, the chances are good that someone’s already taught a computer how to do it for you.

Of course, introducing a workplace to automation comes with an adjustment period, and that fact alone is enough to discourage some local businesses from literally kicking back and letting manual tasks perform themselves.

If you find yourself among the unfortunate, know that there are plenty of affordable training options for many of the latest innovations. You don’t need to cut a department to teach another how to work with automation software.

If you’re surrounded by the lazy and inept, however, we wish you all the best. There’s little to do when your managers and team members lack the initiative or mental power to stay ahead of the curve. Your best shot is to perform well, and keep your eyes open for a smarter career setting.

Case in Point: Blockbuster vs Netflix

In 2004, the American video rental service Blockbuster reported six billion dollars in revenue.

In 2010, Blockbuster declared bankruptcy.

Today, in 2019, your youngest work colleagues would scratch their heads at the very idea of renting videos only to return them to a store after some time has passed.

Conversely, there’s barely a soul alive today who hasn’t heard of Netflix, which is probably why most people cringe hard when they learn that back in the year 2000, Blockbuster declined an offer to purchase Netflix for a measly $50 million.

I’d bet my subscription that you or the people at your workplace feel for Blockbuster. Nothing was broke that needed fixing, $50 million is a hefty sum by anyone’s standards, and who the hell could have explained video streaming back in 2000 anyway?

Before you leave a flower on the grave of the world’s greatest home entertainment failure, know this: when Netflix tried selling itself to Blockbuster, the young company already had revenues in the tens of millions and a fast-growing subscriber base.

They knew they were sitting on gold, and the signs of continued success were plain as day. Why they even tried to sell themselves off is beyond me, but whatever motivations each party may have held, the outcome is clear: Netflix knew how to ride the tides of change.

For Blockbuster, innovation was optional, until it suddenly wasn’t.

Conclusion

Things aren’t all doom and gloom, and believe it or not, you can teach an old dog new tricks. In our Business Analytics Masterclass, we teach some best practices and techniques so that you can be more hands-on with your data.

We guarantee you’ll leave armed with the tools for transformative innovation.

Bonus points if you can bring any knuckle-dragging supervisors along with you—we have a thing for communicating the power of innovation to people who don’t want to hear it.

Data Scientist or Know-It-All?

The importance of domain expertise in data practice.

There’s a short yet wonderful story that perfectly encapsulates how many of today’s businesses use data. It’s a simple parable that all of us can learn from, regardless of our background, level of experience, or field of practice. In fact, if you’re already working with data, you might have a similar story to share. It goes like this:

A data scientist holds up a chart.
Everyone believes him.
End of story.

In today’s data-supercharged world, data is the law and the data practitioner is taken as the de facto expert. Ignore the fact that Ben just got hired last week—he has an MA in Statistics and a PhD in Machine Learning, so he must have all the answers, right?

(To clarify, said Ben is a hypothetical person. If you happen to know a Ben or are one, we apologize in advance. If you happen to be a woman, please don’t take our usage of a traditionally male name as a vote in favor of the patriarchy. This is purely for emphasis. We support all women, especially women in data. With everything cleared up, let’s get back to the matter at hand…)

Of course people will listen to the data guy. Numbers are compelling, especially when presented in chart form. Who are we mortals to question an interactive, multi-colored bubble chart? What power does one man hold over a regression line with an R-squared well over 0.90?

In the modern boardroom, data is gospel truth. Everything else is mere conjecture.

We seldom stop to consider whether the data is flawed, or if the data guy understands the subject matter enough to draw insights or conclusions. Maybe the regression model is accurate, but what if it uses the wrong variables, or maps out the wrong features? What if the chart displays absolute figures in places where a logarithmic scale is more appropriate? What if the time series shows periods that are either too long or too short? What if the final analysis is inconsequential to the use case at hand?

When all is said and done, data can be just as flawed as the people who work with it.

Consider the infamous case of NASA’s $125 million Mars Climate Orbiter. A simple conversion mishap—the failure to convert pound-force seconds (lbf·s) to newton-seconds (N·s)—had the spacecraft flying within 37 miles of the Martian surface, dangerously below the 53-mile minimum. What followed was an epic fail of astronomic (no pun intended) proportions: Mars’ atmospheric friction burned the poor thing to a crisp before hurling its ashes deep into a cratery abyss.
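For what it’s worth, the conversion itself is a one-liner, which is what makes the oversight sting. A minimal sketch (the constant is the standard pound-force-to-newton factor; the thrust figure is made up for illustration):

```python
# 1 pound-force = 4.448222 newtons (standard conversion factor)
LBF_TO_N = 4.448222

def lbf_to_newtons(thrust_lbf: float) -> float:
    """Convert a thrust value from pound-force to newtons."""
    return thrust_lbf * LBF_TO_N

# A value logged in lbf but read as N understates the true
# force by a factor of ~4.45 -- enough to doom a spacecraft.
reported = 100.0                     # hypothetical value logged in lbf
assumed = reported                   # ground software read it as N
actual = lbf_to_newtons(reported)    # what it really was
print(round(actual / assumed, 2))    # -> 4.45
```

The lesson isn’t that the math is hard; it’s that nobody with domain context checked which unit the numbers were in.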

Eyewitness reports allege the fire started with a contentious bar chart

Crash and burn—or, rather, burn then crash.

Mind you, this blunder happened with Lockheed Martin’s and NASA’s top brass, arguably the best domain experts in their respective fields, on the job. If even they can make mistakes like this, what makes us think we regular folk are exempt?

The next example is more down-to-earth… literally.

Applying domain knowledge could be as simple as choosing between a FIFO (first-in-first-out) and LIFO (last-in-first-out) approach, as detailed in this SuperDataScience podcast.

To explain FIFO and LIFO briefly: If element A arrives first, B second, and C last, FIFO dictates that they leave in that same order. LIFO is the complete opposite, wherein the last element, in this case C, leaves first, followed by B then A.
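For the programmers in the room, the two orderings are just a queue versus a stack. A minimal Python sketch of the A-B-C example above:

```python
from collections import deque

queue = deque()  # FIFO: append on the right, remove from the left
stack = []       # LIFO: append and remove from the same end

for item in ["A", "B", "C"]:
    queue.append(item)
    stack.append(item)

fifo_order = [queue.popleft() for _ in range(len(queue))]
lifo_order = [stack.pop() for _ in range(len(stack))]

print(fifo_order)  # -> ['A', 'B', 'C']
print(lifo_order)  # -> ['C', 'B', 'A']
```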

As you might already predict, the “right” choice varies greatly among industries.

For example, a business dealing in perishable goods like vegetables or fresh meat might prefer a FIFO approach, wherein an earlier element, say Monday’s shipment, is sent out before Tuesday’s or Wednesday’s. Conversely, a steel manufacturer may opt for convenience and use a LIFO approach wherein the steel bars at the top of the pile (i.e. the last ones in) get shipped out first. Caveat: we are experts in neither the perishable goods nor steel industries, so this is, again, purely for illustration purposes.

Yes, data skills can be applied to nearly every domain. However, we cannot discount the fact that data practitioners need domain expertise in order to truly be effective (or at least to avoid $125 million blunders). Data in retail can differ from data in healthcare or economics or agriculture or any other industry.

This is no different from other jobs. In the same manner we demand industry experience from management professionals and sub-specializations from doctors and engineers, we need to push for domain expertise and domain knowledge in the data practice.

What does this entail?

For the data practitioner, this means building years of experience and knowledge in a specific domain. Go deep rather than broad.

For the company looking to fill a data position, this means hiring a data practitioner with an industry background, or grooming one from the existing workforce (the second is an option we highly encourage).

For schools and institutions offering data courses, this means creating industry-specific courses and tracks, or encouraging students to pursue a minor in a field of interest.

Parting thoughts

Data does not exist in a vacuum and neither do data experts. To make data impactful, we need to encourage data practitioners to look beyond the spreadsheet and out into the real world.

Doing so might just save all of us from another epic crash and burn.

AI vs. Creativity

Why the “Fight” Only Hurts Marketers.

“Do our computer pundits lack all common sense? The truth is no online database will replace your daily newspaper, no CD-ROM can take the place of a competent teacher and no computer network will change the way government works.”

The above is a quote from author Clifford Stoll.

You’ve likely never heard of him, and here’s why:

Good Old-Fashioned Naysaying

The early days of the internet were fraught with imperfection. Web development was in its cluttered infancy, e-commerce was a matter of distant science fiction, and searching for information was a nightmare.

Like any new technology, the internet was a bit of a fixer-upper.

But the reason why you’ve never heard of Clifford Stoll is because Clifford Stoll, like many of today’s influencers, spent a lot of time kicking the newborn invention. Where some worked to improve, he spoke to diminish. Where some saw potential, he saw an opportunity to take shots at those who saw potential.

When Nicholas Negroponte, then director of the MIT Media Lab, predicted the rise of ebooks and digital news media, Clifford Stoll called bullshit. When asked about the future of online commerce, Clifford Stoll scoffed and said, “Even if there were a trustworthy way to send money over the Internet—which there isn’t—the network is missing a most essential ingredient of capitalism: salespeople.” I wonder if he’s ever shopped on Amazon.

(Today, Clifford spends his time selling blown glass Klein bottles out of a basement crawlspace in his home. Ironically, none of us would even know that if it weren’t for the internet.)

It’s a tale as old as tech, and folks like Clifford will enjoy their freedom of inconsequential speech until the end of the human story.

It’s a case of good old fashioned naysaying, and it’s currently one of the marketing world’s greatest roadblocks.

Doom and Duh: The Case of AI vs. Marketing

There are two general categories of naysayers when it comes to AI and marketing.

First are the doomsayers who believe that AI will be the death of creativity in marketing. To their understanding, machine learning is less of a decision-making tool and more of a robotic overlord — literally, a tool that makes the decisions.

The workflow goes:

  1. Machine says jump.
  2. Hapless marketer jumps.
  3. Profit.

Second are those who won’t go as far as to say AI will be the death of marketing, but take a lot of care to remind us that AI should be used in moderation.

As if we needed telling. As if the power of machine learning might corrupt even the most virtuous marketer into forgetting that there’s an art to sales and marketing.

#Digicon2018 saw a lot of these types: visionaries who single-handedly prevented the decay of the marketing profession by reminding people that it takes emotion to sell a product. To their credit, it’s hard enough to become a Captain of Industry — but these people achieved the rank of Captain Obvious along the way.

As a result, the landscape for discussing the future of marketing is full of doom and duh. Half the room is shouting “No hope!” and the other half is begging to be told, “No shit!”

For marketers, this all poses a major problem: the idea that firms should take it slow and approach AI with caution.

Working in a country where SEO and data-driven marketing have only just begun to gain traction, I can say definitively that caution is the least of our problems.

We aren’t flying too close to the sun — on the contrary, we’ve barely taken off. If you look at the global landscape, we’re falling far behind other countries in terms of what we can (and should) accomplish. While comparable businesses in technologically progressive countries are securing their advantages for 2020, we’re stuck debating the problems of 2010.

Overpopulation on Mars

Don’t get me wrong, it pays to be cautious when developing new technology. There are ethical concerns and externalities to consider, but that’s all far beyond the scope of the “Creativity vs AI” debate.

The thought leaders worrying about AI undermining creativity need to take a look at yesterday’s news: creativity is expected to be in high demand in 2020 (right behind critical thinking and complex problem solving), as per the World Economic Forum.

This can’t be right. The algorithm said to use our feelings.

The voices of professional reason lecturing us on the importance of creativity in the face of data should take a page from any of the dozens of disruptors who’ve made compelling — and yes, creative — ads and products through the strategic use of data.

Put simply, the rest of the world is so far past the problems dreamed up by today’s Clifford Stolls that the hesitation to upskill is doing us more harm than good.

Our legacy industries are suffering from a wide competitive gap, and it’s only a matter of time before stubborn marketing agencies lose business to younger, more agile firms who know how to get with the times.

It’s a sad story, told into being by people who are either afraid of change, or comfortable being the biggest fish in a stagnating pond.

I’d say that the debate between creativity and AI is moot and academic, but that would be insulting to academics. They’re worrying about overpopulation on Mars when we’ve barely made it past the moon, so to speak. The ultimate irony is, the creatives most afraid of being left behind by AI are those who lack the creativity to use it.

Failure to launch?

Try failure to think.

***

If you think learning is a better use of time than shouting at imaginary problems, why not sign up for our Masterclass? We’ll teach you how to solve problems instead of complaining about them.

Building Better Analytics Teams

What makes a good analytics team?

I have had a number of opportunities to build and manage teams of domain experts in my career, and true to form — I have noticed some common patterns.

Rather than a “how-to”, I instead want to highlight the traits and scenarios I have seen make some teams succeed better than others.

The Better Teams Define Their Success Parameters

A touchy subject and oftentimes a source of criticism is the purpose of the analytics team. I have built teams within corporations, and a common refrain is: “avoid science experiments, give us actionable results.” Although the advice sounds like sensible corporate-speak, what I found is that the pendulum actually swings in both directions. You need an intersection of at least four things:

  • Analytic purpose: is it targeted business insights, or data-driven discovery?
  • Team alignment: are the team’s composition and objectives aligned to that purpose?
  • Business maturity: what is the most appropriate capability your company requires at the moment?
  • Execution: is the team able to deliver on its mandate?

The problem arises when one or more of these areas fall out of sync with the rest, which leads to sloppy results, a lack of focus, and worse: management growing disillusioned with the value of analytics and backsliding into gut-feel decision-making.

Define The Box, To Look Out Of The Box

In some cases you will encounter teams that segment themselves by analytical process: e.g. the data quality team, the reporting team, the analysis team, the forecasting team, the modeling team, and so on.

Analytics is actually the aggregation of all these processes. While there is occasional argument for and against this “commoditized” approach, the more successful teams are able to holistically integrate various parts of the analytical process to drive outcomes, regardless of their place in the analytical process. In some cases it does help to have distinct teams handling different processes, but each team should be able to contribute and interact with other processes: i.e. the “reporters” can help drive analysis, the “analysts” help police data quality, the “forecasters” refer to reporting for drivers of their models, etc.

Some of the most interesting expressions I have heard uttered in the workplace are multiple versions of: “I don’t write commentary, I just produce reports” and “I just comment on the numbers, go ask the reporting team why the number looks funny.”

Quite tragic.

The Better Teams Hire Character, Then Train Skill

We are probably at the height of the trend (or hype) in “analytical occupations”. Roles like Data Scientist, Data Engineer, Chief Data Officer, and Chief Digital Officer are now the hot jobs in recruitment, and most job descriptions for these roles put a high emphasis on the technical competencies and skillsets required of candidates. After going through repeated recruitment rounds, hundreds of CVs, and countless interviews, I have found that successful analysts are defined not so much by skill as by their attitudes and values.

A baseline of technical capacity is certainly required to handle the analytical tools and make sense of trends and patterns in the data, but the stellar analysts also possess the drive and ambition to see projects through, the courage to point out anomalies or insights, and the initiative to take ownership of their data (warts and all) to drive business outcomes. Analysts who are naturally collaborative, willing to share both effort and credit on tasks, also end up achieving more than the solo jockeys, regardless of brilliance.

There’s also the self-deprecating attitude: despite the body of knowledge they usually possess, all the high-performing analysts I’ve met are humble enough to admit they don’t know everything, and this sets them on a path of curiosity and lifelong learning. They are voracious consumers of information, new trends, and technology, and they happily interact with other analysts to swap techniques. Contrast this with the occasional “big-wig” complex of experts and academics who let ego get in their way, some of whom I’ve seen crash and burn miserably in the workplace.

Image: an executive leader. Leaders create an encouraging but intense environment that requires people’s best work.

An Expert Leader Is First And Foremost A Leader

A follow-up point to hiring is the characteristic of a good analytical leader — and the typical assumption is that the big data boss needs to be the smartest, most-academically-decorated genius in the room. Technical skill and academic qualification tend to help with tackling analytical scenarios, but successful analytical leaders are still distinguished by their soft skills. The best characteristics (including those I have learned myself, the hard way) I think can be summarized by the Multipliers framework by the Wiseman Group — because they are true for analytics leaders:

Good Analytical Leaders (Multipliers):

  • Attract talented people and maximize them
  • Create an encouraging but intense environment that requires people’s best work
  • Define success and areas for people to stretch themselves
  • Drive sound decisions through rigorous debate
  • Give people ownership of results and invest in their development

Bad Analytical Leaders (Diminishers):

  • Hoard resources, underutilize talent, and lose it to boredom
  • Suppress people’s creativity and thinking
  • Must always be the smartest person in the room
  • Make decisions centrally and disempower others
  • Micromanage everyone

The Better Teams Encourage the Qualitative As Much As the Quantitative

Analytics roles generally attract individuals from quantitative disciplines (computer science, economics, statistics, mathematics), so the ability to crunch numbers and formulae will rarely be in short supply. However, especially in a work environment, analysts are not just called upon to process data but also to present it in meaningful form to peers and management. This requires a set of disciplines closer to the arts and humanities: public speaking, 2D and 3D visualization, color and representation, and written communication, all of which usually feel counterintuitive to the quants.

In forming analytics teams, a healthy balance between science and the humanities (reminiscent of the Apple philosophy) is needed, achieved through deliberate recruitment of mixed backgrounds and complementary training to fill out the gaps for each analyst. The successful teams I’ve handled had good mixes of Statistics and Accounting, Mathematics and Finance, Economics, Business Management, and Computer Science.

The Better Teams Never Let Technology Define Capability

Depending on where they were minted, many analysts tend to be rabidly loyal to a particular tool, software package, or framework. We have all seen the nth iteration of the “R vs. SAS vs. SPSS”, “SQL vs. NoSQL”, or “RDBMS vs. Hadoop” debates, but in my experience technology is rarely a differentiator of results. Granted, some forms of analysis are easier to perform on certain platforms than others, but results come from teams that can formulate a business question or problem in terms that the technology at hand, whatever it happens to be, can interpret, process, and consequently resolve.

When I moved from being a business user into IT consulting, I was appalled at how often the teams and companies I encountered had acquired tools they rarely used, or used incorrectly. Since most commercially available analytical software and hardware do not come cheap, this can become an invisible sinkhole of profits for many organizations. In these cases it is the analytics teams, more than IT, who should determine how their current or desired technology toolkit applies to their business environment.

Image: stages of analytics maturity. Executive commitment to analytics drives maturity. Source: Davenport and Harris.

The Better Teams Recognize And Manage The Politics Of Information

Over the years I have remained in contact with many former colleagues from the teams I’ve set up, and I am always humbled to hear about their present successes and how much of their learning they trace back to those formative years we spent working together. What I’ve found is that the key to longevity in an analytics career is the deceptively simple recognition of the saying: information is power.

Careers, companies, and strategies have been built and destroyed on the back of information. Oddly, most analytical teams prefer to be “apolitical” or “amoral” about information; they tend to dwell on the “how” rather than the “why”. I think this is both a costly mistake and a lost opportunity. Analytics has the ability to transform organizations, and analytics teams should take ownership of that power and use it, while remaining conscious of just how much power is in their hands.

I am not suggesting that analytics teams become active and malicious power brokers (some do), but, going back to the first Multiplier trait above: in defining the parameters of success, they should become trusted advisors to all parties in the organization.

Final Thought: The Best Analytical Teams Cease To Exist

According to Davenport and Harris’ stages of analytical maturity, true maturity in analytics happens when analytical functions no longer exist as a centralized capability but become a natural part of every organizational function. This implies that a truly successful analytical team eventually ceases to exist independently; in effect, every team in the organization becomes an analytical team.

Or to put it another way, the truly successful teams in any organization are the ones that become truly analytical.

I like the sound of that.