Tag Archives: big data

Hacking Critical Infrastructure Explained: Common Attacks and Good Defense

Cyber security attacks on industrial control systems have increased in number and complexity in the last few years. Attacks like Stuxnet and Night Dragon have proven that cyber attacks can have significant negative business impacts and can potentially even cause loss of life.

At the upcoming Cyber Security for Oil & Gas Canada event, Chris Shipp, the Chief Information Security Officer for Fluor Federal Petroleum Operation, will provide an overview of current successful cyber attacks and threats, followed by a live hacking demonstration of a control system, and conclude with a case study demonstrating important security components in control systems. This interview provides a sneak peek of what’s to come at the event.

* The thoughts expressed here are solely the opinions of Chris Shipp and not necessarily that of the Strategic Petroleum Reserve.

HH: Walk us through the journey that led to you being known as the utmost “hacking” expert in the industry.

CS: As Chief Information Security Officer for Fluor Federal Petroleum Operation, I have served as the chief cyber security representative at the U.S. Department of Energy’s Strategic Petroleum Reserve for about 14 years.

When I started in October 2000, I was the only person allocated to cyber security. I first got into cyber security via a rather circuitous route, because there wasn’t a direct route to become proficient in it. Earlier in my career, I worked at a professional services company where we provided maintenance and professional services related to IT systems, implementation of networks and applications, and so on for various commercial and federal customers. I have also taught at Tulane University, and during those years one of the things I noticed, with respect to the curriculum I was teaching and the systems I was implementing, was that there wasn’t a lot of information directly about cyber security. It began to concern me. I had a few very forward-thinking customers who asked about security and how their data was protected, so I began to do a lot of my own research, and through that research I became something of a cyber security expert. I’m not really comfortable with that term, because I’m still learning and it’s an ever-evolving process, but I became very proficient in cyber security, which is something not a lot of other people at that time were able to achieve, simply because there weren’t a lot of avenues or requests for it.

I began to incorporate more security information into the curriculum I was teaching at the local university as well as talking more with my customers about things that we could do to secure their data. Through these conversations, I was put into contact with the Department of Energy because they were looking for a Chief Cyber Security guy. I’ve had the wonderful opportunity to work through a cybersecurity program that really was in its infancy, its adolescence, as most programs were back then, since there weren’t as many federal regulations or they were just starting to be implemented. We were able to build that program all the way through and build a risk management program for both classified and unclassified systems; systems that have to do with standard business operations and control systems.

Although you don’t like to call yourself an expert and instead refer to yourself as a student of your industry, I appreciate that you know both sides of the coin, having worked with both government and industry cyber security practices. Would you provide a brief overview of the cybersecurity discrepancies between government and business in terms of how they are able to prevent, protect and manage security threats? What can government agencies learn from industry, and vice versa?

First let me say that I would divide industry into two separate camps. One would be commercial entities that don’t yet have a lot of federal or local regulatory requirements specifically related to cyber security. The other camp would be those that do: entities covered by NERC CIP, which regulates cyber security in the energy sector; by HIPAA, which has cyber security elements related to health information; or by the Gramm-Leach-Bliley Act, which applies to financial entities. Those entities have some very strict regulatory environments that correspond pretty well to typical federal government agency requirements.

So I would put the commercial entities that don’t yet have regulatory requirements in their own separate camp, and group the commercial entities that do have regulatory requirements together with typical federal government entities.

There are two common issues that they have. The first and foremost would be that cyber security personnel typically do not have a strong business background, so they lack the capability or the proper business knowledge to present cyber security proposals in a way business decision makers understand and approve of. For example, if I were to tell a decision maker, someone who holds control of a particular budget, “Hey, I need a new intrusion detection system because it does a better job of detecting and stopping the bad guys,” what does that really mean from a business perspective? I haven’t expressed it in terms a business-minded person would understand.

Conversely, I would be more successful if I had said, “Look, we have this very important system, and we all agree it is important, but within the last three months we have seen increasingly complex attacks against that system that are coming closer and closer to being successful. Based on that, we predict with reasonable probability that the system will be successfully hacked within the next 90-120 days if we don’t act. This particular intrusion detection system will cost you $100,000; if that system were hacked and down for a day, it would cost $1.2 million.”
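The back-of-the-envelope math behind that pitch can be sketched in a few lines. This is purely illustrative: the dollar figures are the ones quoted in the interview, and the breach probability is an assumed number, not real Strategic Petroleum Reserve data.

```python
# Illustrative risk math for the pitch above. The only figures taken
# from the interview are the $100k control cost and the $1.2M outage
# cost; the breach probability is an assumption for the sketch.

control_cost = 100_000            # one-time cost of the intrusion detection system
outage_cost_per_day = 1_200_000   # estimated business loss if the system is down a day
breach_probability = 0.8          # assumed likelihood of a successful hack in 90-120 days

expected_loss = breach_probability * outage_cost_per_day
net_benefit = expected_loss - control_cost

print(f"Expected loss without the control: ${expected_loss:,.0f}")
print(f"Net benefit of buying the control:  ${net_benefit:,.0f}")
```

Expressed this way, the proposal becomes a comparison a budget holder can act on: a $100,000 spend against a much larger expected loss.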

That is something a business decision maker can understand. When I talk to many different people in the cyber security realm, I often find somebody who is technically astute, a brilliant person, who ends up running a cyber security program. They’re very good at what they do, but they don’t necessarily have the business acumen to express the need in that way.

That’s a good point, and let’s jump in to the hiring process of these operators. As the saying goes, a business is only as good as its workers, especially for oil and gas companies. What best practices could you provide for the hiring and maintenance of cyber security staff so that they are fully prepared to prevent and address threats?

That’s a great question. In fact, one of the two main issues that I find in organizations is that they simply don’t have people with the know-how at the technical level to properly implement and maintain cyber security. It’s a difficult problem to solve, because cyber security is a relatively new discipline; it’s not as mature, so there are not as many people who have the necessary skill set.

Cyber security is a skill set akin to a specialty in the medical field. For example, you would never try to train somebody to be a cardiologist before they became an MD. First you learn how to become a good doctor, and then you use that as a baseline when you go on to learn how to be a good cardiologist. The same thing is true with cybersecurity. You can’t learn cybersecurity in a vacuum. A good cybersecurity technical person has to build their security knowledge on some other discipline that they know well. For example, they may be a person who’s worked with network infrastructure switches; they have a very good working knowledge of those systems and can build on that knowledge to secure those environments. Perhaps they are an application developer; they can build on that knowledge and become very good at application security. So one difficulty is that it’s not a discipline where you can take the smartest person from scratch and teach them fairly easily.

The other difficulty is that many colleges do not provide good cybersecurity programs. Often colleges are teaching somewhat antiquated computer science. Not always, but that’s often the case, because computer science changes frequently, unlike some other disciplines. The good news is that the National Security Agency (NSA) has established the National Centers of Academic Excellence. Colleges can apply for this program, which incorporates cyber security elements, as defined by the NSA, into their curriculum. If a student goes through the program and learns those elements, then they have a good working knowledge of cyber security on which to build. Personally, I’ve been able to leverage that avenue to hire several top-notch cyber security people who didn’t require a tremendous amount of education and training to bring them up to speed.

Will you share with us the risk management strategy for the cyber security program at the Department of Energy Strategic Petroleum Reserve? How has vulnerability management, continuous monitoring and incident response evolved during your tenure and where do you see areas for growth?

I want to make sure I don’t share information that is sensitive, so I will talk about the program that is provided by NIST, which we make heavy use of with respect to risk management.

NIST 800-37 is a wonderful document, and let me point out, too, that NIST standards are paid for and developed by the government and available freely to anyone. There’s a six-step process involved in what NIST calls its risk management framework. It starts with a business process: categorizing information systems. For example, in the oil and gas industry you may be talking about a business system or a control system. What does that control system really do for you, and how important is it to your mission? That is a business determination, not an IT or cybersecurity determination, so business decision makers must be involved in that process. Based on the determination, you follow a process where you select the appropriate security controls to apply to that system. You implement those controls, and then once they’re set up you continually assess their value and monitor them. It’s a circular process.
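The circular shape of the process can be sketched as a loop over the six RMF steps. The step names are from NIST SP 800-37; the `run_rmf` helper and the system name are illustrative, not part of the standard.

```python
# A minimal sketch of the circular NIST SP 800-37 risk management
# framework described above. The six step names come from the
# standard; everything else here is illustrative.
from itertools import cycle

RMF_STEPS = [
    "Categorize the information system",   # a business decision, not an IT one
    "Select appropriate security controls",
    "Implement the controls",
    "Assess how well the controls work",
    "Authorize the system to operate",
    "Monitor the controls continuously",
]

def run_rmf(system_name: str, iterations: int) -> list:
    """Walk the circular RMF process: after Monitor, it wraps back to Categorize."""
    steps = cycle(RMF_STEPS)          # cycle() models the "circular process"
    return [f"{system_name}: {next(steps)}" for _ in range(iterations)]

for line in run_rmf("pipeline control system", 7):
    print(line)
```

Running seven iterations makes the point of the interview concrete: the seventh step is "Categorize" again, because the process never reaches a final, static state.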

Sometimes there’s a misperception among those outside the IT industry that once you develop a system and put it into place, it’s fairly static. Nothing could be further from the truth. Even if you have no new projects and just say, ‘we’re running as is,’ you have many, many software and application updates that will change as new risks emerge and updates come out. It’s a continual process: determine how important this system is to the business, and therefore how much time, money and effort we’re going to spend to protect it. That informs the selection of the appropriate security controls, how they’re implemented in the environment and how well they continually operate. That’s the process of boots-on-the-ground implementation and the procedural and technical goals we use for cyber security.

The next question would be how to maintain the system from the perspective of the business. It’s done by storytelling. When I tell people this, they laugh at me because they think it means I’m telling something that’s an untruth, but what I mean is when you hire someone who has a strong background and understanding of cybersecurity and they also have some business acumen then they need to tell a true and accurate story to business decision makers: “This is the risk to the system, this is how we’re mitigating that risk and this is the residual risk.”

I know I’ve emphasized this before, but I see the same gap again and again. Business decision makers are in business to make money, or in the case of a federal government entity, they have a tight budget, so you have to explain to them the benefit of that spend. You have to tell a story in business terms that helps them understand why they should spend the additional dollars, why they should hire the additional personnel or allocate the additional resources to do the additional tasks you’re defining.

Science. Art. Design. New York Times’ Jer Thorp on Data Analytics


These are the three elements that are the foundation to exciting data visualization. Design isn’t just about polishing up data. Look no further than Apple, Inc., to see how foundational design can be in an organization.

On the other hand, data scientists must also align analysis with art. Artists ask the important questions business executives may not have considered. Plus, data artists excel at breaking down boundaries in creativity. No one understands this better than Jer Thorp, Co-Founder of the Office of Creative Research and Former Data Artist in Residence at The New York Times.

HH: What does it mean to be a data artist?

JT: There are two rough reasons why people call themselves a data artist. One group is people doing things outside what we usually think of as data science or statistics, perhaps combining design to do data visualization, and then there’s a second group of people who work with data as part of a larger art practice. I think I straddle both sides of that border. I run the Office of Creative Research, where we do a lot of data-focused R&D work for companies like Microsoft, Samsung and Intel. We do a lot of design work. We build a lot of data visualization tools to try to solve really weird and interesting data problems. The other side of it is our art practice, where we often build physical sculptures installed in physical spaces, museums and galleries. We just finished a long artist residency at the Museum of Modern Art in Manhattan, and we’re just starting a gigantic project to install a piece in the Boston Public Library. So for us – and for me – the term data art is really wide-ranging.

I interviewed Jer Thorp, co-Founder of the Office of Creative Research and the former Data Artist-in-Residence at The New York Times

Let’s talk about the “weird and interesting” part of data. How can the human side of data lead to innovation and effective change within an organization?

I use the term “humanizing data” a lot. In some ways it’s kind of a given. Data doesn’t exist without humans, and if we think of data as the measurement of something, that act of measurement is by default a human measurement. We have machines doing the measurement as a proxy for us, but at the root of it, data is really a human thing. We’re producing it. I think where it really gets problematic is when we’re talking about data that is a measurement of humans. Even something as simple as location data or survey data that comes from customers, or whatever the case may be, becomes ethically interesting and ethically complicated, because we do need to consider the humans behind the systems from which that data is being generated. In a business sense, I think this is a challenge and an opportunity.

It’s a challenge because we can do things with this data very easily. There’s no permission form. I don’t even have to talk to these people. I can just use their data and away I go on my merry way. The problem with that is that if I cross a boundary that is either uncomfortable or negative in some way towards these people then I can breach their trust. That is a thing you definitely don’t want to be doing as a business because it can lead to a situation in which your consumer base loses trust in you. That’s why I also say data presents an opportunity because I think there are precious few companies right now who are seeing this as a chance to set themselves out from the crowd and say, ‘hey, unlike all these other organizations out there, we’re going to be fair with your data, we’re going to be transparent with your data and we’re going to use your data in a way that will make sure that you still trust us.’

I think we’ve been lucky in the last 85 years because we’ve been able to get away with a lot of things with consumer data without consumers being aware of it, and now I think that’s changing. And so these companies – hopefully there will be more and more who put the right foot forward and say “we want to be an ethical company” – are going to have a tremendous advantage.

Could you provide an overview or example of the business application of creative data-focused research?

Research is the most important word. It really comes down to innovation at a true level. I fundamentally believe that you can’t have innovation without a certain amount of risk and without a certain amount of limitation. So, for lack of a better term, creative data exploration – or data art – provides an avenue to experiment and to try new things that otherwise wouldn’t be tried in the everyday course of data analysis or data science.

I believe there’s huge value in trying things. I have a phrase I use to describe our work: “question farming.” In a lot of cases what we’re doing is confirming suspicions we already had, or trying to find answers. We hear a lot about using data to find answers. But I think it’s just as important to use data to find questions. There are a lot of questions that we don’t even know how to ask yet. It’s those questions that lead to true innovation, when you get to say, “hey, we never thought about it this way, but what would happen if we try this new thing?” And those questions are not just going to come to you while you’re sitting in your bathtub. You need to try new things, you need to prototype and you need to take risks.

It’s not a particular surprise that if we look at the history of successful companies over the last 100 years, largely those are companies with strong research and development groups. A lot of the companies we talk to have trepidation about investing in R&D because there’s so much risk involved. But you have to take a chance that it isn’t going to work. You have to be willing to understand it’s a risky investment.

Many analysts have to prove the quality and integrity of their data to high-level executives. But they’re working in such a siloed structure that it’s a struggle to present the data in a way that is meaningful and readable.

What are your tips for bridging that gap and securing executive buy-in?

First of all, data and integrity go together hand-in-hand. With the data that we work with, even though it skews to a more creative axis, data integrity is fundamental to our work. We are always very careful to make sure that the data that we’re using is sound; and that we’re representing it in the right way, that we’re aware of its biases, that we’re aware of its errors, that we’re aware of its missing data and so on, and so on. Part of the answer to the question is to be honest about those types of things. There’s no such thing as perfect data. One of the things that I always found that instills a little bit of trust in data visualization or a data presentation is if there’s some honesty about those types of issues.


I have a principle I rely on when I’m doing data visualization, which I call the “Ooo-Ahh” principle. It means that a good data visualization should do two things at the same time: first, it should capture people’s attention. That’s the “Ooo” moment. Second, it should teach somebody something, which is the “Ahh” moment. When you’re presenting data to the CEO or whomever your stakeholder is, there’s a balance that has to be achieved, because you want to invest enough in the “Ooo” that they’re not just going to skip over the figure or chart that is the “Ahh” moment. You want to show them something that’s going to be engaging. But you can’t do that at the sacrifice of the “Ahh.”

The reason why there tends to be conflict between data and aesthetic is that the mistake is to sacrifice clarity for aesthetic. That doesn’t have to happen. You can have your cake and eat it, too. So, we can have a data visualization that carries all the information that we want, but it adds some visual flavor and design treatment, which makes it so that it’s more memorable, it’s more attractive, it’s more readable.

"I use the term 'humanizing data' a lot. In some ways it’s kind of a given. Data doesn’t exist without humans and if we think of data as the measurement of something, that act of measurement is by default a human measurement. We have machines that are doing the measurement as a proxy for us, but at the root of it, data is really a human thing. " - Jer Thorp

In previous talks, you discuss the lack of dialogue between three elements of data: science, art and design. Can you tell us a little bit about how organizations can address this issue?

Multidisciplinary data is something I truly believe in. Envision those three circles in a Venn diagram: the more overlap there is, the more productive the result is going to be. Starting with the boundary between data science and design: I was having a conversation with a large organization the other day who were talking about how great it was that they had just finished a project, it was all working, and then they brought it to their design team and were amazed by what the design team was able to do in a couple of hours to make it better. I turned around and said, “If you think that’s good, imagine if you had decided to work with the design team from Day One.” What if the project had had design as one of its elements from the start? You probably would have ended up with something far better than you actually did. One of the misconceptions that people from the data world have about design is that design is just about making things look good. Design is a lot more than that. Design is a way of thinking, and bringing designers into the process early makes the results better and better. We just have to look at Apple to understand how foundational design can be in an organization. One of the reasons they’re so successful is that design is baked into the organization from the ground up.

On the other end of the Venn diagram is the art. I’ve been a huge advocate of recommending that companies do artist residencies, which I think are an incredible opportunity for everybody involved. Bring an artist in for six months, set them up at a desk, and they will work with your data and your employees and make something incredible. One of the things that artists are really good at is asking questions you may not have thought of, and they’re also good at breaking down boundaries in creativity.

This is a thing with a deep history, a history that really works; it’s not just something to do for fun. I’ve been spending a lot of time over the past year with a woman named Lillian Schwartz. She was an artist in residence at Bell Labs for decades, from the 1960s into the early 1990s, and a lot of the transformative things that came out of Bell Labs at the time were pieces that Lillian had assisted in, starting with asking questions and also doing some real work on those projects as well.

Jer, it’s very clear that you love what you do, or at least you very much enjoy it. But, do you ever struggle or suffer from data fatigue? What kind of tips would you provide analysts who are struggling with this and maybe aren’t able to discover new questions or answers to the data that they’re processing and reading?

There are a couple of answers to this question. My advice, for individuals working with data and for executives leading teams of people who work with data, is to find ways to continually make the data fun. It might mean that every second Friday of the month there’s a Hack Day, where you bring in a totally new data set that isn’t related to a current project. It could be one of your teammates’ location data. It could be something pulled from the Internet, or your email history from the last two years. The idea is to get into data that exercises muscles that have been lying dormant for a little bit.

At the office, we’re always trying to find these small tasks and data sets to use as small experiments to get our minds off the “real job.” That has been really effective for us. It’s important to remember that data analysis isn’t all about sitting in front of your computer. Any data set has grounding in the real world. There are resources you can read and things you can watch to become more informed about where the data came from, and that will allow you to do a better job of visualizing it or analyzing it or whatever it is you might be doing.

I should take a moment to shout out one of my best friends, who runs DataKind, which takes really great data scientists and pairs them with non-governmental organizations that have data problems. DataKind will plan a Hack Day where they bring in data – maybe from a cancer organization, or a company doing irrigation work in Africa that is in need of data scientists to read the data for them but doesn’t have the money to pay them. It’s a nice way to tackle a really hard problem, promote team building and also do some good.

Managing Outbound Control on the U.S. & Mexican Border

The illegal exportation of weapons, ammunition, technology and people from the U.S. is discussed in this interview with John Woods, Assistant Director at U.S. Immigration & Customs Enforcement. We also examine the limited controls in place and how that limits effectiveness. He also examines the infrastructure in place for inbound people and goods, and how outbound exports are managed without comparable infrastructure.

What new technology and surveillance equipment developed for overseas conflict can be used to enforce border security at home (for example, the automated tracking device initially meant to find roadside bombs, which can now be used to track down illegal border crossers)?

A tracking device initially meant for roadside bombs can be used to identify illegal border crossers. That kind of technology is good for organizations like CBP in identifying and securing the border.

We in HSI look at the border a little differently. We look at it as an investigation. We look at the vulnerabilities at the border and establish and identify those transnational criminal organizations that use the border as a way to illicitly move their goods.

That being said, we look at technology such as Big Data and how we can utilize it: looking at declarations and inventories of things that are believed to be in the country, and looking for anomalies that would identify packages, freight or some sort of trend indicating illicit movement of goods or strategic technologies, so that we can identify those people and target them for investigation or for outbound inspection. So we look at new technology a little differently; we focus more on examining the Big Data.

Could you provide an example in which this was successful?

Take, for example, combining data from multiple databases into one analytical support program, and then running algorithms that look for the anomalies we would flag for targeting. Another example: we identified several packages being shipped under a false company out of Miami, going to South America. We determined that they carried weapons and were able to stop the flow of those weapons through this process.
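The general pattern described here, merging declarations into one dataset and flagging statistical outliers for inspection, can be sketched in a few lines. Everything in this example is invented for illustration: the field names, the toy records and the z-score threshold are assumptions, not details of any ICE or CBP system.

```python
# Hypothetical sketch of anomaly-based targeting: merge declaration
# records into one dataset, then flag records whose declared value is
# a statistical outlier. All data and thresholds are invented.
import statistics

# Toy records, imagined as already merged from several declaration databases.
shipments = [
    {"id": "A1", "declared_value": 1_200},
    {"id": "A2", "declared_value": 1_100},
    {"id": "A3", "declared_value": 1_300},
    {"id": "A4", "declared_value": 1_250},
    {"id": "A5", "declared_value": 1_150},
    {"id": "B7", "declared_value": 150},   # declared value far below its peers
]

def flag_anomalies(records, z_threshold=1.5):
    """Return the IDs of records whose declared value is a z-score outlier."""
    values = [r["declared_value"] for r in records]
    mean = statistics.mean(values)
    stdev = statistics.stdev(values)
    return [r["id"] for r in records
            if abs(r["declared_value"] - mean) / stdev > z_threshold]

print(flag_anomalies(shipments))  # flags only "B7"
```

A real targeting system would use far richer features and models, but the shape is the same: with too many containers to open, the job is to rank which few merit physical inspection.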

With the volume of commerce that goes in and out of the United States, we don’t have the resources to open every container and express package. So you have to be able to find out which ones you want to target, and then target them successfully.

Jumping ahead a little bit: since outbound control is your expertise, what would be the weakest link in U.S. outbound control currently?

Right now the problem in outbound control is mostly with people at the land border. We recently entered into an agreement with Canada at the land border where their entry system serves as our exit system. As they identify people and admit them into Canada, that data is shared with us so we can confirm that the individual has left the United States. Unfortunately, Mexican border control is not set up in a similar fashion to Canada’s or ours, so we can’t use that land border data as an exit system.

So it’s very difficult right now, and the weakest link is probably identifying people who leave the United States, so we can determine that they’ve left on time and avoid searching for people who may have overstayed their visa but have in fact already left through the land border with Mexico.

Is there a way you could use the relationship you have with Canada in Mexico? Is that something you’re looking at?

Unfortunately not, because of the way Mexican immigration is set up. Their checkpoints are further inland than Canada’s, so the reliability of the data they collect wouldn’t be good. They would have to build infrastructure at their ports of entry, the same way we have at ours, and that would cost lots and lots of money.

That being said, we do look at other technologies like license plate readers. The CBP is developing technologies that can be used at both the airport environment and the land border environment to identify people who leave the United States.

Border security, especially in Mexico, is a politically hot-button issue. So how have you been able to navigate the politics of what you do? Or is that not something that you face day-to-day?

Well, it’s something I face day-to-day, because I’m here in Washington, so I deal with the political ramifications of these issues. I have to go before Congress and discuss them. And you’re right; it is a hot-button issue.

I’ve been in this game for 27 years, and it’s been a very political issue for all 27 years. I came in when they first established the Immigration Reform and Control Act of 1986, and they were going to stop the flow of illegal aliens into the country by getting rid of the magnet, which was employment. We were going to have employers verify people. Has that stopped the flow? No. Have other enactments such as terrorism acts, stopped the flow of aliens? No. Because the magnet is still here, this is still the best country in the world, and a lot of people want to come here and live here and make their lives better.

That being said, we did take an oath to enforce the laws, and one of the laws is that you should not come here illegally. So it comes down to the Border Patrol and our investigative abilities. That’s where we get the best bang for our buck: going after the smuggling organizations that facilitate illegal entries and stopping the flow that way. We use various technologies, such as metal chain-link fences or an electronic fence with sensors in the ground, to identify illegal crossers and better use our resources to apprehend them and stop them from entering the United States.

So what is the biggest threat to the U.S.? Is it economic, or the loss of intellectual property through the illegal exportation of our technology?

That, to me, is a big threat. I mean, I oversee the export enforcement role here in HSI, and I feel that our strategic technologies being either (a) stolen or (b) simply purchased and illegally exported without a license is a very serious threat to our national security.

We have advanced technologies that make us a great nation and protect us from our enemies, and allowing any of those materials to fall into enemy hands defeats our ability to keep the upper hand. So we need to ensure that technologies that are licensable and eligible for export go only to the right hands, which would be our friends and the people we want to trade with. We want to make sure those items are not used against us. So it is a huge threat.


The ROI of Big Data for Marketers

Chief Marketing Officers know the benefits of Big Data. Oftentimes what they don't know is how to use it. David Rogers and Don Sexton at the Columbia Business School wanted to gain a better understanding of the changing practices among large corporate marketers. What they found was support for the use of new data to drive marketing decisions and measure ROI, along with widespread adoption of new digital tools.

Still, significant gaps exist between conception and execution when it comes to Big Data Marketing efforts and there remains a need to improve on the use of data, the measurement of digital marketing and the assessment of ROI.

91% of senior corporate marketers believe that successful brands use customer data to drive marketing decisions

Yet, 39% say their own company’s data is collected too infrequently or not in true real-time

A lack of sharing customer data within their own organization is a barrier to effectively measuring marketing ROI, according to 51% of respondents

Around 85% of large corporations maintain brand accounts on social networks such as Facebook, Twitter, Google+ and Foursquare

Comparison of the effectiveness of marketing across different digital media is “a major challenge” for 65% of marketers

Financial outcomes were omitted by 37% of respondents when asked to define what “marketing ROI” meant for their own organization

57% of respondents are not basing their marketing budgets on any ROI analysis

Brand awareness is the sole measure to evaluate marketing spend for 22% of marketers
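Given that 37% of respondents left financial outcomes out of their definition of "marketing ROI," it is worth being concrete about the simplest financial form of the metric. The figures below are hypothetical, purely for illustration:

```python
def marketing_roi(attributed_revenue, marketing_cost):
    """Return marketing ROI as a fraction of spend (0.5 == 50% return).

    ROI = (revenue attributable to marketing - marketing cost) / marketing cost
    """
    return (attributed_revenue - marketing_cost) / marketing_cost

campaign_revenue = 150_000  # hypothetical revenue attributed to the campaign
campaign_cost = 100_000     # hypothetical total campaign spend
print(f"ROI: {marketing_roi(campaign_revenue, campaign_cost):.0%}")  # prints ROI: 50%
```

The hard part in practice is not the arithmetic but the attribution: deciding how much revenue the campaign actually drove, which is exactly the measurement gap the study highlights.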

Source: Marketing ROI in the Era of Big Data: The 2012 BRITE/NYAMA Marketing Transition Study

The 5 Most Popular Types of B2B Content Marketing

Content marketing has grown beyond a cottage industry into a necessary marketing tool. As it gains momentum, it's more important than ever to get back to the basics of developing a successful content strategy built around remarkable content.

Business-to-business marketers are perhaps the most fluent content marketers and content strategists. “On average, B2B marketers employ eight different content marketing tactics to achieve their marketing goals,” according to Eloqua and the Content Marketing Institute.

What are the most popular? We’ve detailed five of them below.


1. Blogs

Thinking like a publisher is huge in content marketing, and blogs are the foundation of that approach. “It’s where new content gets distributed, conversations are hosted, context for news is provided and personal brands are born,” according to Eloqua.


2. eNewsletters

eNewsletters are still a popular mode of distributing content and should be a part of your strategy as a whole. At the top of the list of tips for running a successful eNewsletter is: do not spam. eNewsletters are “a permission-based means of recurring communication with current and prospective customers.” Always ask for permission to email participants and always offer opt-out links.


3. Whitepapers

Whitepapers have been around since the dawn of content. They are topical, lengthy reports that address technical issues or subjects that require intensive explanation. Even though they are the “grandfather” of content, they are still essential to establishing yourself as a thought leader. Just remember to send whitepapers in PDF format and consider including a lead capture form.


4. Videos

Smartphones and cheap video cameras, coupled with YouTube and Vimeo, help content marketers create, publish and share videos more easily than ever. Just remember not to film “talking heads.” Consider a long-term video series rather than a one-shot approach. When you do post your video, consider posting an accompanying transcription or report, CMI suggests.


5. Infographics

CMI says infographics are “visual storytelling told through data” and that they “rise above the noise to deliver data in a visually appealing way.” A good rule of thumb: good data = good infographic. Once you’ve completed your infographic, develop a marketing plan around distributing it.

The Top 5 Reasons Big Data Is Valuable to Your Business

Have you started thinking about how your company will value and leverage your big data assets? If not, it’s time to play some catch up.

Businesses across industries have welcomed big data analytics with open arms after seeing its benefits firsthand. As proof, the McKinsey Global Institute delves deep into the benefits of big data in its report, “Big data: The next frontier for innovation, competition and productivity.” What it found were five actionable reasons businesses need to jump into the practice with both feet. Here is what they determined:

1. Big Data Brings Improved Business Models, Products and Services

What’s with the flurry of excitement that accompanies each new generation of the iPad? The folks at Apple are pros at understanding what their customer needs – sometimes even before they do. Manufacturers now use data captured when consumers use their products to improve upon their existing offerings, thereby creating new and improved models that benefit the consumer and push them to buy.

2. Putting A Smile On the Face of Your Stakeholders

Improving transparency leads to improved quality of product and service. Big data can be made readily available to relevant stakeholders, which creates value by reducing search and processing time between departments, according to McKinsey. Big data keeps everyone in your department moving in the same direction.

3. Peek Into Personnel Performance

Upper management will be empowered by the collection of more accurate and detailed personnel performance data that can be reported in real or near-real time. Instantly find out your company’s turnover rate or its total number of personnel sick days, the report notes, and use that data to understand the root causes of performance issues.

4. Customize Your Customer Experience

You’ve been segmenting your customers for years, but now it’s time to microsegment them. Big data empowers organizations to tailor their products and services to meet the very specific needs of each customer. An example the report gives is tailoring applications on a smartphone based on the owner’s personality.
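The core move in microsegmentation is combining several customer attributes into one fine-grained segment key instead of bucketing on a single dimension. A minimal sketch, using hypothetical customer records and illustrative segment rules (none of these attributes or thresholds come from the McKinsey report):

```python
from collections import defaultdict

# Hypothetical customer records, purely for illustration.
customers = [
    {"id": 1, "age": 24, "channel": "mobile", "monthly_spend": 40},
    {"id": 2, "age": 31, "channel": "web",    "monthly_spend": 220},
    {"id": 3, "age": 27, "channel": "mobile", "monthly_spend": 180},
]

def microsegment(customer):
    """Combine age band, preferred channel and spend tier into one key."""
    spend_tier = "high" if customer["monthly_spend"] > 150 else "low"
    age_band = f"{customer['age'] // 10 * 10}s"
    return (age_band, customer["channel"], spend_tier)

# Group customer IDs by their microsegment.
segments = defaultdict(list)
for customer in customers:
    segments[microsegment(customer)].append(customer["id"])
```

Each resulting key, such as ("20s", "mobile", "high"), names a segment narrow enough to target with a tailored offer; real systems simply do this across many more attributes and far more data.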

5. Find the Algorithm Groove

According to the McKinsey Global Institute, “Sophisticated analytics can substantially improve decision making, minimize risks, and unearth valuable insights that would otherwise remain hidden.” They cite the following examples: tax agencies can flag candidates for further examination, and retailers can use algorithms to fine-tune inventories and pricing structures.
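To make the retail example concrete, a pricing rule can be as simple as marking down items whose stock cover exceeds a target. This stock-cover rule is an illustrative sketch of my own, not a method from the McKinsey report:

```python
def adjust_price(price, weeks_of_stock_on_hand, target_weeks=6):
    """Cut the price 10% when stock cover exceeds the target; otherwise keep it.

    'Weeks of stock on hand' is current inventory divided by weekly sales rate:
    a high value means the item is moving too slowly at its current price.
    """
    if weeks_of_stock_on_hand > target_weeks:
        return round(price * 0.9, 2)
    return price

print(adjust_price(20.0, 10))  # slow mover: marked down to 18.0
print(adjust_price(20.0, 4))   # fast mover: stays at 20.0
```

Production pricing engines layer demand forecasting and price elasticity on top of rules like this, but the decision loop, measure, compare to target, adjust, is the same.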

Now is the time to jump on the big data bandwagon.