The State of Minnesota Marketing: Insights from 6 Minnesota Brands

Minnesota is home to numerous nationally and internationally known brands, ranging from household consumer names like Target and Best Buy to giants like Cargill and UnitedHealthcare. Given their broad reach, their home state isn’t a singular marketing focus for these big companies. But what about companies where the state’s brand plays a part in the company’s brand? What impact does that have on marketing?

That was the topic of a brand panel at the recent Minnesota Marketing Summit in Minneapolis.

Moderated by Nicole Shannon, Executive Director of Advertising for the Star Tribune, the session opened to a standing-room-only crowd. Panelists were on hand from Minnesota brands including Explore Minnesota, Minnesota State, Sunrise Banks, Sun Country, Children’s Minnesota and the Minneapolis Downtown Council to discuss: What is the Minnesota brand, and how does it relate to the marketers and brands of companies operating here?

Takeaways from the panel about the relationship between “the Minnesota brand” and Minnesota brands covered everything from customer targeting to balancing in-state and out-of-state advertising. Of course, there was also an emphasis on Minnesota pride. Here are six nuggets of “Minnewisdom” that could be useful whether you’re marketing in the “Bold North” or in your own state.

#1 – Sometimes harder is better.

Leann Kispert, Director of Brand Marketing for Explore Minnesota Tourism, said that 70% of Explore Minnesota’s paid media has to go outside the state of Minnesota. With that advertising, they have to contend with outside perceptions of Minnesota, and those audiences can be harder to convert into visitors. But once converted, those visitors spend more money and often become brand advocates.

#2 – Creativity + Unified Message = Win.

Noelle Hawton, Chief Marketing and Communications Officer for Minnesota State, shared that the vast majority of the Minnesota audience and prospective students didn’t know what MnSCU was: the system of 37 state colleges and universities in Minnesota now named Minnesota State. To reach potential students, an illustrated poster map of Minnesota highlighting its features has resonated well, sharing information in an info-taining way. Promoting a unified message on behalf of the individual schools, while encouraging the schools to use that unified message in their own marketing, has also helped create a more effective message.

#3 – Building Minnesota pride builds business.

Kelsey Dodson-Smith, Vice President of Marketing for Sun Country Airlines, said their advertising is focused locally, since that is where their customers are. They also emphasized inclusive home-state pride by commissioning a local artist, Mark Herman, to create custom illustrations for planes named after Minnesota lakes as part of the #hometownlakesproject.

#4 – Build a great brand by doing good.

Becca Morris Hoeft, Chief Brand Officer for Sunrise Banks, talked about what it means as a business with B-Corp status to truly serve its customers. “As the urban core has changed, our brand has become more of a belief system, an opportunity to be more than a bank.”

#5 – Focus on the why rather than the what.

Katie Sowieja, Director of Brand Strategy for Children’s Minnesota, offered a compelling explanation of Children’s focus on building connections based on beliefs and “the why” at the values level, rather than on what the hospital does and how they do it. The “why” for Children’s is the kids they serve. That’s why the name was changed from Children’s Hospital of Minnesota to Children’s Minnesota, a change that has also reinforced their mission to reimagine healthcare for “the most amazing people on earth.”

#6 – Help customers own their brand experience.

Leah Wong, Vice President of External Relations for the Minneapolis Downtown Council, talked about how their 60th anniversary served as an opportunity to evaluate the brand and value proposition. This resulted in a rebranded approach: “Your Downtown” as a place to participate in and also contribute to. The focus was to help people own their experiences downtown, helping the brand stay relevant and to help people feel empowered.

As I hinted at earlier, there was also a lot of Minnesota pride in this discussion, with observations like “Minnesota is the happiest state in the United States” and the often-cited claim that Minneapolis has more theater seats per capita than any U.S. city outside New York.

“Flyover country” is a challenging perception to overcome, and the Minnesota marketers recommended being proud of the state’s distinctions. Also, with the growing diversity of people living in Minnesota, brands are making more of an effort to help people see themselves in the marketing that Minnesota brands do.

And if you’re not one to embrace the cold of Minnesota, Kelsey Dodson-Smith had some advice: “If you don’t feel like embracing winter, Sun Country.”



Digital Marketing News: Twitter Video Ads, Livestreaming Rise, RIP Eric Ward

The Rise of Livestreaming: Why People Watch, and How Brands Can Benefit [Infographic]. Facebook users comment 10 times more on live videos than on regular videos. Find out what people are watching and what their preferences and behaviors are in this infographic from Koeppel Direct. MarketingProfs

The four habits of successful data-driven marketers. Econsultancy invited marketing experts to discuss what they do, the problems they face, and how they overcome obstacles, revealing four keys to data-driven marketing success, from data management to testing hypotheses and proper attribution models. Econsultancy

Twitter introduces a new video-centric ad format. The Video Website Card starts out as an auto-playing video with a customizable headline, which then opens up to a larger video and website preview, and ultimately directs viewers to the advertiser’s chosen website when they tap on it. Will this format take off? TechCrunch

Facebook Live cuts out the middle man, adds its own screen-sharing feature. Now this seems like a great feature for educational content. Facebook has added an option to share your screen directly on Facebook Live, eliminating the need for other software for many users. TheNextWeb

Fall 2017 Taking Stock With Teens report reveals favorite social networks. 47% of respondents say Snapchat is their favorite social network, and 24% say Instagram is their favorite. Guess which network only received 9% of the vote? Face who? MarketingProfs

Somehow, this is news. Snapchat is selling an $80 dancing hot dog costume on Amazon. The costume is based on Snapchat’s new celebrity character: the app’s dancing hot dog filter that quickly became an internet meme sensation over the summer. Business Insider

Eric Ward
The Search Community lost the Father Of Link Building, Eric Ward, aka Link Moses.
I met Eric Ward at my first Pubcon conference, around 2004 or 2005, approaching him at a table to see if this SEO celebrity would be friendly to a nobody like me. Eric was the most generous, welcoming person I could have met, and that openness is something that has stuck with me over the many years since. Eric was a really good guy and a true original when it came to search marketing and link building.

Like many in our industry, I learned a lot from Eric about low-risk, high-impact and high-value link building and online PR. He used a photo I took of him in 2006 as his profile photo online, and it always made me happy that he liked that image enough to use it. Eric will be missed, and I send the most heartfelt condolences to his family and friends. See the outpouring of commentary on Search Engine Roundtable.

What was the top digital marketing news story for you this week?

Be sure to stay tuned until next week when we’ll be sharing all new marketing news stories. Also check out the full video summary with Tiffani and Josh on YouTube.



Want to Know About The Definitive Local SEO Ranking Factors?

Then come check out my session at SMX East 😉

On Tuesday, I’m going to go over what the data truly says about ranking in local pack results. Do you need to focus on link building? Is there even such a thing as a “#1” ranking factor? What value do traditional local signals, like citations, still have in the rapidly changing world of local search? I will also be teasing some data from the unreleased 2017 Local SEO Ranking Factors, so don’t miss out!

Also, if you aren’t SEO’d out yet, on Wednesday our Director of SEO, the affable Ashley Berman-Hale, will be delivering an outstanding talk about how to kick ass at mobile SEO in real life.

And if you are still around on the last day, I’m going to be giving a link building clinic with the awesome Arsen Rabinovich of Top Hat Rank. If you are just starting out in link building, or want to ask some specific questions, this will be the place.

Be there, don’t be a square, and say hi if you want to dork out more about SEO stuff and things.


3 Mouth-Watering Content Marketing Case Studies That Bring Home the Bacon

If you’ve ever been pregnant, lived with a pregnant woman, or even just been around one, you know it’s not smart to lie to a pregnant woman about food.

Well, that’s how I felt today. Six months along, and in arrives a marketing email with the subject line: “[Infographic] Good Marketing starts with good snacks.” Yes, I understand I’m not literally going to get any food out of this, but I expect to see some mouth-watering graphics upon opening the email. Nope.

Instead, I found food references in the copy like “Are you giving your prospects nourishing snacks or asking them to bite off more than they can chew?” and “Make your content highly snackable.” Still, this email teased me enough to click the CTA to the infographic: surely food would reside within!

The infographic was 100% unrelated to food. This is what I call an unfulfilled promise.

As content creators, it’s our job to catch our audience’s attention. Check, done. But it’s also our job to pay off what we’ve promised the audience within our content.

So, today, my promise – like my headline, title tag and meta description state – is to fill your senses with mouth-watering case studies of money-making campaigns. By following best practices, like delivering on a promise, the content in these case studies drove outstanding results, bringing home the bacon for brands. Oh, and I might include some tasty food pics. I mean, “mouth-watering” and “bacon” are in my headline.

Paid-First Digital Marketing Strategy Drove Impressive ROI in Month One

The Strategy:

A new client came to TopRank Marketing recently craving customers – FAST. Sound familiar? But seriously, this B2B startup needed to see ROI as the first course – not dessert – in order to be able to keep investing. In addition, they were looking for support in SEO, developing landing pages in the short-term and gathering the insights needed to create a long-term organic content strategy.

We used AdWords to drive leads quickly, test keyword viability for the landing page content, and help inform the upcoming organic content plan.

The Results:

Just four weeks after launch, we had driven 18 leads at an average CPL of $192. For this client, a single lead has an average value of $5,000-$20,000 (and sometimes up to $100,000) in revenue. In talking with the client, we were able to determine that within one month we had driven roughly $10,000-$75,000 in return.
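For a sense of the scale of that return, here’s a quick back-of-envelope check (a sketch in Python; the figures are the ones reported above, and the close rates behind the client’s value range are theirs, not shown here):

leads = 18
avg_cpl = 192.0
spend = leads * avg_cpl  # = $3,456 in total paid media spend

# Stated range of value driven in month one:
value_low, value_high = 10000.0, 75000.0

print("Spend: $%.0f" % spend)
print("Return multiple: %.1fx to %.1fx" % (value_low / spend, value_high / spend))
# Spend: $3456
# Return multiple: 2.9x to 21.7x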

Takeaway for Marketers:

Don’t get discouraged by tight timelines. Hyper-focus on your core marketing objectives and pivot to tactics that you know can fulfill them – even if it seems out of order. Just be sure to set expectations with your leadership team as to why you’re making a shift, what your hypothesis is and what results you anticipate.


Interactive, Multi-Channel Campaign Resulted in 4% Lift in Market Share, 12M Media Impressions

The Strategy:

By now you’ve heard about, and likely drunk, at least one can of Coca-Cola bearing the “Share a Coke” campaign branding. Did you know the campaign started in Australia? The challenge: Coke had lost its relevance among Australians, leaving sales in a not-so-happy place.

Coca-Cola added the 150 most popular names to its cans and bottles, changing its biggest piece of advertising real estate. Supporting tactics across traditional and digital platforms rolled out from there: the #ShareACoke hashtag, apps, an interactive website, outdoor billboards, interactive kiosks in top city centers and more.

Customers fueled digital content for the #ShareACoke campaign.

The Results:

From the initial campaign, in Australia alone, Coke earned 12 million media impressions, a 7% increase in young adult consumption and a 4% increase in sales across the category. With this success, Coca-Cola has pushed the campaign out to nearly 60 markets since its 2011 launch and has continued to add tactics. One of the more recent additions aimed to turn enthusiasm for the campaign into even more revenue and earned advertising: Coke has begun selling personalized bottles and gear.

Takeaway for Marketers:

B2B or B2C – A truly impactful campaign integrates with the entire customer experience. Just because your packaging department is in a different building or state from your digital advertising or SEO departments doesn’t mean you can’t or shouldn’t work together. Put your heads together across disciplines to unlock potential you never saw before.



Consistent Publishing and Strategic Partnerships Drove 15.5% Increase in Revenue

The Strategy:

When this B2B and B2C eCommerce company came to TopRank Marketing wanting to drive sales, we knew a breadth of integrated tactics would be the way to reach their lofty revenue goals. And, we saw a huge opportunity to leverage co-created content with influencers and other brands as a way to drive stronger brand awareness. To reach their objectives, we deployed a strong marketing mix of weekly blogs, co-created influencer content, SEO, organic social, paid social and AdWords.

The Results:

In just under one year, we drove a 14.4% increase in organic traffic and a 7.7% increase overall. The even more appetizing part of the story: these traffic gains resulted in a 23.7% increase in organic revenue year over year, and a 15.5% increase in overall website revenue year over year!

Takeaway for Marketers:

A consistent cadence of relevant, SEO-driven blog content set the foundation for success for this client. And, what really made the difference was our strategic partnerships with influencers and other brands. The co-created content bolstered brand awareness in a way this brand had never before seen.



Are You Bringing Home the Bacon?

Hopefully, you just read all of that and thought, “I know. I already do all of that. I eat unlimited bacon!” If that’s you – fantastic! Are you looking for a job? We’re always open to strengthening our team!

But all joking aside, a wise marketer knows there is always more to learn. Keep up on the latest digital marketing trends and tactics by following our blog, or if you’re interested in learning what TopRank Marketing can do to help your business bring home the bacon, please, reach out today.



Proposing Better Ways to Think about Internal Linking

I’ve long thought that there was an opportunity to improve the way we think about internal links, and to make much more effective recommendations. I feel like, as an industry, we have done a decent job of making the case that internal links are important and that the information architecture of big sites, in particular, makes a massive difference to their performance in search (see: 30-minute IA audit and DistilledU IA module).

And yet we’ve struggled to dig deeper than finding particularly poorly-linked pages, and obviously-bad architectures, leading to recommendations that are hard to implement, with weak business cases.

I’m going to propose a methodology that:

  1. Incorporates external authority metrics into internal PageRank (what I’m calling “local PageRank”) to keep what is the best data-driven approach we’ve seen for evaluating internal links, while avoiding its tendency to focus attention on the wrong areas

  2. Allows us to specify and evaluate multiple different changes in order to compare alternative approaches, figure out the scale of impact of a proposed change, and make better data-aware recommendations

Current information architecture recommendations are generally poor

Over the years, I’ve seen (and, ahem, made) many recommendations for improvements to internal linking structures and information architecture. In my experience, of all the areas we work in, this is an area of consistently weak recommendations.

I have often seen:

  • Vague recommendations (“improve your information architecture by linking more to your product pages”) that don’t specify changes carefully enough to be actionable

  • No assessment of alternatives or trade-offs – does anything get worse if we make this change? Which page types might lose? How have we compared approach A and approach B?

  • Lack of a model – very limited assessment of the business value of making proposed changes – if everything goes to plan, what kind of improvement might we see? How do we compare the costs of what we are proposing to the anticipated benefits?

This is compounded in the case of internal linking changes because they are often tricky to specify (and to make at scale), hard to roll back, and very difficult to test (by now you know about our penchant for testing SEO changes – but internal architecture changes are among the trickiest to test because the anticipated uplift comes on pages that are not necessarily those being changed).

In my presentation at SearchLove London this year, I described different courses of action for factors in different areas of this grid:

It’s tough to make recommendations about internal links because while we have a fair amount of data about how links generally affect rankings, we have less information specifically focusing on internal links, and so while we have a high degree of control over them (in theory it’s completely within our control whether page A on our site links to page B) we need better analysis:

The current state of the art is powerful for diagnosis

If you want to get quickly up to speed on the latest thinking in this area, I’d strongly recommend reading these three articles and following their authors:

  1. Calculate internal PageRank by Paul Shapiro

  2. Using PageRank for internal link optimisation by Jan-Willem Bobbink

  3. Easy visualizations of PageRank and page groups by Patrick Stox

A load of smart people have done a ton of thinking on the subject and there are a few key areas where the state of the art is powerful:

There is no doubt that the kind of visualisations generated by techniques like those in the articles above are good for communicating problems you have found, and for convincing stakeholders of the need for action. Many people are highly visual thinkers, and it’s very often easier to explain a complex problem with a diagram. I personally find static visualisations difficult to analyse, however, and for discovering and diagnosing issues, you need data outputs and / or interactive visualisations:

But the state of the art has gaps:

The most obvious limitation is one that Paul calls out in his own article on calculating internal PageRank when he says:

“we see that our top page is our contact page. That doesn’t look right!”

This is a symptom of a wider problem which is that any algorithm looking at authority flow within the site that fails to take into account authority flow into the site from external links will be prone to getting misleading results. Less-relevant pages seem erroneously powerful, and poorly-integrated pages that have tons of external links seem unimportant in the pure internal PR calculation.

In addition, I hinted at this above, but I find visualisations very tricky – on large sites, they get too complex too quickly and have an element of the Rorschach to them:

My general attitude is to agree with O’Reilly that “Everything looks like a graph but almost nothing should ever be drawn as one”:

All of the best visualisations I’ve seen are nonetheless full link-graph visualisations – you will very often see crawl-depth charts, which are in my opinion even harder to read and obscure even more information than regular link graphs. It’s not only the sampling but also the inherent bias of showing links only in the order they are discovered from a single starting page – typically the homepage – which is useful only if that’s the only page on your site with any external links. This Sitebulb article talks about some of the challenges of drawing good crawl maps:

But by far the biggest gap I see is the almost total lack of any way of comparing current link structures to proposed ones, or for comparing multiple proposed solutions to see a) if they fix the problem, and b) which is better. The common focus on visualisations doesn’t scale well to comparisons – both because it’s hard to make a visualisation of a proposed change and because even if you can, the graphs will just look totally different because the layout is really sensitive to even fairly small tweaks in the underlying structure.

Our intuition is really bad when it comes to iterative algorithms

All of this wouldn’t be so much of a problem if our intuition was good. If we could just hold the key assumptions in our heads and make sensible recommendations from our many years of experience evaluating different sites.

Unfortunately, the same complexity that made PageRank such a breakthrough for Google in the early days makes for spectacularly hard problems for humans to evaluate. Even more unfortunately, not only are we clearly bad at calculating these things exactly, we’re surprisingly bad even at figuring them out directionally. [Long-time readers will no doubt see many parallels to the work I’ve done evaluating how bad (spoiler: really bad) SEOs are at understanding ranking factors generally].

I think that most people in the SEO field have a high-level understanding of at least the random surfer model of PR (and its extensions like the reasonable surfer). Unfortunately, most of us are less good at having a mental model for the underlying eigenvector / eigenvalue problem, and the infinite iteration / convergence of surfer models is troublesome to our intuition, to say the least.

I explored this intuition problem recently with a really simplified example and an unscientific poll:

The results were unsurprising – over 1 in 5 people got even a simple question wrong (the right answer is that a lot of the benefit of the link to the new page flows on to other pages in the site, and the new page retains significantly less than an Nth of the PR of the homepage):

I followed this up with a trickier example and got a complete lack of consensus:

The right answer is that it loses (a lot) less than the PR of the new page except in some weird edge cases (I think only if the site has a very strange external link profile) where it can gain a tiny bit of PR. There is essentially zero chance that it doesn’t change, and no way for it to lose the entire PR of the new page.

Most of the wrong answers here are based on non-iterative understanding of the algorithm. It’s really hard to wrap your head around it all intuitively (I built a simulation to check my own answers – using the approach below).

All of this means that, since we don’t truly understand what’s going on, we are likely making very bad recommendations and certainly backing them up and arguing our case badly.

Doing better part 1: local PageRank solves the problems of internal PR

In order to be able to compare different proposed approaches, we need a way of re-running a data-driven calculation for different link graphs. Internal PageRank is one such re-runnable algorithm, but it suffers from the issues I highlighted above: it has no concept of which pages are especially important to integrate well into the architecture because they have loads of external links, and it can mistakenly categorise pages as much stronger than they should be simply because they have links from many weak pages on your site.

In theory, you get a clearer picture of the performance of every page on your site – taking into account both external and internal links – by looking at internet-wide PageRank-style metrics. Unfortunately, we don’t have access to anything Google-scale here and the established link data providers have only sparse data for most websites – with data about only a fraction of all pages.

Even if they had dense data for all pages on your site, it wouldn’t solve the re-runnability problem – we wouldn’t be able to see how the metrics changed with proposed internal architecture changes.

What I’ve called “local” PageRank is an approach designed to attack this problem. It runs an internal PR calculation with what’s called a personalization vector designed to capture external authority weighting. This is not the same as re-running the whole PR calculation on a subgraph – that’s an extremely difficult problem that Google spent considerable resources to solve with its Caffeine update. Instead, it’s an approximation, but one that solves the major issue we had with pure internal PR: unimportant pages showing up among the most powerful pages on the site.

Here’s how to calculate it:
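(The original post illustrated this step with screenshots; as a stand-in, here is a minimal sketch that loads a crawl export of internal links into a NetworkX DiGraph. The file name and the 'Source' / 'Destination' column names are assumptions – adjust them to match your crawler’s export.)

import csv

import networkx as nx

# Build a directed graph of the site's internal links from a crawl export.
site = nx.DiGraph()
with open('crawl_links.csv') as f:
    for edge in csv.DictReader(f):
        site.add_edge(edge['Source'], edge['Destination'])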

The next stage requires data from an external provider – I used raw mozRank. You can choose whichever provider you prefer, but make sure you are working with a raw metric rather than a logarithmically-scaled one, and make sure you are using a PageRank-like metric rather than a raw link count or an ML-based metric like Moz’s Page Authority.

You need to normalise the external authority metric – it will be calibrated to the entire internet, while we need it to be a probability vector over our crawl – in other words, to sum to 1 across our site:
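A minimal sketch of that normalisation, assuming external_authority is a dict mapping each crawled URL to its raw metric (and that at least one page has external data):

# Scale raw external authority values into a probability vector that
# sums to 1 across every page in the crawl (missing pages count as zero).
total = sum(external_authority.get(page, 0.0) for page in site)
personalization = {page: external_authority.get(page, 0.0) / total
                   for page in site}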

We then use the NetworkX PageRank library to calculate our local PageRank – here’s some outline code:
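(The outline code appeared as an image in the original post; what follows is a minimal reconstruction using the site graph and personalization vector built above – a sketch, not the original listing.)

import networkx as nx

# Local PageRank: a standard PageRank calculation over the internal link
# graph, with the random surfer's jumps weighted by external authority.
local_pr = nx.pagerank(site,
                       alpha=0.5,  # lower damping than the usual 0.85 - see below
                       personalization=personalization)

# Inspect the strongest pages under this measure:
for page, score in sorted(local_pr.items(), key=lambda item: -item[1])[:20]:
    print("%.6f  %s" % (score, page))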

What’s happening here is that by setting the personalization parameter to be the normalised vector of external authorities, we are saying that every time the random surfer “jumps”, instead of returning to a page on our site with uniform random chance, they return with probabilities proportional to the external authorities of those pages. This is roughly like saying that any time someone leaves your site in the random surfer model, they return via the weighted PageRank of the external links to your site’s pages. It’s fine that your external authority data might be sparse – you can just set values to zero for any pages without external authority data – one feature of this algorithm is that it’ll “fill in” appropriate values for those pages that are missing from the big data providers’ datasets.

In order to make this work, we also need to set the alpha parameter lower than we normally would. Alpha is the damping parameter, normally set to 0.85 in regular PageRank; one minus alpha is the jump probability at each iteration. For much of my analysis, I set it to 0.5 – roughly representing the % of site traffic from external links – approximating the idea of a reasonable surfer.

There are a few things that I need to incorporate into this model to make it more useful – if you end up building any of this before I do, please do let me know:

  • Handle nofollow correctly (see Matt Cutts’ old PageRank sculpting post)

  • Handle redirects and rel canonical sensibly

  • Include top mR pages (or even all pages with mR) – even if they’re not in the crawl that starts at the homepage

    • You could even use each of these as a seed and crawl from these pages

  • Use the weight parameter in NetworkX to weight links by type to get closer to reasonable surfer model

    • The extreme version of this would be to use actual click-data for your own site to calibrate the behaviour to approximate an actual surfer!

Doing better part 2: describing and evaluating proposed changes to internal linking

After my frustration at trying to find a way of accurately evaluating internal link structures, my other major concern has been the challenges of comparing a proposed change to the status quo, or of evaluating multiple different proposed changes. As I said above, I don’t believe that this is easy to do visually as most of the layout algorithms used in the visualisations are very sensitive to the graph structure and just look totally different under even fairly minor changes. You can obviously drill into an interactive visualisation of the proposed change to look for issues, but that’s also fraught with challenges.

So my second proposed change to the methodology is to find ways to compare the local PR distribution we’ve calculated above between different internal linking structures. There are two major components to being able to do this:

  1. Efficiently describing or specifying the proposed change or new link structure; and

  2. Effectively comparing the distributions of local PR – across what is likely tens or hundreds of thousands of pages

How to specify a change to internal linking

I have three proposed ways of specifying changes:

1. Manually adding or removing small numbers of links

Although it doesn’t scale well, if you are just looking at changes to a limited number of pages, one option is simply to manipulate the spreadsheet of crawl data before loading it into your script:

2. Programmatically adding or removing edges as you load the crawl data

Your script will have a function that loads the data from the crawl file and builds the graph structure (a DiGraph in NetworkX terms, which stands for Directed Graph). At this point, if you want to simulate adding a sitewide link to a particular page, you can do so. For example, if this line sat inside the loop loading edges, it would add a link from every page to our London SearchLove page:

site.add_edges_from([(edge['Source'],
                      'https://www.distilled.net/events/searchlove-london/')])

You don’t need to worry about adding duplicates (i.e. checking whether a page already links to the target) because a DiGraph has no concept of multiple edges in the same direction between the same nodes, so if it’s already there, adding it will do no harm.

Removing edges programmatically is a little trickier – because if you want to remove a link from global navigation, for example, you need logic that knows which pages have non-navigation links to the target, as you don’t want to remove those as well (you generally don’t want to remove all links to the target page). But in principle, you can make arbitrary changes to the link graph in this way.
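Here is a sketch of that kind of logic; pages_with_body_link is a hypothetical set you would derive from your crawl data (for example, by extracting links from the main content area only):

# Remove a global-navigation link to the target page from every page,
# except where the page also links to it from body content.
target = 'https://www.example.com/target-page/'
to_remove = [(page, target) for page in site
             if site.has_edge(page, target)
             and page not in pages_with_body_link]
site.remove_edges_from(to_remove)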

3. Crawl a staging site to capture more complex changes

As the changes get more complex, it can be tough to describe them in sufficient detail. For certain kinds of changes, it feels to me as though the best way to load the changed structure is to crawl a staging site with the new architecture. Of course, in general, this means having the whole thing implemented and ready to go, and the effort of doing that negates a large part of the benefit of evaluating the change in advance. We have a secret weapon here, which is that the “meta-CMS” nature of our ODN platform allows us to make certain changes incredibly quickly across site sections and create preview environments where we can see changes even for companies that aren’t customers of the platform yet.

For example, it looks like this to add a breadcrumb across a site section on one of our customers’ sites:

There are a few extra tweaks to the process if you’re going to crawl a staging or preview environment to capture internal link changes. Because we need to make sure that the set of pages is identical in both crawls, we can’t just start at each homepage and crawl X levels deep – by definition, we have changed the linking structure and will therefore discover a different set of pages. Instead, we need to:

  • Crawl both live and preview to X levels deep

  • Combine into a superset of all pages discovered on either crawl (noting that these pages exist on both sites – we haven’t created any new pages in preview)

  • Make lists of pages missing in each crawl and crawl those from lists

Once you have both crawls, and both include the same set of pages, you can re-run the algorithm described above to get the local PageRanks under each scenario and begin comparing them.

How to compare different internal link graphs

Sometimes you will have a specific problem you are looking to address (e.g. only y% of our product pages are indexed), in which case you will likely want to check whether your change has improved the flow of authority to those target pages, and compare their performance under proposed change A and proposed change B. Note that it is hard to evaluate losers with this approach: because the normalisation means that the local PR will always sum to 1 across your whole site, there will always be losers if there are winners – in contrast to the real world, where it is theoretically possible to have a structure that strictly dominates another.
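At the page level, that check can be as simple as a per-page delta between two runs of the calculation (a sketch, assuming local_pr_live and local_pr_preview were both computed as above over the same superset of pages):

# Which pages gain or lose the most local PageRank under the change?
deltas = {page: local_pr_preview[page] - local_pr_live[page]
          for page in local_pr_live}
ranked = sorted(deltas.items(), key=lambda pair: pair[1])
print("Biggest losers:", ranked[:10])
print("Biggest winners:", ranked[-10:])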

In general, if you are simply evaluating how to make the internal link architecture “better”, you are less likely to jump to evaluating specific pages. In this case, you probably want to do some evaluation of different kinds of page on your site, identified by one of the approaches below (a sketch of the simplest, URL-based option follows the list):

  1. Labelling them by URL – e.g. everything in /blog or with ?productId in the URL

  2. Labelling them as you crawl

    1. Either from crawl structure (e.g. all pages 3 levels deep from the homepage, all pages linked from the blog, etc.)

    2. Or based on the crawled HTML (all pages with more than x links on them, with a particular breadcrumb or piece of meta information labelling them)

  3. Using modularity to label them automatically by algorithmically grouping pages in similar “places” in the link structure
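As promised, a sketch of the URL-based option – labelling pages by URL pattern and aggregating local PageRank by page type (the patterns here are hypothetical examples):

from collections import defaultdict

def label(page):
    # Hypothetical URL patterns - replace these with your own page types.
    if '/blog' in page:
        return 'blog'
    if 'productId' in page:
        return 'product'
    return 'other'

# Share of the site's local PageRank held by each type of page:
pr_by_type = defaultdict(float)
for page, score in local_pr.items():
    pr_by_type[label(page)] += score
print(dict(pr_by_type))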

I’d like to be able to also come up with some overall “health” score for an internal linking structure, and I have been playing around with scoring it based on some kind of equality metric, under the thesis that if you’ve chosen your indexable page set well, you want to distribute external authority as evenly throughout that set as possible. This thesis seems most likely to hold true for large long-tail-oriented sites that get links to pages which aren’t generally the ones looking to rank (e.g. e-commerce sites). It also builds on some of Tom Capper’s thinking (video, slides, blog post) about links being increasingly important for getting into Google’s consideration set for high-volume keywords, which is then reordered by usage metrics and ML proxies for quality.

I have more work to do here, but I hope to develop an effective metric – it’d be great if it could build on established equality metrics like the Gini Coefficient. If you’ve done any thinking about this, or have any bright ideas, I’d love to hear your thoughts in the comments, or on Twitter.
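For what it’s worth, a Gini-style score over the local PageRank values is only a few lines (a sketch; indexable_pages is whatever subset of the crawl you have decided should rank – 0 means a perfectly even distribution, while values near 1 mean authority is concentrated in a few pages):

def gini(values):
    # Standard Gini coefficient over a list of non-negative values:
    # 0 = perfectly equal, approaching 1 = concentrated in a few pages.
    values = sorted(values)
    n = len(values)
    weighted_sum = sum((i + 1) * v for i, v in enumerate(values))
    return (2.0 * weighted_sum) / (n * sum(values)) - (n + 1.0) / n

print(gini([local_pr[page] for page in indexable_pages]))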


How to Choose Dynamic Images for Your Blog Posts

I’m a content writer, not a graphic designer. My job is to make the words dance, to convey useful information in an entertaining way.

As such, for a long time visuals were just an afterthought for me. Yeah, a blog needs a header image. So after I’m done writing I’ll slap something on there, check that box, and send it off to the client.

As content continues to proliferate, though, that laissez-faire approach isn’t enough. Your potential audience has far more content available to them than they’ll ever be able to read. That means they’re actively looking for reasons not to read your content. A weak—or worse, missing—visual is a perfect excuse to move to the next thing.

The right visual does more than take up space. It captures attention, creates a little mystery, invites the reader to dig into your carefully-crafted text. Good visuals are doubly important for amplification, too: Your Twitter, Facebook, and LinkedIn shares will all include an image. The visual alone can stop the endless, half-engaged scrolling people do on social media, buying you crucial seconds to compel a click or a tap.

I challenge any and all content creators to up their image game. Let’s stop with the schlocky stock photos and give people something that’s worth their attention.

Here’s how I find scroll-stopping visuals for my blog posts.

Ditch the Schlock Stock

It’s trendy to bash Shutterstock for schlocky stock photos, but that’s like blaming Netflix for your binge-a-thon of Fuller House. There’s plenty of great content available. It’s up to you to find and choose it over the cliché stuff.

Whether you’re using Shutterstock or any other paid photo site, start by avoiding these cliché photo types:

  • Minority Report Computer Displays. Seems like every B2B blog is required to use one of these nonsensical things at least twice a week.
    businessman using futuristic computer interface
  • Stark White Offices. It’s futuristic! It’s so clean! It… looks like no place anyone has ever worked!
    people gathered in stark white office building
  • People with Arms Crossed. Do you pose for pictures like this? Does anyone? Then why are there thousands of these on stock photo sites?
    man with arms folded
  • Cupped Hands with Floating Icons. Sing it with me: “He’s got the [abstract concept of my blog post] in his hands…”
    businessperson holding floating icons in cupped hands
  • Anything in front of a Chalkboard. STAHP.
    Businessman in front of chalkboard with muscular arms drawn in

I could go on, but you get the idea. These are the hoary clichés that give stock photos a bad name. They’re not unique; they’re not authentic; they’re not visually stunning.

To avoid the stock photo blues, I tend to start my search on royalty-free sites like Pixabay and Pexels, and even with Creative Commons-licensed photos on Flickr. But even if the boss demands you use an approved paid site, there’s good stuff to be found. Here are a few ways to kick your visuals up a notch.

Make It Weird

For my blog post on mobile advertising strategy, there were plenty of obvious ways to go. Someone looking at a phone in a coffee shop, at an airport, at a concert… people look at their phones everywhere, so there is no shortage of safe options.

So of course I went with this one:

Visual Content Marketing Dog with Sunglasses and Cell Phone

Why is the dog wearing sunglasses? What type of phone has a pawprint for the unlock button? Why didn’t he use the front-facing camera for his selfie? Any one of those questions is enough to give the reader paws. Er, pause.

Make It Beautiful

Instagram is a social media network that’s almost entirely visual. It was designed for image sharing, and boy howdy, do its members share. More than 40 billion photos have been posted on Instagram since it launched seven years ago.

So it makes sense to take a few design cues from Instagram when you choose your photos. Find something beautiful, striking, and with an evocative filter. Like this image I used for my comedy in content post:

Visual Content Marketing Clown in Forest with Instagram-Style Filter 

Find a Metaphor

Get a little creative with your content, and you can get more creative with your visuals. Introduce a metaphor in your opening paragraph that will unite your content and give you more options for a header image.

For a recent content marketing tips post, I could have stuck with a generic “businessperson” or “office” header image. Instead, I added a personal note about Lego in the beginning, and found a dynamite visual that helped introduce the metaphor:

Visual Content Marketing - Colorful Assortment of Lego Bricks

Take Your Own Photos

The best way to ensure your header is original, authentic, and eye-catching is to take the photo yourself. Last year, Jason Miller held a photoshoot with his LinkedIn Marketing Solutions crew. They captured a ton of wonderful moments that the team used as header images for months:

LinkedIn Marketing Solutions Team around Laptop

I love that even though this image is a parody of a stock photo, it’s undeniably original. You can see the cool art in the office. The people are actually the folks who create content for LinkedIn. The laptop is a well-loved machine with a LinkedIn sticker on it, not a pristine stainless-steel model. Unlike a stock photo, this picture actually tells you about the people behind the brand.

Even a cell-phone quality image can get the job done. When our team covers marketing events, we always take a candid photo of the presenter as the header image. My colleague Caitlin took it a step further for her Ann Handley roundup, with this adorable selfie:

Visual Content Marketing Selfie with Ann Handley

It’s genuine, it’s unexpected, and it’s a photo the reader is guaranteed to be seeing for the first time.

As with Written Content, It’s about Personality

It used to be that all B2B marketing content had to be “professional,” interpreted as “impersonal, flat, and unemotive.” Old-school stock photos are a perfect match for that kind of content. Here’s a guy in a suit standing with his arms folded. Here’s our white paper written like a software end-user license agreement.

Now we know better. Readers want content that has warmth and personality. They want to feel that another human being is communicating with them.

Visuals need to evolve in the same way. If you’re writing great content and still using stiff, stock images, you’re doing your content a disservice. Make sure your visuals are every bit as distinctive and authentic as your writing is, and you can earn your reader’s attention.

Do you love to create great content? Do you excel at eye-stopping imagery? TopRank Marketing needs you on our team.

Disclosure: LinkedIn Marketing Solutions is a TopRank Marketing client.



How to Make the Switch to Content-Driven SEO #MNBlogCon

TopRank Marketing’s Joshua Nite made his debut on the speaker circuit this past weekend at the 8th annual Minnesota Blogger Conference held at Concordia University in St. Paul.

Charming the crowd with his unique brand of wit, creativity, mad content marketing expertise, and numerous “cats with hats” references, Josh delivered a presentation themed after The Good Place, titled “The Good News About Creative Content: From SEO-Driven Content to Content-Driven SEO.”

As someone who spent 12 years as a creative comedy writer for a video game called The Kingdom of Loathing, Josh said he was terrified by the concept of SEO-driven content when he made his transition into content marketing.

“The worst content to write, and the worst content for people to read, was the stuff that [search engine] robots liked to read the most,” Josh said.

But thankfully, search engines are getting smarter, using AI and machine learning to increasingly improve how they deliver the best results. As a result, content creators need to flip the script on how they craft content if they want to resonate with readers and robots. From Josh’s point of view, that means transitioning from SEO-driven content to content-driven SEO.

How? Below is Josh’s five-step framework.

#1 – Topic research.

Get started by digging deep into your target audience. Why? Because in order to craft content that resonates, you have to understand what they care about. Ask yourself the following questions:

  • Who are they? (i.e. demographics, hobbies, interests, etc.)
  • What do they desperately need to know? (And what keywords and keyword groups are associated?)
  • Where do they hang out online? (i.e. social media)
  • Why should they care about your content? (What value can you add?)
  • How do they search for inspiration? (i.e. Google, Bing, Q&A forums, etc.)

From there, you need to identify your sweet spot: the intersection of 1) your brand’s expertise, 2) your audience’s needs, and 3) your unique insights.

Finally, leverage free and paid tools such as Google auto-complete, Google Keyword Planner, Quora, Answer The Public, and BuzzSumo to understand specific keyword topics that resonate most with your audience.



#2 – Competitor research.

Simply put: to beat out your competition, you need to know what they’re up to. Kick off your competitive research by “going incognito,” Josh said.

An incognito search prevents your browser history or cache from impacting the results, giving you a more accurate picture of the search results surrounding your priority keyword topics.

After popping in your keywords, scan the results for content gaps – gaps in quality, relevance, or helpfulness. As you do this, look for opportunities to expand your keywords into long-tail variations, so you can get more specific and really let your niche expertise shine.

#3 – Content creation.

Now comes the fun part. Using your topical and competitive research, outline your concepts and document your content mission (i.e. increase the ranking for “X” keyword by 10 positions in one month). Then get to work on crafting your piece.

#4 – A smattering of HTML.

As you craft your content, think about how you’ll organize it on-page and how you’ll send “click me” signals to searchers. This involves working in some of the technical on-page SEO elements. The top three that need consideration are below (a quick audit script follows the list):

  • Title tags: This is the title searchers will see in the SERPs. Keep it under 600 pixels wide (roughly 60 characters) so it doesn’t get truncated. In addition, aim to have the primary keyword near the beginning, as long as it makes sense.
  • Header tags: Use H1 and H2 tags to organize your content to make it easy to scan for readers and robots.
  • Meta description: From Josh’s perspective, this is the most overlooked yet crucial part of SEO infrastructure. “This is your one shot to hook users,” he said. Keep it to 160 characters or less, include your target keyword if it makes sense, and state a clear benefit.
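Here’s the quick audit script promised above – a sketch using the requests and BeautifulSoup libraries to pull these three elements from a live page (the character counts are rules of thumb, with ~60 characters standing in for the 600-pixel title limit):

import requests
from bs4 import BeautifulSoup

# Fetch a page and report on its title tag, H1s and meta description.
html = requests.get('https://www.example.com/').text
soup = BeautifulSoup(html, 'html.parser')

title = soup.title.string.strip() if soup.title and soup.title.string else ''
h1s = [h.get_text(strip=True) for h in soup.find_all('h1')]
meta = soup.find('meta', attrs={'name': 'description'})
description = meta.get('content', '') if meta else ''

print('Title (%d chars, aim for ~60): %s' % (len(title), title))
print('H1 tags:', h1s if h1s else 'MISSING')
print('Meta description (%d chars, aim for 160 or less)' % len(description))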

#5 – Optimization.

You’ve spent a lot of time getting that piece of content out the door. But fight the urge to move on and never touch it again. As Josh so eloquently said, “The real work begins after you publish.”

So, keep an eye on your analytics. Is your content getting a good number of impressions but not a ton of clicks? Consider refining the meta description a bit. Are you getting impressions and clicks, but the bounce rate is high? Your readers may feel like they’re not getting what they were promised, or there’s no clear call to action to keep them on your site. So refine the meta description and craft a more compelling CTA.

Again, you poured a lot of effort into getting this content published—so don’t let that effort be wasted. Always be on the lookout for opportunities to tweak the content and the SEO elements to improve its resonance.



Don’t Settle

Josh summed it all up perfectly in the final moments of his presentation:

“There’s never been a better opportunity to write great content that people actually want to read and that will get seen in search results,” Josh said. “So, go forth and be awesome. And please, please—don’t settle for writing crappy content.”



What does your creative content creation process look like? Tell us in the comments section below.



Digital Marketing News: Twitter Happening Now, Snapchat Context Cards, LinkedIn Video Ads

Video Ads Are Finally Coming to LinkedIn (client) – When LinkedIn began allowing users to upload videos in August, video ads seemed like an inevitability, and they are now one step closer. LinkedIn announced today that it is running a closed beta test of video for sponsored content “with a limited number of advertisers.”  AdWeek

Twitter Plans To Release A Bookmarking Tool #SaveforLater. You know how you can save posts to read later on Facebook? Well, Twitter is looking to do the same thing. For all of you liking posts as a way to bookmark them, you can stop that practice with this new feature. Will this mean likes will go down? Probably. BuzzFeed

New Research: The state of marketing attribution – A growing number of marketers are using attribution in all or most of their marketing efforts, according to a recent study from Econsultancy and AdRoll. However, the number of marketers acting on the insights they pull from attribution data is dwindling. Econsultancy

Snapchat Introduces “Context Cards” – Snapchat released “Context Cards” this week, which have the potential to bolster marketing efforts for restaurants, venues and other destinations. These cards pull in information based on the Snap’s geo-filters and map information, leading viewers to online reviews, Uber and Lyft information and more. TechCrunch

AdWords Charges & Your Daily Budget – If you’ve been struggling to reach your advertising goals, AdWords has made some recent changes to help get you over the hump. As of October 4th, campaigns are now able to spend up to twice the average daily budget. Don’t fret about racking up the costs at the end of the month as you will not be charged more than your monthly charging limit. Google

Twitter Happening Now – Twitter is adding a “Happening Now” feature that will group tweets by event, the company announced today. The feature, which will start with sports games, is yet another way the company is seeking to highlight information on its platform outside of the traditional follow model. BuzzFeed

Social media monitor Brandwatch acquires content marketing platform BuzzSumo –  Two things that are great on their own are not often better together, but that’s exactly what the marketing industry expects from the combination of BuzzSumo and Brandwatch. TechCrunch

Connect the Dots from Data to Better Customer Experiences – Join me and Michael Trapani of IBM on October 26th for a free webinar to better understand the opportunities around creating best answer experiences with cognitive technologies. IBM Watson

NEWS NUGGETS

Infographic: YouTube has grown to 1.5 billion monthly active users – MarketingProfs

LinkedIn connects sales, marketing tools for B2B advertisers to target leads, accounts – MarTech Today

70% of Brands Work with Instagram Influencers – Research Brief

Majestic and SEMRush Combine Forces – Majestic Blog

Bing Ads Launches Automated Bid Strategy to ‘Maximize Clicks’ – Search Engine Land

As Voice Has Its Moment, Amazon, Google and Apple Are Giving Brands a Way Into the Conversation – AdWeek

New Study from D&B Shows What Frustrates B2B Buyers Most – MarketingProfs

70% of Marketers Do Not Use Anonymized Consumer Identity Data But 75% Say it Helps Campaign Optimization – MediaPost

63% of Amazon Advertisers Plan to Spend Even More Over the Next Year – AdWeek

What was the top digital marketing news story for you this week?

Be sure to stay tuned until next week when we’ll be sharing all new marketing news stories. Also check out the full video summary with Tiffani and Josh on YouTube.



The SEO Apprentice’s Toolbox: Gearing Up for Analysis

Being new to SEO is tricky. As a niche market within a niche market, there are many tools and resources unfamiliar to most new professionals. And with so much to learn, it is nearly impossible to start real client work without first dedicating six months exclusively to industry training. Well… that’s how it may seem at first.

While it may be intimidating, investigating real-world problems is the best way to learn SEO. It exposes you to industry terminology, introduces you to valuable resources and gets you asking the right questions.

As a fairly new Analyst at Distilled, I know from experience how difficult it can be to get started. So here’s a list of common SEO analyses and supporting tools that may help you get off on the right foot.

Reviewing on-page elements

Page elements are essential building blocks of any web page. And pages with missing or incorrect elements risk not being eligible for search traffic. So checking these is necessary for identifying optimization opportunities and tracking changes. You can always go to the HTML source code and manually identify these problems yourself, but if you’re interested in saving a bit of time and hassle, Ayima’s Google Chrome extension Page Insights is a great resource.

This neat little tool identifies on-page problems by analyzing 24 common on-page issues for the current URL and comparing them against a set of rules and parameters. It then provides a list of all issues found, grouped into four priority levels: Errors, Warnings, Notices and Page Info. Descending from most to least severe, the first 3 categories (Errors, Warnings & Notices) identify all issues that could impact organic traffic for the page in question. The last category (Page Info) provides exact information about certain elements of the page.

For every page you visit Page Insights will give a warning next to its icon, indicating how many vulnerabilities were found on the page.

Clicking on the icon gives you a drop-down listing the vulnerabilities and page information found.

What makes this tool so useful is that it also provides details about each issue, like how it can harm the page, along with correction opportunities. In this example, we can see that this web page is missing an H1 tag, but in this case it could be corrected by adding an H1 tag around the page’s current heading (which is not coded as an H1).

In a practical setting, Page Insights is great for quickly identifying common on-page issues that should be fixed to ensure SEO best practice.

Additional tools for reviewing on-page elements:

Supplemental readings:

Analyzing page performance

Measuring the load functionality and speed of a page is an important and common practice, since both metrics are correlated with user experience and are highly valued by search engines. There are a handful of tools applicable to this task, but because of the large number of metrics it includes, I recommend using WebPagetest.org.

Emulating various browsers, this site allows users to measure the performance of a web page from different locations. After sending a real-time page request, WebPagetest provides a sample of three tests containing request details, such as the complete load time, the load time breakdown of all page content, and a final image of the rendered page. There are various configuration settings and report types within this tool, but for most analyses, I have found that running a simple test and focusing on the metrics presented in the Performance Results supply ample information.

There are several metrics presented in this report, but the Load Time and First Byte figures cover most checks. Factoring in Google’s suggestion to keep desktop load time under 2 seconds and time to first byte at 200ms or less, we can gauge whether or not a page’s speed is properly optimized.
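
For a quick spot check against those two thresholds, you can approximate time to first byte with a short script before running a full WebPagetest report. A rough Python sketch follows; requests’ elapsed attribute measures the time from sending the request to parsing the response headers, a reasonable proxy for first byte, while complete load time still needs a real browser or WebPagetest.

    import requests

    def spot_check_ttfb(url, budget_seconds=0.2):
        """Approximate time to first byte; Google suggests 200ms or less."""
        # stream=True defers the body download, so elapsed reflects
        # time-to-headers rather than the full transfer.
        response = requests.get(url, stream=True, timeout=10)
        ttfb = response.elapsed.total_seconds()
        response.close()
        verdict = "OK" if ttfb <= budget_seconds else "needs work"
        print("%s: TTFB %.3fs (%s)" % (url, ttfb, verdict))

    spot_check_ttfb("https://example.com")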

Prioritizing page speed performance areas

Knowing if a page needs to improve its performance speed is important, but without knowing what areas need improving you can’t begin to make proper corrections. Using WebPagetest in tandem with Google’s PageSpeed Insights is a great solution for filling in this gap.

Free to use, this tool measures a page’s desktop and mobile performance to evaluate whether it has applied common performance best practices. Scored on a scale of 0-100, a page’s performance falls into one of three categories: Good, Needs Work or Poor. However, the key feature of this tool, and what makes it so useful for page speed analysis, is its optimization list.

Located below the review score, this list highlights details related to possible optimization areas and good optimization practices currently in place on the page. By clicking the “Show how to fix” drop down for each suggestion you will see information related to the type of optimization found, why to implement changes and specific elements to correct.

In the image above, for example, compressing two images to reduce the number of bytes that need to be loaded could improve this web page’s speed. Making that change would cut the page’s image byte size by 28%.

Using WebPagetest and PageSpeed Insights together can give you a comprehensive view of a page’s speed performance and assist in identifying and executing on good optimization strategies.
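
PageSpeed Insights also exposes a web API, which is handy once you are checking more than a handful of URLs. Here is a minimal sketch against the current v5 endpoint; the response layout has changed across API versions, so treat the field names below as assumptions to verify against Google’s documentation.

    import requests

    PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

    def psi_performance_score(url, strategy="desktop"):
        """Fetch the 0-100 PageSpeed Insights performance score for a URL."""
        resp = requests.get(
            PSI_ENDPOINT, params={"url": url, "strategy": strategy}, timeout=60
        )
        resp.raise_for_status()
        data = resp.json()
        # Lighthouse reports the category score as 0-1; scale it to 0-100.
        return data["lighthouseResult"]["categories"]["performance"]["score"] * 100

    print(psi_performance_score("https://example.com"))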

Additional tools for analyzing page performance:

Supplemental readings:

Investigating rendering issues

How Googlebot (or Bingbot or MSNbot) crawls and renders a page can be completely different from what is intended, and this typically occurs when the crawler is blocked by a robots.txt file. If Google sees an incomplete or blank page, it assumes the user is having the same experience, which could affect how that page performs in the SERPs. In these instances, the Webmaster tool Fetch as Google is ideal for identifying how Google renders a page.

Located in Google Search Console, Fetch as Google allows you to test whether Googlebot can access the pages of a site, see how it renders them and determine whether any resources are blocked from the crawler.

When you look up a specific URL (or domain) Fetch as Google gives you two tabs of information: fetching, which displays the HTTP response of the specified URL; and rendering, which runs all resources on the page, provides a visual comparison of what Googlebot sees against what (Google estimates) the user sees and lists all resources Googlebot was not able to acquire.

For analysis purposes, the rendering tab is where you need to look. Begin by checking the rendered images to ensure Google and the user are seeing the same thing. Next, look at the list to see which resources were unreachable by Googlebot and why. If the visuals don’t show a complete page and/or important page elements are blocked from Googlebot, that’s an indication the page has rendering issues and may perform poorly in the search results.
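
One common rendering culprit, a disallow rule in robots.txt, is easy to rule out before you even open Search Console. Python’s standard-library robotparser can report whether a given crawler is allowed to fetch a URL; a small sketch follows (it catches robots.txt blocking only, not JavaScript rendering problems, and the URL is a placeholder).

    from urllib.parse import urlparse
    from urllib.robotparser import RobotFileParser

    def is_blocked(url, user_agent="Googlebot"):
        """Check whether a site's robots.txt blocks a crawler from a URL."""
        parts = urlparse(url)
        parser = RobotFileParser()
        parser.set_url("%s://%s/robots.txt" % (parts.scheme, parts.netloc))
        parser.read()
        return not parser.can_fetch(user_agent, url)

    for bot in ("Googlebot", "Bingbot"):
        status = "blocked" if is_blocked("https://example.com/page", bot) else "allowed"
        print("%s: %s" % (bot, status))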

Additional tools for investigating rendering issues:

Supplemental readings:

Checking backlink trends

Quality backlinks are extremely important for a strong web page, as they signal a page’s reliability and trustworthiness to search engines. Changes to a backlink profile could easily affect how a page is ranked in the SERPs, so checking them belongs in any webpage or website analysis. As a testament to their importance, there are several tools dedicated to backlink analytics. I have a preference for Ahrefs due to its comprehensive yet simple layout, which makes it great for on-the-spot research.

An SEO tool well known for its backlink reporting capabilities, Ahrefs measures several backlink performance factors and displays them in a series of dashboards and graphs. While there is plenty to review, for most analysis purposes I find the “Backlinks” metric and “New & lost backlinks” graph to be the best places to focus.

Located under the Site Explorer tab, “Backlinks” identifies the total number of backlinks pointing to a target website or URL. It also shows the quantitative changes in these links over the past 7 days with the difference represented by either a red (negative growth) or green (positive growth) subscript. In a practical setting, this information is ideal for providing quick insight into current backlink trend changes.

Under the same tab, the “New & lost backlinks” graph provides details about the total number of backlinks gained and lost by the target URL over a period of time.

The combination of these particular features works very well for common backlink analytics, such as tracking backlinks profile changes and identifying specific periods of link growth or decline.
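
Ahrefs also lets you export these reports to CSV, which makes the trend math easy to script. Here is a sketch of tallying net change from an export; the file name and the “Type” column with “new”/“lost” values are assumptions, so match them to your actual export’s headers.

    import csv
    from collections import Counter

    def net_backlink_change(csv_path):
        """Tally new vs. lost backlinks from an exported report."""
        counts = Counter()
        with open(csv_path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                # Assumes a 'Type' column marking each row 'new' or 'lost'.
                counts[row["Type"].strip().lower()] += 1
        net = counts["new"] - counts["lost"]
        print("New: %d, lost: %d, net: %+d" % (counts["new"], counts["lost"], net))
        return net

    net_backlink_change("new_lost_backlinks.csv")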

Additional tools for checking backlink trends:

Supplemental readings:

Creating your toolbox

This is only a sample of the tools you can use in your SEO analyses; plenty more are available to you, each with its own strengths and capabilities. So make sure to do your research and play around to find what works.

And if you take away only one thing from this post, remember that as you build your personal toolbox, what you choose to include should work best for your needs and the needs of your clients.


Will More Tweet Space Equal More Value for Your Twitter Audience?

Last month, Twitter made big headlines after announcing it was in the midst of testing 280-character tweets as a way to give users more room to “express” themselves. The announcement came a little more than a year after Twitter stopped including links and photos in character counts.

“We want every person around the world to easily express themselves on Twitter, so we’re doing something new: we’re going to try out a longer limit, 280 characters, in languages impacted by cramming (which is all except Japanese, Chinese, and Korean),” the company said in a press release on its blog. “Although this is only available to a small group right now, we want to be transparent about why we are excited to try this.”

Many marketers may feel like Christmas has come early. Let’s face it, writing a compelling and comprehensive tweet in just 140 characters is an art, one that seems almost impossible to master. With double the space, the pressure is off and marketers can unleash their full wordsmithing talent. Um, right?

Not so fast.

Twitter’s 140-character limit has been a defining platform characteristic since its inception — and something many users are extremely partial to.

“Twitter is about brevity. It’s what makes it such a great way to see what’s happening. Tweets get right to the point with the information or thoughts that matter. That is something we will never change,” Twitter said in its release. “We understand since many of you have been Tweeting for years, there may be an emotional attachment to 140 characters — we felt it, too.”

While Twitter is confident that giving users more real estate will make it easier and more fun to tweet, marketers should not look at it as an opportunity to rewrite their tweeting best practices. The real opportunity here is to discover whether or not you can use that extra space to deliver more value and resonance to your audience.


So, once “super-sized” tweets — as The Verge so eloquently called them — come to your account, don’t throw caution to the wind right away. Start with these actions:

#1 – Audit your existing Twitter initiatives.

Take a deep dive into your Twitter analytics dashboard to understand how your audience is already engaging with your tweets and taking action on them.

Of course, the basic metrics are important because they can serve as your benchmarks. But also go beyond the metrics to categorize what content garners the most engagement so you can draw more meaningful insights. For example, what topics seem to fire your audience up? How long are your most effective tweets? Are images or video part of your most successful tweets? Which tweets featuring your website content got the most clicks? What really seems to be working? What’s clearly not?

In addition, it’s worth taking a peek at your website analytics to understand how Twitter is impacting your business. Depending on what you uncover through the Twitter dashboard, you might be able to draw some more conclusions on what tweet content has value beyond awareness and engagement.
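
The tweet activity export from Twitter’s analytics dashboard makes this kind of categorization easy to script. As a sketch, here is one way to bucket average engagement rate by tweet length; the “Tweet text” and “engagement rate” column names match the standard export at the time of writing, but verify them against your own file.

    import csv
    from collections import defaultdict

    def engagement_by_length(csv_path, bucket_size=50):
        """Average engagement rate per tweet-length bucket from an analytics export."""
        buckets = defaultdict(list)
        with open(csv_path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                start = (len(row["Tweet text"]) // bucket_size) * bucket_size
                buckets[start].append(float(row["engagement rate"]))
        for start in sorted(buckets):
            rates = buckets[start]
            print("%d-%d chars: %.4f avg over %d tweets"
                  % (start, start + bucket_size - 1, sum(rates) / len(rates), len(rates)))

    engagement_by_length("tweet_activity.csv")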



#2 – Craft and launch test tweets.

Use the information you uncovered during your audit to build out and launch a test campaign featuring longer tweets. Of course, build these tweets in accordance with what you know is working best with your audience, but also give yourself some space to experiment a bit. We’d suggest running the test for at least a month to get enough data to lead into the next action.



#3 – Analyze results and tweak your test.

Now it’s time to dive back into your analytics to understand how your test tweets stack up against your legacy efforts. Did you see a measurable rise or decline in engagement? What kind of engagement did you receive (e.g. an increase in average comments or a decrease in average retweets)? Was there a certain type of content that really benefited from the extra character room?
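
The same analytics export works for the before-and-after math. Here is a sketch that compares your pre-test benchmark against the test month; both file names are placeholders for your own exports, and the “engagement rate” column is again an assumption to verify.

    import csv

    def average_engagement(csv_path):
        """Mean engagement rate across all tweets in an analytics export."""
        with open(csv_path, newline="", encoding="utf-8") as f:
            rates = [float(row["engagement rate"]) for row in csv.DictReader(f)]
        return sum(rates) / len(rates) if rates else 0.0

    # Compare the test month's export against the pre-test benchmark export.
    benchmark = average_engagement("tweets_benchmark.csv")
    test = average_engagement("tweets_280_test.csv")
    change = ((test - benchmark) / benchmark * 100) if benchmark else 0.0
    print("Benchmark: %.4f, test: %.4f (%+.1f%% change)" % (benchmark, test, change))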



The Bottom Line: Value Trumps Character Count

At the end of the day, character count simply doesn’t matter if what you’re sharing has no value or resonance with your audience. Since Twitter launched, the tight character count has been a creative restraint, challenging us all to say more with less. So, while you should certainly take advantage of the extra room when it makes sense, your primary objective should always be bringing insight and value to your audience. Because when they see the value you bring to the table, they’ll reward you for it.



What do you think about Twitter’s decision to double its character limit? Tell us in the comments section below.

