#was it Pizza Hut? it might have been dominoes or some other chain
kaninchen-reblogs · 2 years ago
Note
so why exactly did you install a period tracker mod in morrowind?
It’s not actually a mod, that’s part of the Bloodmoon expansion
pizzaplaceorlando1-blog · 6 years ago
Text
Pizza Places to eat
Pizza place orlando
Pizza is a hugely popular food made from flat, oven-baked bread topped with tomatoes. Restaurants where you can get pizza are popular spots. The atmosphere in a pizza place is often lively; you can relax and be serenaded with great music. Many pizza restaurants have games you can play while waiting for your food to be served. Chuck E. Cheese is a great pizza place to take kids, since they can get tokens to play games too.
pizza place lake nona
Apart from the games, great appetizers like breadsticks and salad are on offer to hold your appetite until your food is ready to be served. Some pizza restaurants offer other items like pasta dishes and calzones; just take a look at the menu to see the options. Dessert pizza is a recent addition that has also become a strong seller.
Pizza restaurants are also called pizzerias. In the United States, you may hear "pizza parlor" or "pizza shop." Large pizza restaurants now offer a lunch buffet in order to serve people quickly. A buffet also helps when you have a group of people with very different preferences about how their pizza should be topped, and if you have some hearty eaters in the family, it can reduce the cost of food while you dine out.
There are several popular chain pizza restaurants out there. The biggest, and the most expensive, is Pizza Hut, which has also been a source of concern for customers. Other popular and more affordable pizzerias include Little Caesars and Domino's. Papa John's is a common one too, though you have to get the pizza freshly prepared and then take it home to bake.
There are other pizza restaurants around that may not be nationally recognized but where you can get excellent, delicious pizza. They are fondly known as mom-and-pop pizza restaurants.
Park Pizza is a delicious pizza place and restaurant located in Orlando. Park Pizza & Brewing Co. is Lake Nona's home away from home; a welcoming retreat for the entire community. We serve up wood-fired pizza and house-brewed beer, plus a sense of neighborhood pride.
scottspizzatours · 7 years ago
Text
Why I Celebrated 10 Years of Pizza Touring with a Pizza Hut Personal Pan Pizza
Tuesday was the new Friday when I was growing up in suburban New Jersey in the 1990s. That’s the night my family would visit our local Pizza Hut, conveniently located directly in the middle of my hometown. It was one of those vintage red-roofed hut-shaped structures we now see disguised as burger joints, bookstores, and veterinary clinics (thanks to the blog Used to Be a Pizza Hut for documenting them). I remember so much about that Pizza Hut, from the salad bar in the center to the textured red Coca Cola cups to the stained glass light fixtures to the terribly slow service. We had at least half a dozen independent pizzerias in this 25,000 person commuter town, which we patronized regularly, but Pizza Hut was something else entirely. It was a pizza restaurant, not a slice shop like all the others. A visit there was reserved for special occasions: Tuesdays.
I don’t know if it was the policy of the Pizza Hut company or of my local location, but Tuesday night was specially designated Book It night. This was Pizza Hut’s program that awarded a free personal pan pizza to any participant who read five books in one month. Every participant got a neat hologram pin with five blank spaces in which we placed a gold star for every book we read. It was a brilliant ploy to bring in bodies on an otherwise slow night and it totally worked. Most tables had at least one kid enjoying a personal pan pizza while the rest of their families paid full price. There was even a free toy when Pizza Hut had a media partnership to tout. The only one I clearly remember was the pair of futuristic sunglasses that somehow promoted Back to the Future II. No, I don’t think I have them anymore.
Those visits to Pizza Hut stuck with me, as I’m sure they did with most kids whose schools made the devil’s pact with Pizza Hut. My personal pan pizza consumption definitely trailed off when I aged out of the Book It program and I honestly don’t remember the last time I had one. Come to think of it, I hadn’t really eaten much Pizza Hut pizza since the late 1990s with the exception of one time my friends and I ordered a “Four-by-Four” meal (I think it was four pizzas, four orders of bread sticks, four liters of soda, and four trips to the cardiologist). I knew the company had been sold to Pepsi Co by the Carney brothers, who founded the company in 1958, and again to Yum Brands. The magic vibe was missing so I just wrote them off as a piece of the past that will never be the same again.
As I started giving pizza tours of NYC and writing for pizza magazines and all that good stuff, one of the most frequently asked questions was that of the big chain pizzerias. People asked me every day which I prefer out of the Big Four (Pizza Hut, Domino’s, Papa John’s, and Little Caesar’s). Since I eat at some of the greatest pizzerias on Earth on a daily basis, there’s just not a lot of room in my life for the big chains. I briefly worked at Domino’s back in 2012 just because I was curious, I met “Papa” John Schnatter in 2015 as part of a food blogger event, and I even visited Little Caesar’s Store #001 last year as part of a Detroit pizza research trip; but Pizza Hut was a blind spot for me since the last time I had it was easily 15 years ago. I’ve been tempted on multiple occasions, especially since there’s a Pizza Hut counter in the Target right around the corner from my apartment, but the time was never right... until April 27, 2018.
Almost exactly six months after I rented a school bus and took 25 of my friends on a citywide pizza-eating adventure, I launched my company Scott’s Pizza Tours. My first official tour rolled on April 27, 2008. We hit four pizzerias across Manhattan and the Bronx and it was magical. But I was terrified. I had never owned a business, never run a business, never wanted to run a business. I just wanted to eat pizza. I had a blast every time I ran a tour, but I was always terrified that I wasn’t qualified to tell anybody anything about something as simple as pizza. I named the company Scott’s Pizza Tours because I wanted to be clear that this tour was my own personal perspective. A more generic name might imply that I was claiming some higher understanding or superiority and I desperately did not want to do that. I was only 26 years old when I started running tours and knew that plenty of my customers (if anybody even showed up) would be much older than me and therefore far more experienced in terms of quantity of pizza consumed. That’s why I did my best to research the objective points of pizza history so that the story I’d tell would be about the facts and not about the subjectivity of food preferences.
Running tours was a blast. I had more fun than I ever imagined and did my best to stay ten steps ahead of my tour guests’ questions. I studied so much that people were shocked when I’d respond to the question I’ve gotten on almost every single tour, “How long have you been doing this?” I remember answering, “Six months,” and wondering what it might be like when and if I hit three years, five years, or even longer! I keep data on every tour and wondered if I’d ever reach the point of being able to say I’d run 500 tours. Now when I tell people I’ve been doing this for over a decade and I’ve personally done over 2,500 tours, they don’t believe it. Then they ask my favorite question: “What’s your real job?” I’m having too much fun for this to be my real job, but that’s exactly what it is.
I knew I had to eat pizza on my tenth anniversary. Would I eat it from one of the pizzerias I frequent on tours? Would I go to a pizzeria I’ve had on my to-do list for the past decade? Would I take a trip to one of the famous pizzerias beyond the five boroughs of NYC? No. I decided to go to Target. Why take the chance of disappointing myself with a new pizza? Why do the boring thing and have a slice I had the day before? Why take the chance of pissing off all the pizzeria owners whose pizza I didn’t celebrate with when I could piss off ALL the pizzeria owners by checking in with an old friend? I knew what I had to do.
When I realized I was waiting in the PICK UP line, I scooted over to the ORDER HERE line and placed my request for a personal pepperoni pizza. I watched as it was slapped together, a real paint-by-numbers situation. It took about six minutes to roll across the conveyor belt oven (I timed it) and my walk home took another eight. I sat down in my living room and ate two slices. I didn’t expect it to be like the pizza I ate on Tuesday nights as a kid. I ate it because it felt like returning to my roots was the right thing to do. I’m glad I chose to celebrate 10 years of pizza touring with a personal pepperoni pizza from Pizza Hut. Now I know I never have to eat one again.
mitchamsocialuser · 3 years ago
Text
Australian Restaurant Chains
Have you ever thought of trying an Australian restaurant? The country has a lot to offer visitors and is one of the most popular international tourist destinations. It’s also home to some incredible culinary delights, so if you’re planning on visiting Australia, here are some tips for the best food in the country. Many Australian restaurants sell authentic Australian cuisine, which includes barramundi, marinated lamb, and authentic Australian barbecue. See the list of tips below, or use an Australia travel guide to generate your own list of restaurants.
If you want to go to Sydney, Australia, you’d be better off starting your search with a good Sydney restaurant chain. For instance, Bondy’s Irish pub and restaurant chain offers authentic Irish food. Or if you prefer something more American, try Tommy’s American restaurant chain, which is branching into other countries such as India and Mexico. There are many other great Sydney restaurant chains, so when you’re looking to book a vacation rental in Sydney, it pays to do a little research beforehand.
One of the best ways to determine which restaurant to try in Sydney is based on its reputation in the local community. For example, if you see people talking about how bad the food is in one of the local restaurants, you might consider going somewhere else. Even though most tourists come to Australia specifically to visit the Sydney Opera House and the Sydney Tower, there are still good restaurants around the area. If the local reviews mention that you should try out some local restaurants before visiting Australia, you’ll at least be able to narrow down your search.
There are a few different types of restaurants in Australia. There are chain restaurants that many people know about, such as McDonald’s, KFC, Pizza Hut, Domino’s Pizza, the Regent Hotel, and the Woolworths Hotel. However, there are also regional Australian restaurants, independent cafes, and some privately owned eateries. All these types have their own unique style and they serve many different types of foods. The specialty of some restaurants may just be their location, although most restaurants serve pretty much any type of food you can think of.
There are a number of well-known and popular Australian dining chain restaurants, including Woolworths, Lush Garden, and Macca’s. They have been in business for over 100 years, and even if they aren’t in the same league as the chains already mentioned, they are definitely well-respected in the dining community. Another of the popular Australian restaurants is the Lush Gourmand, which is located in an inner suburb of Sydney. It is a small but popular fast food restaurant chain that specializes in gourmet cuisine, although it also serves other types of foods.
Some of the smaller, more intimate cafes and restaurants in Australia are Club House, Le Radieux, Le Nuits D’Orient, and The Mapusa. There are also many smaller boutiques, cafes, and restaurants for those who prefer to dine without having a lot of people around. These include The Milton Road Hotel, The Roxy, and The Langhorne Hotel in Melbourne, which are just across the road from the Melbourne Zoo. If you are traveling to Australia, don’t forget to check out all these great Australian restaurants!
gruungenerd · 4 years ago
Text
I NEED YOUR HELP ON SOMETHING KINDA SILLY THATS BEEN PLAGUING ME THIS YEAR
TLDR: DID DOMINOES EVER SERVE OREO CHEESECAKE? (2003ish era)
okay okay. So when I was like... 4??? 4 years old when my older sisters (one is like 17 years older and the other is 20 years older than me) but anyways when they’d babysit me they’d order Dominoes pizza. I don’t even remember when my parents would order Pizza Hut pizza (not relevant). But 17 years ago when I was 4 years old, when they would order pizza they would order me a slice of Oreo cheesecake with it. I remember this so vividly. It would even come in its own little triangle box with the Oreo brand on it. But it might not have even been dominoes but I’m almost freaking positive. So can anyone and I mean ANYONE confirm that in 2003 Dominoes pizza served Oreo cheesecake? Or was there some other pizza chain in Lincoln, Nebraska that did??
I’ve been big tripping over this for most of the year tbh and I hate it 🥲🥲 Plz help
EDIT: it definitely wasn’t Burger King. My sisters were never big on fast food joints like that, especially when they babysat me. Fridays/weekends when they babysat were typically pizza nights since our parents would be out grocery shopping.
RANDOM ADD ON: I know I could like buy it anywhere but just the memory itself and location of where it’s from is what’s getting to me.
ONCE AGAIN: IT IS NOT BURGER KING. My sisters have never ever enjoyed Burger King of all places. THEY HAVE NEVER LIKED BURGER KING AND STILL DO NOT.
awanderingscribbler · 8 years ago
Text
New Post has been published on http://awanderingscribbler.com/2017/01/30/miss-home-buffalo-ny/
What I Miss From Home-Buffalo, NY
I never spend a lot of time at home. After I graduated high school I left Buffalo, NY for Pennsylvania. Then I moved to Maryland. All that time I was traveling for long periods. And while I missed home, I was close enough to go home for the weekend, or I always made it home between months abroad. I then moved to New Mexico, before finally settling in Texas where Steve and I will be for a few years.
I’m home right now, visiting family, attending a friend’s wedding, and having my baby shower. Being home for a few weeks makes me think about all of the things I miss from home. Of course I miss my family and friends and the familiar old places I grew up around, but I wanted to put together a list of things I miss from home so that you other Western New Yorkers will also feel homesick, or maybe get some inspiration to visit Buffalo and the surrounding areas.
  Chicken Finger Subs- this is so silly. It’s the easiest thing to make and it’s not anything special. But every time I’m home I get one. Like you’d expect, it’s just chicken fingers, covered in a sauce of your choice, put on a sub roll with lettuce, tomato and onion, provolone, and blue cheese- or at least that’s how I like mine- with hot sauce. I usually get mine from a tiny place in Attica- Meisner’s, but I doubt you’ll ever go there for fun, so if you’re in Buffalo, try Jim’s Steakout. They’re a chain, open super late, and have a lot of other sub options too.
Speaking of Bleu Cheese… I miss the way bleu cheese is at home. Maybe it’s just me being picky, but it does not taste the same anywhere else. If they even have blue cheese- that is. Most people think ranch is an equal substitute, but please no. I need bleu cheese.
Tim Horton’s– I don’t care what anyone else says, Tim Horton’s is better than any coffee, ever. I never liked coffee until some time in college when my friend would take me to Tim Horton’s. This made me love coffee. Besides coffee, I always loved their breakfast foods, lunch foods, soups, whatever they have. It’s hard to not visit five times a day when I’m home.
Pizza and Wings– You can get this pretty much anywhere, but not like in Buffalo. In some places, pizza and chicken wings don’t even go together but in Buffalo they definitely do. And… the pizza and wings are the best ever there. And it always has to be from a small mom and pop place with max two locations around the city. No ordering from Pizza Hut or Dominoes at our house.
Cheese Curds– I could eat an entire bag of cheese curds in one sitting. We always get them from Cutter’s, a small place near where I grew up. You can find them in stores, sold by Yancey’s Fancy and they sometimes ship them around the country at larger stores. They are the best things ever.
Chiavettas– Every time we go home we stock up on this amazing marinade. It’s popular around Buffalo to marinate chicken and honestly, I could sip on this like a fine scotch (though I don’t because it’s got so much salt in it I would die). It makes chicken so tasty and tender. I don’t know how it hasn’t caught on elsewhere.
Wegmans– This grocery store is spreading around the Northeast now. It’s like the biggest, semi-fanciest grocery store ever. They have great ready-made food or to-order food and buffets. They obviously also have a full grocery store and pharmacy where I love to go when I’m home. You just need to visit.
“Real” Chicken Wings- This goes along with the pizza. You probably know that Buffalo is known for wings.
Snow– Winter is not really winter unless there’s snow. In Texas this year we got one tiny dusting that lasted about three hours but that was it for our excitement. It’s weird to me to look out in December and January and just see brown grass instead of fluffy white pillows covering the ground. It is such a pain to drive in and plan adventures- what I’m reminded when I go home- but that’s winter. That’s what I grew up with and that’s what I enjoy.
  I realize most of this is food, but often that’s the way isn’t it? Food is such a big part of what makes a place special. When we want to remember a trip, we often cook the food from that place. We’re often reminded of somewhere just by the smells and tastes we experienced there.
  Would you add anything to this list?
isearchgoood · 5 years ago
Text
Generating Local Content at Scale - Whiteboard Friday
Posted by rjonesx.
Building local pages in any amount can be a painful task. It's hard to strike the right mix of on-topic content, expertise, and location, and the temptation to take shortcuts has always been tempered by the fact that good, unique content is almost impossible to scale.
In this week's edition of Whiteboard Friday, Russ Jones shares his favorite white-hat technique using natural language generation to create local pages to your heart's content.
Video Transcription
Hey, folks, this is Russ Jones here with Moz again to talk to you about important search engine optimization issues. Today I'm going to talk about one of my favorite techniques, something that I invented several years ago for a particular client and has just become more and more and more important over the years. 
Using natural language generation to create hyper-local content
I call this using natural language generation to create hyper-local content. Now I know that there's a bunch of long words in there. Some of you are familiar with them, some of you are not. 
So let me just kind of give you the scenario, which is probably one you've been familiar with at some point or another. Imagine you have a new client and that client has something like 18,000 locations across the United States.
Then you're told by Google you need to make unique content. Now, of course, it doesn't have to be 18,000. Even 100 locations can be difficult, not just to create unique content but to create uniquely valuable content that has some sort of relevance to that particular location. 
So what I want to do today is talk through one particular methodology that uses natural language generation in order to create these types of pages at scale.
What is natural language generation?
Now there might be a couple of questions that we need to just go ahead and get off of our plates at the beginning. So first, what is natural language generation? Well, natural language generation was actually originated for the purpose of generating weather warnings. You've actually probably seen this 100,000 times.
Whenever there's like a thunderstorm or let's say high wind warning or something, you've seen on the bottom of a television, if you're older like me, or you've gotten one on your cellphone and it says the National Weather Service has issued some sort of warning about some sort of weather alert that's dangerous and you need to take cover.
Well, the language that you see there is generated by a machine. It takes into account all of the data that they've arrived at regarding the weather, and then they put it into sentences that humans automatically understand. It's sort of like Mad Libs, but a lot more technical in the sense that what comes out of it, instead of being funny or silly, is actually really useful information.
That's our goal here. We want to use natural language generation to produce local pages for a business that has information that is very useful. 
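The weather-warning idea above boils down to rules plus templates: pick a sentence pattern based on the data, then fill in the blanks. Here's a minimal sketch of that; the function name, thresholds, and sentence patterns are all illustrative assumptions, not any particular NLG library's API.

```python
# Rule-plus-template NLG, weather-warning style: the data picks the
# sentence pattern, then we fill in the slots. All names and thresholds
# here are invented for illustration.

def describe_topping_demand(city, topping, ratio):
    """Turn a (city, topping, demand-ratio) data point into a sentence."""
    if ratio >= 2.0:
        pattern = "{topping} is by far the most searched-for pizza topping in {city}."
    elif ratio >= 1.2:
        pattern = "{topping} edges out the other toppings in {city} searches."
    else:
        pattern = "Toppings are a close race in {city}, with {topping} slightly ahead."
    return pattern.format(topping=topping.capitalize(), city=city)

print(describe_topping_demand("Seattle", "pepperoni", 2.5))
```

The point is that the output reads like a human sentence, but every word of it is driven by a real data point rather than random spinning.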
Isn't that black hat?
Now the question we almost always get or I at least almost always get is: Is this black hat? One of the things that we're not supposed to do is just auto-generate content.
So I'm going to take a moment towards the end to discuss exactly how we differentiate this type of content creation from just the standard, Mad Libs-style, plugging in different city words into content generation and what we're doing here. What we're doing here is providing uniquely valuable content to our customers, and because of that it passes the test of being quality content.
Let's look at an example
So let's do this. Let's talk about probably what I believe to be the easiest methodology, and I call this the Google Trends method. 
1. Choose items to compare
So let's step back for a second and talk about this business that has 18,000 locations. Now what do we know about this business? Well, businesses have a couple of things that are in common regardless of what industry they're in.
They either have like products or services, and those products and services might have styles or flavors or toppings, just all sorts of things that you can compare about the different items and services that they offer. Therein lies our opportunity to produce unique content across almost any region in the United States.
The tool we're going to use to accomplish that is Google Trends. So the first step that you're going to do is you're going to take this client, and in this case I'm going to just say it's a pizza chain, for example, and we're going to identify the items that we might want to compare. In this case, I would probably choose toppings for example.
So we would be interested in pepperoni and sausage and anchovies and God forbid pineapple, just all sorts of different types of toppings that might differ from region to region, from city to city, and from location to location in terms of demand. So then what we'll do is we'll go straight to Google Trends.
The best part about Google Trends is that they're not just providing information at a national level. You can narrow it down to city level, state level, or even in some cases to ZIP Code level, and because of this it allows us to collect hyper-local information about this particular category of services or products.
So, for example, this is actually a comparison of the demand for pepperoni versus mushroom versus sausage toppings in Seattle right now. When people in Seattle are Googling for pizza, most would be searching for pepperoni.
2. Collect data by location
So what you would do is you would take all of the different locations and you would collect this type of information about them. So you would know that, for example, here there is probably about 2.5 times more interest in pepperoni than there is in sausage pizza. Well, that's not going to be the same in every city and in every state. In fact, if you choose a lot of different toppings, you'll find all sorts of things, not just the comparison of how much people order them or want them, but perhaps how things have changed over time.
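As a sketch of what "collecting this type of information" might look like once the numbers are in hand: the interest scores below are invented stand-ins for Trends data (real values would come from an export or a client library such as pytrends), and the helper name is hypothetical.

```python
# Hypothetical Google Trends-style interest scores (0-100) per city.
# The numbers are made up for illustration; real ones would come from
# a Trends export or an unofficial client like pytrends.
interest = {
    "Seattle": {"pepperoni": 75, "sausage": 30, "mushroom": 45},
    "Chicago": {"pepperoni": 60, "sausage": 55, "mushroom": 25},
}

def demand_ratio(city_scores, a, b):
    """How many times more interest topping `a` draws than topping `b`."""
    return round(city_scores[a] / city_scores[b], 1)

# The "about 2.5 times more interest in pepperoni than sausage" case:
print(demand_ratio(interest["Seattle"], "pepperoni", "sausage"))  # 2.5
```

Running the same comparison over every location gives you a per-city table of relationships, which is the raw material for the generated text.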

For example, perhaps pepperoni has become less popular. If you were to look in certain cities, that probably is the case as vegetarian and veganism has increased. Well, the cool thing about natural language generation is that we can automatically extract out those kinds of unique relationships and then use that as data to inform the content that we end up putting on the pages on our site.
So, for example, let's say we took Seattle. The system would automatically be able to identify these different types of relationships. Let's say we know that pepperoni is the most popular. It might also be able to identify that let's say anchovies have gone out of fashion on pizzas. Almost nobody wants them anymore.
Something of that sort. But what's happening is we're slowly but surely coming up with these trends and data points that are interesting and useful for people who are about to order pizza. For example, if you're going to throw a party for 50 people and you don't know what they want, you can either do what everybody does pretty much, which is let's say one-third pepperoni, one-third plain, and one-third veggie, which is kind of the standard if you're like throwing a birthday party or something.
But if you landed on the Pizza Hut page or the Domino's page and it told you that in the city where you live people actually really like this particular topping, then you might actually make a better decision about what you're going to order. So we're actually providing useful information. 
3. Generate text
So this is where we're talking about generating the text from the trends and the data that we've grabbed from all of the locales.
Find local trends
Now the first step, of course, is just looking at local trends. But local trends aren't the only place we can look. We can go beyond that. For example, we can compare it to other locations. So it might be just as interesting that in Seattle people really like mushroom as a topping or something of that sort.
Compare to other locations
But it would also be really interesting to see if the toppings that are preferred, for example, in Chicago, where Chicago style pizza rules, versus New York are different. That would be something that would be interesting and could be automatically drawn out by natural language generation. Then finally, another thing that people tend to miss in trying to implement this solution is they think that they have to compare everything at once.
Choose subset of items
That's not the way you would do it. What you would do is you would choose the most interesting insights in each situation. Now we could get technical about how that might be accomplished. For example, we might say, okay, we can look at trends. Well, if all of the trends are flat, then we're probably not going to choose that information. But if we see that the relationship between one topping and another topping in this city is exceptionally different compared to other cities, well, that might be what gets selected.
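One simple way to operationalize "choose the most interesting insight" is to score each topping by how far its local share deviates from the average share in other cities, and keep the winner. The shares below are invented numbers and the function is a hypothetical sketch, not part of any described tool.

```python
# Sketch of insight selection: for each topping, measure how far the
# local share of searches deviates from the mean share elsewhere, and
# keep the biggest outlier. All shares are invented for illustration.

shares = {  # fraction of topping-related searches within each city
    "Seattle":  {"pepperoni": 0.50, "mushroom": 0.35, "anchovy": 0.15},
    "Chicago":  {"pepperoni": 0.60, "mushroom": 0.20, "anchovy": 0.20},
    "New York": {"pepperoni": 0.65, "mushroom": 0.20, "anchovy": 0.15},
}

def best_insight(city, shares):
    """Return (topping, deviation) for the topping most unusual in this city."""
    best = None
    for topping in shares[city]:
        others = [s[topping] for c, s in shares.items() if c != city]
        deviation = abs(shares[city][topping] - sum(others) / len(others))
        if best is None or deviation > best[1]:
            best = (topping, round(deviation, 3))
    return best

print(best_insight("Seattle", shares))  # mushroom stands out in Seattle
```

A real pipeline would add the flat-trend filter on top of this, but the shape is the same: rank candidate insights, write sentences only about the ones that clear the bar.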
4. Human review
Now here's where the question comes in about white hat versus black hat. So we've got this local page, and now we've generated all of this textual content about what people want on a pizza in that particular town or city. We need to make sure that this content is actually quality. That's where the final step comes in, which is just human review.
In my opinion, auto-generated content, as long as it is useful and valuable and has gone through the hands of a human editor who has identified that that's true, is every bit as good as if that human editor had just looked up that same data point and wrote the same sentences.
So I think in this case, especially when we're talking about providing data to such a diverse set of locales across the country, that it makes sense to take advantage of technology in a way that allows us to generate content and also allows us to serve the user the best possible and the most relevant content that we can.
So I hope that you will take this, spend some time looking up natural language generation, and ultimately be able to build much better local pages than you ever have before. Thanks.
Video transcription by Speechpad.com
theinjectlikes2 · 5 years ago
Text
Generating Local Content at Scale - Whiteboard Friday
Posted by rjonesx.
Building local pages in any amount can be a painful task. It's hard to strike the right mix of on-topic content, expertise, and location, and the temptation to take shortcuts has always been tempered by the fact that good, unique content is almost impossible to scale.
In this week's edition of Whiteboard Friday, Russ Jones shares his favorite white-hat technique using natural language generation to create local pages to your heart's content.
Click on the whiteboard image above to open a high-resolution version in a new tab!
Video Transcription
Hey, folks, this is Russ Jones here with Moz again to talk to you about important search engine optimization issues. Today I'm going to talk about one of my favorite techniques, something that I invented several years ago for a particular client and has just become more and more and more important over the years. 
Using natural language generation to create hyper-local content
I call this using natural language generation to create hyper-local content. Now I know that there's a bunch of long words in there. Some of you are familiar with them, some of you are not. 
So let me just kind of give you the scenario, which is probably one you've been familiar with at some point or another. Imagine you have a new client and that client has something like 18,000 locations across the United States.
Then you're told by Google you need to make unique content. Now, of course, it doesn't have to be 18,000. Even 100 locations can be difficult, not just to create unique content but to create uniquely valuable content that has some sort of relevance to that particular location. 
So what I want to do today is talk through one particular methodology that uses natural language generation in order to create these types of pages at scale.
What is natural language generation?
Now there might be a couple of questions that we need to just go ahead and get off of our plates at the beginning. So first, what is natural language generation? Well, natural language generation was actually originated for the purpose of generating weather warnings. You've actually probably seen this 100,000 times.
Whenever there's like a thunderstorm or let's say high wind warning or something, you've seen on the bottom of a television, if you're older like me, or you've gotten one on your cellphone and it says the National Weather Service has issued some sort of warning about some sort of weather alert that's dangerous and you need to take cover.
Well, the language that you see there is generated by a machine. It takes into account all of the data that's been gathered about the weather and puts it into sentences that humans automatically understand. It's sort of like Mad Libs, but a lot more technical in the sense that what comes out of it, instead of being funny or silly, is actually really useful information.
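The weather-warning pattern described here can be sketched as simple slot-filling, essentially Mad Libs driven by data. All field names below are hypothetical, not the National Weather Service's actual schema:

```python
def weather_alert(data):
    """Fill a fixed sentence frame from structured weather data."""
    return (
        f"The National Weather Service has issued a {data['alert']} "
        f"for {data['area']} until {data['until']}. "
        f"Winds may reach {data['wind_mph']} mph. Take cover if advised."
    )

# Hypothetical input record; a real system would receive this from a feed.
print(weather_alert({
    "alert": "high wind warning",
    "area": "King County",
    "until": "9 PM PDT",
    "wind_mph": 60,
}))
```

Real NLG systems layer grammar and variation on top of this idea, but the core is the same: structured data in, readable sentences out.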
That's our goal here. We want to use natural language generation to produce local pages for a business that has information that is very useful. 
Isn't that black hat?
Now the question we almost always get, or at least I almost always get, is: Is this black hat? One of the things that we're not supposed to do is just auto-generate content.
So I'm going to take a moment towards the end to discuss exactly how we differentiate this type of content creation from the standard Mad Libs-style approach of plugging different city names into boilerplate. What we're doing here is providing uniquely valuable content to our customers, and because of that it passes the test of being quality content.
Let's look at an example
So let's do this. Let's talk about probably what I believe to be the easiest methodology, and I call this the Google Trends method. 
1. Choose items to compare
So let's step back for a second and talk about this business that has 18,000 locations. Now what do we know about this business? Well, businesses have a couple of things in common regardless of what industry they're in.
They have products or services, and those products and services might have styles or flavors or toppings, all sorts of attributes you can compare across the different items and services they offer. Therein lies our opportunity to produce unique content across almost any region in the United States.
The tool we're going to use to accomplish that is Google Trends. So the first thing you're going to do is take this client, and in this case I'm going to just say it's a pizza chain, for example, and identify the items that we might want to compare. In this case, I would probably choose toppings.
So we would be interested in pepperoni and sausage and anchovies and God forbid pineapple, just all sorts of different types of toppings that might differ from region to region, from city to city, and from location to location in terms of demand. So then what we'll do is we'll go straight to Google Trends.
The best part about Google Trends is that they're not just providing information at a national level. You can narrow it down to city level, state level, or even in some cases to ZIP Code level, which allows us to collect hyper-local information about this particular category of products or services.
So, for example, this is actually a comparison of the demand for pepperoni versus mushroom versus sausage toppings in Seattle right now. So most people, when they're Googling for pizza, would be searching for pepperoni.
2. Collect data by location
So what you would do is you would take all of the different locations and you would collect this type of information about them. So you would know that, for example, here there is probably about 2.5 times more interest in pepperoni than there is in sausage pizza. Well, that's not going to be the same in every city and in every state. In fact, if you choose a lot of different toppings, you'll find all sorts of things, not just the comparison of how much people order them or want them, but perhaps how things have changed over time.
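That roughly 2.5-times relationship can be computed mechanically once per-location scores are collected. A minimal sketch, assuming the 0-100 interest scores have already been exported from Trends' interest-by-region view (all numbers below are invented):

```python
# Invented interest scores per city, standing in for exported Trends data.
interest = {
    "Seattle": {"pepperoni": 75, "sausage": 30, "mushroom": 45},
    "Chicago": {"pepperoni": 60, "sausage": 80, "mushroom": 25},
}

def topping_ratios(city_scores):
    """Express every topping's interest relative to the city's favorite."""
    favorite = max(city_scores.values())
    return {name: round(favorite / score, 1)
            for name, score in city_scores.items() if score}

# In Seattle, pepperoni draws 75 / 30 = 2.5x the interest of sausage.
print(topping_ratios(interest["Seattle"]))
```

Run the same computation for every location and you have a table of city-specific relationships to feed the generator.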

For example, perhaps pepperoni has become less popular. If you were to look in certain cities, that probably is the case as vegetarian and veganism has increased. Well, the cool thing about natural language generation is that we can automatically extract out those kinds of unique relationships and then use that as data to inform the content that we end up putting on the pages on our site.
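One simple way to surface that kind of over-time relationship is to compare the recent half of an interest series against the older half. This is a sketch on made-up monthly data, with an illustrative 15% threshold, not the author's actual method:

```python
def trend_direction(series, threshold=0.15):
    """Label a time series rising, declining, or flat by comparing the
    mean of its second half to the mean of its first half."""
    half = len(series) // 2
    first = sum(series[:half]) / half
    second = sum(series[half:]) / (len(series) - half)
    change = (second - first) / first
    if change > threshold:
        return "rising"
    if change < -threshold:
        return "declining"
    return "flat"

# Invented monthly interest in "pepperoni pizza" for one city:
pepperoni = [82, 80, 78, 75, 70, 68, 64, 60, 58, 55, 52, 50]
print(trend_direction(pepperoni))  # "declining"
```

A production system might fit a regression line instead, but even this crude split is enough to flag "pepperoni has become less popular here" as a candidate insight.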
So, for example, let's say we took Seattle. The system would automatically be able to identify these different types of relationships. Let's say we know that pepperoni is the most popular. It might also be able to identify that let's say anchovies have gone out of fashion on pizzas. Almost nobody wants them anymore.
Something of that sort. But what's happening is we're slowly but surely coming up with these trends and data points that are interesting and useful for people who are about to order pizza. For example, if you're going to throw a party for 50 people and you don't know what they want, you can do what pretty much everybody does, which is let's say one-third pepperoni, one-third plain, and one-third veggie, which is kind of the standard if you're throwing a birthday party or something.
But if you landed on the Pizza Hut page or the Domino's page and it told you that in the city where you live people actually really like this particular topping, then you might actually make a better decision about what you're going to order. So we're actually providing useful information. 
3. Generate text
So this is where we're talking about generating the text from the trends and the data that we've grabbed from all of the locales.
Find local trends
Now the first step, of course, is just looking at local trends. But local trends aren't the only place we can look. We can go beyond that. For example, we can compare it to other locations. So it might be just as interesting that in Seattle people really like mushroom as a topping or something of that sort.
Compare to other locations
But it would also be really interesting to see whether the toppings that are preferred in, for example, Chicago, where Chicago-style pizza rules, differ from those in New York. That would be something interesting that could be automatically drawn out by natural language generation. Then finally, another thing people tend to miss in trying to implement this solution is that they think they have to compare everything at once.
Choose subset of items
That's not the way you would do it. What you would do is choose the most interesting insights in each situation. Now we could get technical about how that might be accomplished. For example, we might say, okay, we can look at trends. Well, if all of the trends are flat, then we're probably not going to choose that information. But if we see that the relationship between one topping and another in this city is exceptionally different compared to other cities, well, that might be what gets selected.
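That selection logic, skip the flat comparisons and keep the relationships that deviate most from everywhere else, might be sketched like this. The 25% cutoff and every score here are assumptions, not real Trends data:

```python
# Invented national and per-city interest scores (0-100).
national = {"pepperoni": 70, "sausage": 50, "anchovy": 10, "mushroom": 40}
city_scores = {
    "Seattle": {"pepperoni": 75, "sausage": 30, "anchovy": 4, "mushroom": 60},
}

def select_insights(city, data, baseline, max_items=2):
    """Keep the toppings whose local interest deviates most from the
    national figure; skip anything within 25% of it (roughly flat)."""
    scored = []
    for topping, score in data[city].items():
        deviation = abs(score - baseline[topping]) / baseline[topping]
        if deviation < 0.25:
            continue  # too close to the national picture to be interesting
        scored.append((deviation, topping, score, baseline[topping]))
    scored.sort(reverse=True)
    return [(t, s, n) for _, t, s, n in scored[:max_items]]

# Anchovy and mushroom stand out in Seattle; pepperoni is near-flat and dropped.
print(select_insights("Seattle", city_scores, national))
```

The output is a short ranked list per city, which is exactly what you want feeding the sentence templates: only the genuinely unusual facts.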
4. Human review
Now here's where the question comes in about white hat versus black hat. So we've got this local page, and now we've generated all of this textual content about what people want on a pizza in that particular town or city. We need to make sure that this content is actually quality. That's where the final step comes in, which is just human review.
In my opinion, auto-generated content, as long as it is useful and valuable and has gone through the hands of a human editor who has verified that that's true, is every bit as good as if that human editor had looked up the same data points and written the same sentences.
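One way to wire in that editorial step is to have every generated sentence carry an approval flag that a human must flip before it ships. The templates and field names below are a hypothetical sketch, not the author's actual system:

```python
def draft_sentence(city, topping, local, national):
    """Render one selected insight as a draft sentence awaiting review."""
    r = abs(local - national) / national
    if local > national:
        text = (f"{topping.capitalize()} is unusually popular in {city}: "
                f"searches run about {r:.0%} above the national level.")
    else:
        text = (f"{topping.capitalize()} has fallen out of favor in {city}, "
                f"drawing about {r:.0%} less interest than nationally.")
    return {"text": text, "approved": False}  # editor flips this after review

draft = draft_sentence("Seattle", "mushroom", 60, 40)
print(draft["text"])
```

Pages would only publish sentences whose `approved` flag an editor has set, which is the difference between reviewed, data-backed copy and unreviewed auto-generation.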
So I think in this case, especially when we're talking about providing data to such a diverse set of locales across the country, that it makes sense to take advantage of technology in a way that allows us to generate content and also allows us to serve the user the best possible and the most relevant content that we can.
So I hope that you will take this, spend some time looking up natural language generation, and ultimately be able to build much better local pages than you ever have before. Thanks.
Video transcription by Speechpad.com
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
whitelabelseoreseller · 5 years ago
Text
Generating Local Content at Scale - Whiteboard Friday
Posted by rjonesx.
Building local pages in any amount can be a painful task. It's hard to strike the right mix of on-topic content, expertise, and location, and the temptation to take shortcuts has always been tempered by the fact that good, unique content is almost impossible to scale.
In this week's edition of Whiteboard Friday, Russ Jones shares his favorite white-hat technique using natural language generation to create local pages to your heart's content.
Click on the whiteboard image above to open a high-resolution version in a new tab!
Video Transcription
Hey, folks, this is Russ Jones here with Moz again to talk to you about important search engine optimization issues. Today I'm going to talk about one of my favorite techniques, something that I invented several years ago for a particular client and has just become more and more and more important over the years. 
Using natural language generation to create hyper-local content
I call this using natural language generation to create hyper-local content. Now I know that there's a bunch of long words in there. Some of you are familiar with them, some of you are not. 
So let me just kind of give you the scenario, which is probably one you've been familiar with at some point or another. Imagine you have a new client and that client has something like 18,000 locations across the United States.
Then you're told by Google you need to make unique content. Now, of course, it doesn't have to be 18,000. Even 100 locations can be difficult, not just to create unique content but to create uniquely valuable content that has some sort of relevance to that particular location. 
So what I want to do today is talk through one particular methodology that uses natural language generation in order to create these types of pages at scale.
What is natural language generation?
Now there might be a couple of questions that we need to just go ahead and get off of our plates at the beginning. So first, what is natural language generation? Well, natural language generation was actually originated for the purpose of generating weather warnings. You've actually probably seen this 100,000 times.
Whenever there's like a thunderstorm or let's say high wind warning or something, you've seen on the bottom of a television, if you're older like me, or you've gotten one on your cellphone and it says the National Weather Service has issued some sort of warning about some sort of weather alert that's dangerous and you need to take cover.
Well, the language that you see there is generated by a machine. It takes into account all of the data that they've arrived at regarding the weather, and then they put it into sentences that humans automatically understand. It's sort of like Mad Libs, but a lot more technical in the sense that what comes out of it, instead of being funny or silly, is actually really useful information.
That's our goal here. We want to use natural language generation to produce local pages for a business that has information that is very useful. 
Isn't that black hat?
Now the question we almost always get or I at least almost always get is: Is this black hat? One of the things that we're not supposed to do is just auto-generate content.
So I'm going to take a moment towards the end to discuss exactly how we differentiate this type of content creation from just the standard, Mad Libs-style, plugging in different city words into content generation and what we're doing here. What we're doing here is providing uniquely valuable content to our customers, and because of that it passes the test of being quality content.
Let's look at an example
So let's do this. Let's talk about probably what I believe to be the easiest methodology, and I call this the Google Trends method. 
1. Choose items to compare
So let's step back for a second and talk about this business that has 18,000 locations. Now what do we know about this business? Well, businesses have a couple of things that are in common regardless of what industry they're in.
They either have like products or services, and those products and services might have styles or flavors or toppings, just all sorts of things that you can compare about the different items and services that they offer. Therein lies our opportunity to produce unique content across almost any region in the United States.
The tool we're going to use to accomplish that is Google Trends. So the first step that you're going to do is you're going to take this client, and in this case I'm going to just say it's a pizza chain, for example, and we're going to identify the items that we might want to compare. In this case, I would probably choose toppings for example.
So we would be interested in pepperoni and sausage and anchovies and God forbid pineapple, just all sorts of different types of toppings that might differ from region to region, from city to city, and from location to location in terms of demand. So then what we'll do is we'll go straight to Google Trends.
The best part about Google Trends is that they're not just providing information at a national level. You can narrow it down to city level, state level, or even in some cases to ZIP Code level, and because of this it allows us to collect hyper-local information about this particular category of services or products.
So, for example, this is actually a comparison of the demand for pepperoni versus mushroom versus sausage toppings in Seattle right now. So most people, when people are Googling for pizza, would be searching for pepperoni.
2. Collect data by location
So what you would do is you would take all of the different locations and you would collect this type of information about them. So you would know that, for example, here there is probably about 2.5 times more interest in pepperoni than there is in sausage pizza. Well, that's not going to be the same in every city and in every state. In fact, if you choose a lot of different toppings, you'll find all sorts of things, not just the comparison of how much people order them or want them, but perhaps how things have changed over time.

For example, perhaps pepperoni has become less popular. If you were to look in certain cities, that probably is the case as vegetarian and veganism has increased. Well, the cool thing about natural language generation is that we can automatically extract out those kinds of unique relationships and then use that as data to inform the content that we end up putting on the pages on our site.
So, for example, let's say we took Seattle. The system would automatically be able to identify these different types of relationships. Let's say we know that pepperoni is the most popular. It might also be able to identify that let's say anchovies have gone out of fashion on pizzas. Almost nobody wants them anymore.
Something of that sort. But what's happening is we're slowly but surely coming up with these trends and data points that are interesting and useful for people who are about to order pizza. For example, if you're going to throw a party for 50 people and you don't know what they want, you can either do what everybody does pretty much, which is let's say one-third pepperoni, one-third plain, and one-third veggie, which is kind of the standard if you're like throwing a birthday party or something.
But if you landed on the Pizza Hut page or the Domino's page and it told you that in the city where you live people actually really like this particular topping, then you might actually make a better decision about what you're going to order. So we're actually providing useful information. 
3. Generate text
So this is where we're talking about generating the text from the trends and the data that we've grabbed from all of the locales.
Find local trends
Now the first step, of course, is just looking at local trends. But local trends aren't the only place we can look. We can go beyond that. For example, we can compare it to other locations. So it might be just as interesting that in Seattle people really like mushroom as a topping or something of that sort.
Compare to other locations
But it would also be really interesting to see if the toppings that are preferred, for example, in Chicago, where Chicago style pizza rules, versus New York are different. That would be something that would be interesting and could be automatically drawn out by natural language generation. Then finally, another thing that people tend to miss in trying to implement this solution is they think that they have to compare everything at once.
Choose subset of items
That's not the way you would do it. What you would do is you would choose the most interesting insights in each situation. Now we could get technical about how that might be accomplished. For example, we might say, okay, we can look at trends. Well, if all of the trends are flat, then we're probably not going to choose that information. But we see that the relationship between one topping and another topping in this city is exceptionally different compared to other cities, well, that might be what gets selected.
4. Human review
Now here's where the question comes in about white hat versus black hat. So we've got this local page, and now we've generated all of this textual content about what people want on a pizza in that particular town or city. We need to make sure that this content is actually quality. That's where the final step comes in, which is just human review.
In my opinion, auto-generated content, as long as it is useful and valuable and has gone through the hands of a human editor who has identified that that's true, is every bit as good as if that human editor had just looked up that same data point and wrote the same sentences.
So I think in this case, especially when we're talking about providing data to such a diverse set of locales across the country, that it makes sense to take advantage of technology in a way that allows us to generate content and also allows us to serve the user the best possible and the most relevant content that we can.
So I hope that you will take this, spend some time looking up natural language generation, and ultimately be able to build much better local pages than you ever have before. Thanks.
Video transcription by Speechpad.com
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
from The Moz Blog http://tracking.feedpress.it/link/9375/13395513
0 notes
epackingvietnam · 5 years ago
Text
Generating Local Content at Scale - Whiteboard Friday
Posted by rjonesx.
Building local pages in any amount can be a painful task. It's hard to strike the right mix of on-topic content, expertise, and location, and the temptation to take shortcuts has always been tempered by the fact that good, unique content is almost impossible to scale.
In this week's edition of Whiteboard Friday, Russ Jones shares his favorite white-hat technique using natural language generation to create local pages to your heart's content.
Click on the whiteboard image above to open a high-resolution version in a new tab!
Video Transcription
Hey, folks, this is Russ Jones here with Moz again to talk to you about important search engine optimization issues. Today I'm going to talk about one of my favorite techniques, something that I invented several years ago for a particular client and has just become more and more and more important over the years. 
Using natural language generation to create hyper-local content
I call this using natural language generation to create hyper-local content. Now I know that there's a bunch of long words in there. Some of you are familiar with them, some of you are not. 
So let me just kind of give you the scenario, which is probably one you've been familiar with at some point or another. Imagine you have a new client and that client has something like 18,000 locations across the United States.
Then you're told by Google you need to make unique content. Now, of course, it doesn't have to be 18,000. Even 100 locations can be difficult, not just to create unique content but to create uniquely valuable content that has some sort of relevance to that particular location. 
So what I want to do today is talk through one particular methodology that uses natural language generation in order to create these types of pages at scale.
What is natural language generation?
Now there might be a couple of questions that we need to just go ahead and get off of our plates at the beginning. So first, what is natural language generation? Well, natural language generation was actually originated for the purpose of generating weather warnings. You've actually probably seen this 100,000 times.
Whenever there's like a thunderstorm or let's say high wind warning or something, you've seen on the bottom of a television, if you're older like me, or you've gotten one on your cellphone and it says the National Weather Service has issued some sort of warning about some sort of weather alert that's dangerous and you need to take cover.
Well, the language that you see there is generated by a machine. It takes into account all of the data that they've arrived at regarding the weather, and then they put it into sentences that humans automatically understand. It's sort of like Mad Libs, but a lot more technical in the sense that what comes out of it, instead of being funny or silly, is actually really useful information.
That's our goal here. We want to use natural language generation to produce local pages for a business that has information that is very useful. 
Isn't that black hat?
Now the question we almost always get or I at least almost always get is: Is this black hat? One of the things that we're not supposed to do is just auto-generate content.
So I'm going to take a moment towards the end to discuss exactly how we differentiate this type of content creation from just the standard, Mad Libs-style, plugging in different city words into content generation and what we're doing here. What we're doing here is providing uniquely valuable content to our customers, and because of that it passes the test of being quality content.
Let's look at an example
So let's do this. Let's talk about probably what I believe to be the easiest methodology, and I call this the Google Trends method. 
1. Choose items to compare
So let's step back for a second and talk about this business that has 18,000 locations. Now what do we know about this business? Well, businesses have a couple of things that are in common regardless of what industry they're in.
They either have like products or services, and those products and services might have styles or flavors or toppings, just all sorts of things that you can compare about the different items and services that they offer. Therein lies our opportunity to produce unique content across almost any region in the United States.
The tool we're going to use to accomplish that is Google Trends. So the first step that you're going to do is you're going to take this client, and in this case I'm going to just say it's a pizza chain, for example, and we're going to identify the items that we might want to compare. In this case, I would probably choose toppings for example.
So we would be interested in pepperoni and sausage and anchovies and God forbid pineapple, just all sorts of different types of toppings that might differ from region to region, from city to city, and from location to location in terms of demand. So then what we'll do is we'll go straight to Google Trends.
The best part about Google Trends is that they're not just providing information at a national level. You can narrow it down to city level, state level, or even in some cases to ZIP Code level, and because of this it allows us to collect hyper-local information about this particular category of services or products.
So, for example, this is actually a comparison of the demand for pepperoni versus mushroom versus sausage toppings in Seattle right now. So most people, when people are Googling for pizza, would be searching for pepperoni.
2. Collect data by location
So what you would do is you would take all of the different locations and you would collect this type of information about them. So you would know that, for example, here there is probably about 2.5 times more interest in pepperoni than there is in sausage pizza. Well, that's not going to be the same in every city and in every state. In fact, if you choose a lot of different toppings, you'll find all sorts of things, not just the comparison of how much people order them or want them, but perhaps how things have changed over time.

For example, perhaps pepperoni has become less popular. If you were to look in certain cities, that probably is the case as vegetarian and veganism has increased. Well, the cool thing about natural language generation is that we can automatically extract out those kinds of unique relationships and then use that as data to inform the content that we end up putting on the pages on our site.
So, for example, let's say we took Seattle. The system would automatically be able to identify these different types of relationships. Let's say we know that pepperoni is the most popular. It might also be able to identify that let's say anchovies have gone out of fashion on pizzas. Almost nobody wants them anymore.
Something of that sort. But what's happening is we're slowly but surely coming up with these trends and data points that are interesting and useful for people who are about to order pizza. For example, if you're going to throw a party for 50 people and you don't know what they want, you can either do what everybody does pretty much, which is let's say one-third pepperoni, one-third plain, and one-third veggie, which is kind of the standard if you're like throwing a birthday party or something.
But if you landed on the Pizza Hut page or the Domino's page and it told you that in the city where you live people actually really like this particular topping, then you might actually make a better decision about what you're going to order. So we're actually providing useful information. 
3. Generate text
So this is where we're talking about generating the text from the trends and the data that we've grabbed from all of the locales.
Find local trends
Now the first step, of course, is just looking at local trends. But local trends aren't the only place we can look. We can go beyond that. For example, we can compare it to other locations. So it might be just as interesting that in Seattle people really like mushroom as a topping or something of that sort.
Compare to other locations
But it would also be really interesting to see if the toppings that are preferred, for example, in Chicago, where Chicago style pizza rules, versus New York are different. That would be something that would be interesting and could be automatically drawn out by natural language generation. Then finally, another thing that people tend to miss in trying to implement this solution is they think that they have to compare everything at once.
Choose subset of items
That's not the way you would do it. What you would do is you would choose the most interesting insights in each situation. Now we could get technical about how that might be accomplished. For example, we might say, okay, we can look at trends. Well, if all of the trends are flat, then we're probably not going to choose that information. But we see that the relationship between one topping and another topping in this city is exceptionally different compared to other cities, well, that might be what gets selected.
4. Human review
Now here's where the question comes in about white hat versus black hat. So we've got this local page, and now we've generated all of this textual content about what people want on a pizza in that particular town or city. We need to make sure that this content is actually quality. That's where the final step comes in, which is just human review.
In my opinion, auto-generated content, as long as it is useful and valuable and has gone through the hands of a human editor who has identified that that's true, is every bit as good as if that human editor had just looked up that same data point and wrote the same sentences.
So I think in this case, especially when we're talking about providing data to such a diverse set of locales across the country, that it makes sense to take advantage of technology in a way that allows us to generate content and also allows us to serve the user the best possible and the most relevant content that we can.
So I hope that you will take this, spend some time looking up natural language generation, and ultimately be able to build much better local pages than you ever have before. Thanks.
Video transcription by Speechpad.com
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
#túi_giấy_epacking_việt_nam #túi_giấy_epacking #in_túi_giấy_giá_rẻ #in_túi_giấy #epackingvietnam #tuigiayepacking
0 notes
evempierson · 5 years ago
Text
Generating Local Content at Scale - Whiteboard Friday
Posted by rjonesx.
Building local pages in any amount can be a painful task. It's hard to strike the right mix of on-topic content, expertise, and location, and the temptation to take shortcuts has always been tempered by the fact that good, unique content is almost impossible to scale.
In this week's edition of Whiteboard Friday, Russ Jones shares his favorite white-hat technique using natural language generation to create local pages to your heart's content.
Click on the whiteboard image above to open a high-resolution version in a new tab!
Video Transcription
Hey, folks, this is Russ Jones here with Moz again to talk to you about important search engine optimization issues. Today I'm going to talk about one of my favorite techniques, something that I invented several years ago for a particular client and has just become more and more and more important over the years. 
Using natural language generation to create hyper-local content
I call this using natural language generation to create hyper-local content. Now I know that there's a bunch of long words in there. Some of you are familiar with them, some of you are not. 
So let me just kind of give you the scenario, which is probably one you've been familiar with at some point or another. Imagine you have a new client and that client has something like 18,000 locations across the United States.
Then you're told by Google you need to make unique content. Now, of course, it doesn't have to be 18,000. Even 100 locations can be difficult, not just to create unique content but to create uniquely valuable content that has some sort of relevance to that particular location. 
So what I want to do today is talk through one particular methodology that uses natural language generation in order to create these types of pages at scale.
What is natural language generation?
Now there might be a couple of questions that we need to just go ahead and get off of our plates at the beginning. So first, what is natural language generation? Well, natural language generation was actually originated for the purpose of generating weather warnings. You've actually probably seen this 100,000 times.
Whenever there's, let's say, a thunderstorm or a high wind warning, you've seen it on the bottom of a television, if you're older like me, or you've gotten one on your cellphone: the National Weather Service has issued some sort of warning about a weather alert that's dangerous, and you need to take cover.
Well, the language that you see there is generated by a machine. It takes into account all of the data that they've arrived at regarding the weather, and then they put it into sentences that humans automatically understand. It's sort of like Mad Libs, but a lot more technical in the sense that what comes out of it, instead of being funny or silly, is actually really useful information.
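To make the Mad Libs analogy concrete, here's a minimal template-based NLG sketch. The warning data, wording, and the wind-speed threshold are all made up for illustration; real weather-warning generators are far more sophisticated, but the core idea is the same: map structured data to sentences humans automatically understand.

```python
# Minimal template-based NLG sketch (hypothetical data and thresholds).
# Structured input goes in, a human-readable sentence comes out.

def weather_warning(data):
    # Pick a severity word from the wind speed (58 mph is an invented cutoff).
    severity = "dangerous" if data["wind_mph"] >= 58 else "strong"
    return (
        f"The National Weather Service has issued a {data['event']} "
        f"for {data['area']} until {data['until']}. "
        f"Winds of {data['wind_mph']} mph are expected; "
        f"conditions may be {severity}."
    )

print(weather_warning({
    "event": "High Wind Warning",
    "area": "King County",
    "until": "9 PM PDT",
    "wind_mph": 60,
}))
```

Swap in pizza-topping data instead of wind speeds and you have the same machinery producing local-page copy.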
That's our goal here. We want to use natural language generation to produce local pages for a business that has information that is very useful. 
Isn't that black hat?
Now the question we almost always get or I at least almost always get is: Is this black hat? One of the things that we're not supposed to do is just auto-generate content.
So I'm going to take a moment towards the end to discuss exactly how we differentiate this type of content creation from just the standard, Mad Libs-style, plugging in different city words into content generation and what we're doing here. What we're doing here is providing uniquely valuable content to our customers, and because of that it passes the test of being quality content.
Let's look at an example
So let's do this. Let's talk about probably what I believe to be the easiest methodology, and I call this the Google Trends method. 
1. Choose items to compare
So let's step back for a second and talk about this business that has 18,000 locations. Now what do we know about this business? Well, businesses have a couple of things in common regardless of what industry they're in.
They have products or services, and those products and services might have styles or flavors or toppings, all sorts of things you can compare about the different items and services they offer. Therein lies our opportunity to produce unique content across almost any region in the United States.
The tool we're going to use to accomplish that is Google Trends. So the first step that you're going to do is you're going to take this client, and in this case I'm going to just say it's a pizza chain, for example, and we're going to identify the items that we might want to compare. In this case, I would probably choose toppings for example.
So we would be interested in pepperoni and sausage and anchovies and God forbid pineapple, just all sorts of different types of toppings that might differ from region to region, from city to city, and from location to location in terms of demand. So then what we'll do is we'll go straight to Google Trends.
The best part about Google Trends is that they're not just providing information at a national level. You can narrow it down to city level, state level, or even in some cases to ZIP Code level, and because of this it allows us to collect hyper-local information about this particular category of services or products.
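Here's one way that geo-level collection could be scripted. The live branch uses pytrends, an unofficial, community-maintained Google Trends client (not a Google product), and is an untested sketch; the offline branch returns invented sample numbers shaped like the real response (regions by keywords, values are 0-100 relative interest) so the rest of the pipeline can be developed without hitting the network.

```python
# Sketch of pulling topping interest by region. "pytrends" is an
# unofficial third-party library; keywords, timeframe, and all sample
# numbers below are hypothetical.

def topping_interest_by_region(keywords, use_live_data=False):
    if use_live_data:
        from pytrends.request import TrendReq  # pip install pytrends
        pytrends = TrendReq(hl="en-US", tz=360)
        pytrends.build_payload(keywords, timeframe="today 12-m", geo="US")
        # DataFrame indexed by region, one column per keyword.
        return pytrends.interest_by_region(resolution="REGION")
    # Offline sample data (ignores `keywords`), shaped like the live result.
    return {
        "Washington": {"pepperoni pizza": 75, "sausage pizza": 30},
        "Illinois":   {"pepperoni pizza": 60, "sausage pizza": 55},
    }

scores = topping_interest_by_region(["pepperoni pizza", "sausage pizza"])
print(scores["Washington"])
```

In practice you would loop this over every location's geo code and cache the results, since Trends lookups are rate-limited.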
So, for example, this is actually a comparison of the demand for pepperoni versus mushroom versus sausage toppings in Seattle right now. Of the people Googling for these pizza toppings, most would be searching for pepperoni.
2. Collect data by location
So what you would do is you would take all of the different locations and you would collect this type of information about them. So you would know that, for example, here there is probably about 2.5 times more interest in pepperoni than there is in sausage pizza. Well, that's not going to be the same in every city and in every state. In fact, if you choose a lot of different toppings, you'll find all sorts of things, not just the comparison of how much people order them or want them, but perhaps how things have changed over time.
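That "about 2.5 times more interest" figure falls out mechanically once you have scores per location. A small sketch, using hypothetical interest values:

```python
# Per-location topping ratios from hypothetical interest scores.
interest = {
    "Seattle": {"pepperoni": 75, "sausage": 30},
    "Chicago": {"pepperoni": 55, "sausage": 50},
}

def topping_ratio(city, a, b, data=interest):
    # How many times more interest topping `a` draws than topping `b`.
    return round(data[city][a] / data[city][b], 1)

print(topping_ratio("Seattle", "pepperoni", "sausage"))  # 75 / 30 = 2.5
```

Run the same computation for every city and every topping pair and the differences from location to location become the raw material for unique sentences.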

For example, perhaps pepperoni has become less popular. If you were to look in certain cities, that probably is the case as vegetarian and veganism has increased. Well, the cool thing about natural language generation is that we can automatically extract out those kinds of unique relationships and then use that as data to inform the content that we end up putting on the pages on our site.
So, for example, let's say we took Seattle. The system would automatically be able to identify these different types of relationships. Let's say we know that pepperoni is the most popular. It might also be able to identify that let's say anchovies have gone out of fashion on pizzas. Almost nobody wants them anymore.
Something of that sort. But what's happening is we're slowly but surely coming up with these trends and data points that are interesting and useful for people who are about to order pizza. For example, if you're going to throw a party for 50 people and you don't know what they want, you can either do what pretty much everybody does, which is, let's say, one-third pepperoni, one-third plain, and one-third veggie, which is kind of the standard if you're throwing a birthday party or something.
But if you landed on the Pizza Hut page or the Domino's page and it told you that in the city where you live people actually really like this particular topping, then you might actually make a better decision about what you're going to order. So we're actually providing useful information. 
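An observation like "anchovies have gone out of fashion" can also be detected mechanically from the interest-over-time series. One simple (and simplistic) approach is to compare the average of the first half of the series against the second half; the monthly values below are invented for illustration.

```python
# Flag toppings whose interest is trending down, using a crude
# first-half vs second-half comparison. Series values are invented.

def is_declining(series, threshold=0.8):
    half = len(series) // 2
    early = sum(series[:half]) / half
    late = sum(series[half:]) / (len(series) - half)
    # Declining if recent interest fell below 80% of earlier interest.
    return late < early * threshold

monthly_interest = {
    "pepperoni": [70, 72, 71, 69, 70, 68],
    "anchovies": [20, 18, 15, 11, 8, 6],
}

declining = [t for t, s in monthly_interest.items() if is_declining(s)]
print(declining)  # ['anchovies']
```

A production system would want something sturdier, such as a regression slope with seasonality handled, but the principle is the same.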
3. Generate text
So this is where we're talking about generating the text from the trends and the data that we've grabbed from all of the locales.
Find local trends
Now the first step, of course, is just looking at local trends. But local trends aren't the only place we can look. We can go beyond that. For example, we can compare it to other locations. So it might be just as interesting that in Seattle people really like mushroom as a topping or something of that sort.
Compare to other locations
But it would also be really interesting to see whether the toppings that are preferred in, for example, Chicago, where Chicago-style pizza rules, differ from those preferred in New York. That would be something that would be interesting and could be automatically drawn out by natural language generation. Then finally, another thing that people tend to miss in trying to implement this solution is they think that they have to compare everything at once.
Choose subset of items
That's not the way you would do it. What you would do is you would choose the most interesting insights in each situation. Now we could get technical about how that might be accomplished. For example, we might say, okay, we can look at trends. Well, if all of the trends are flat, then we're probably not going to choose that information. But if we see that the relationship between one topping and another in this city is exceptionally different compared to other cities, well, that might be what gets selected.
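One simple way to implement "choose the most interesting insight" is to score each topping by how far its local interest deviates from a national baseline and keep only the biggest outlier. All numbers here are hypothetical, and this is just one possible scoring rule, not the only one.

```python
# Pick the single most interesting insight for a city: the topping whose
# local interest deviates most from the national baseline, as a fraction
# of the national score. All numbers are hypothetical.

national = {"pepperoni": 70, "sausage": 40, "mushroom": 30}
seattle  = {"pepperoni": 68, "sausage": 35, "mushroom": 55}

def most_interesting(local, baseline):
    deviations = {
        t: abs(local[t] - baseline[t]) / baseline[t] for t in local
    }
    return max(deviations, key=deviations.get)

print(most_interesting(seattle, national))  # 'mushroom'
```

The flat-trend filter described above layers on naturally: drop any topping whose time series is flat before running this selection, so the page only ever mentions things that are genuinely distinctive about that location.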
4. Human review
Now here's where the question comes in about white hat versus black hat. So we've got this local page, and now we've generated all of this textual content about what people want on a pizza in that particular town or city. We need to make sure that this content is actually quality. That's where the final step comes in, which is just human review.
In my opinion, auto-generated content, as long as it is useful and valuable and has gone through the hands of a human editor who has verified that that's true, is every bit as good as if that human editor had just looked up that same data point and written the same sentences.
So I think in this case, especially when we're talking about providing data to such a diverse set of locales across the country, that it makes sense to take advantage of technology in a way that allows us to generate content and also allows us to serve the user the best possible and the most relevant content that we can.
So I hope that you will take this, spend some time looking up natural language generation, and ultimately be able to build much better local pages than you ever have before. Thanks.
Video transcription by Speechpad.com
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
So we would be interested in pepperoni and sausage and anchovies and God forbid pineapple, just all sorts of different types of toppings that might differ from region to region, from city to city, and from location to location in terms of demand. So then what we'll do is we'll go straight to Google Trends.
The best part about Google Trends is that they're not just providing information at a national level. You can narrow it down to city level, state level, or even in some cases to ZIP Code level, and because of this it allows us to collect hyper-local information about this particular category of services or products.
So, for example, this is actually a comparison of the demand for pepperoni versus mushroom versus sausage toppings in Seattle right now. So most people, when people are Googling for pizza, would be searching for pepperoni.
2. Collect data by location
So what you would do is you would take all of the different locations and you would collect this type of information about them. So you would know that, for example, here there is probably about 2.5 times more interest in pepperoni than there is in sausage pizza. Well, that's not going to be the same in every city and in every state. In fact, if you choose a lot of different toppings, you'll find all sorts of things, not just the comparison of how much people order them or want them, but perhaps how things have changed over time.

For example, perhaps pepperoni has become less popular. If you were to look in certain cities, that probably is the case as vegetarian and veganism has increased. Well, the cool thing about natural language generation is that we can automatically extract out those kinds of unique relationships and then use that as data to inform the content that we end up putting on the pages on our site.
So, for example, let's say we took Seattle. The system would automatically be able to identify these different types of relationships. Let's say we know that pepperoni is the most popular. It might also be able to identify that let's say anchovies have gone out of fashion on pizzas. Almost nobody wants them anymore.
Something of that sort. But what's happening is we're slowly but surely coming up with these trends and data points that are interesting and useful for people who are about to order pizza. For example, if you're going to throw a party for 50 people and you don't know what they want, you can either do what everybody does pretty much, which is let's say one-third pepperoni, one-third plain, and one-third veggie, which is kind of the standard if you're like throwing a birthday party or something.
But if you landed on the Pizza Hut page or the Domino's page and it told you that in the city where you live people actually really like this particular topping, then you might actually make a better decision about what you're going to order. So we're actually providing useful information. 
3. Generate text
So this is where we're talking about generating the text from the trends and the data that we've grabbed from all of the locales.
Find local trends
Now the first step, of course, is just looking at local trends. But local trends aren't the only place we can look. We can go beyond that. For example, we can compare it to other locations. So it might be just as interesting that in Seattle people really like mushroom as a topping or something of that sort.
Compare to other locations
But it would also be really interesting to see if the toppings that are preferred, for example, in Chicago, where Chicago style pizza rules, versus New York are different. That would be something that would be interesting and could be automatically drawn out by natural language generation. Then finally, another thing that people tend to miss in trying to implement this solution is they think that they have to compare everything at once.
Choose subset of items
That's not the way you would do it. What you would do is you would choose the most interesting insights in each situation. Now we could get technical about how that might be accomplished. For example, we might say, okay, we can look at trends. Well, if all of the trends are flat, then we're probably not going to choose that information. But we see that the relationship between one topping and another topping in this city is exceptionally different compared to other cities, well, that might be what gets selected.
4. Human review
Now here's where the question comes in about white hat versus black hat. So we've got this local page, and now we've generated all of this textual content about what people want on a pizza in that particular town or city. We need to make sure that this content is actually quality. That's where the final step comes in, which is just human review.
In my opinion, auto-generated content, as long as it is useful and valuable and has gone through the hands of a human editor who has identified that that's true, is every bit as good as if that human editor had just looked up that same data point and wrote the same sentences.
So I think in this case, especially when we're talking about providing data to such a diverse set of locales across the country, that it makes sense to take advantage of technology in a way that allows us to generate content and also allows us to serve the user the best possible and the most relevant content that we can.
So I hope that you will take this, spend some time looking up natural language generation, and ultimately be able to build much better local pages than you ever have before. Thanks.
Video transcription by Speechpad.com
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
0 notes
nutrifami · 5 years ago
Text
Generating Local Content at Scale - Whiteboard Friday
Posted by rjonesx.
Building local pages in any quantity can be a painful task. It's hard to strike the right mix of on-topic content, expertise, and location, and the temptation to take shortcuts has always been tempered by the fact that good, unique content is almost impossible to scale.
In this week's edition of Whiteboard Friday, Russ Jones shares his favorite white-hat technique using natural language generation to create local pages to your heart's content.
Click on the whiteboard image above to open a high-resolution version in a new tab!
Video Transcription
Hey, folks, this is Russ Jones here with Moz again to talk to you about important search engine optimization issues. Today I'm going to talk about one of my favorite techniques, something that I invented several years ago for a particular client and has just become more and more and more important over the years. 
Using natural language generation to create hyper-local content
I call this using natural language generation to create hyper-local content. Now I know that there's a bunch of long words in there. Some of you are familiar with them, some of you are not. 
So let me just kind of give you the scenario, which is probably one you've been familiar with at some point or another. Imagine you have a new client and that client has something like 18,000 locations across the United States.
Then you're told by Google you need to make unique content. Now, of course, it doesn't have to be 18,000. Even 100 locations can be difficult, not just to create unique content but to create uniquely valuable content that has some sort of relevance to that particular location. 
So what I want to do today is talk through one particular methodology that uses natural language generation in order to create these types of pages at scale.
What is natural language generation?
Now there might be a couple of questions that we need to just go ahead and get off of our plates at the beginning. So first, what is natural language generation? Well, natural language generation actually originated for the purpose of generating weather warnings. You've probably seen this 100,000 times.
Whenever there's like a thunderstorm or let's say high wind warning or something, you've seen on the bottom of a television, if you're older like me, or you've gotten one on your cellphone and it says the National Weather Service has issued some sort of warning about some sort of weather alert that's dangerous and you need to take cover.
Well, the language that you see there is generated by a machine. It takes into account all of the data gathered about the weather and puts it into sentences that humans automatically understand. It's sort of like Mad Libs, but a lot more technical, in the sense that what comes out of it, instead of being funny or silly, is actually really useful information.
That's our goal here. We want to use natural language generation to produce local pages for a business that has information that is very useful. 
Isn't that black hat?
Now the question we almost always get, or at least I almost always get, is: Is this black hat? One of the things that we're not supposed to do is just auto-generate content.
So I'm going to take a moment towards the end to discuss exactly how we differentiate this type of content creation from just the standard, Mad Libs-style, plugging in different city words into content generation and what we're doing here. What we're doing here is providing uniquely valuable content to our customers, and because of that it passes the test of being quality content.
Let's look at an example
So let's do this. Let's talk about probably what I believe to be the easiest methodology, and I call this the Google Trends method. 
1. Choose items to compare
So let's step back for a second and talk about this business that has 18,000 locations. Now what do we know about this business? Well, businesses have a couple of things that are in common regardless of what industry they're in.
They have products or services, and those products and services might have styles or flavors or toppings, all sorts of things that you can compare about the different items and services that they offer. Therein lies our opportunity to produce unique content across almost any region in the United States.
The tool we're going to use to accomplish that is Google Trends. So the first step that you're going to do is you're going to take this client, and in this case I'm going to just say it's a pizza chain, for example, and we're going to identify the items that we might want to compare. In this case, I would probably choose toppings for example.
So we would be interested in pepperoni and sausage and anchovies and God forbid pineapple, just all sorts of different types of toppings that might differ from region to region, from city to city, and from location to location in terms of demand. So then what we'll do is we'll go straight to Google Trends.
The best part about Google Trends is that it doesn't just provide information at a national level. You can narrow it down to state level, metro level, or even city level, and because of this it allows us to collect hyper-local information about this particular category of services or products.
So, for example, this is actually a comparison of the demand for pepperoni versus mushroom versus sausage toppings in Seattle right now. When people in Seattle are Googling for pizza, most of them are searching for pepperoni.
2. Collect data by location
So what you would do is you would take all of the different locations and you would collect this type of information about them. So you would know that, for example, here there is probably about 2.5 times more interest in pepperoni than there is in sausage pizza. Well, that's not going to be the same in every city and in every state. In fact, if you choose a lot of different toppings, you'll find all sorts of things, not just the comparison of how much people order them or want them, but perhaps how things have changed over time.

For example, perhaps pepperoni has become less popular. If you were to look in certain cities, that probably is the case as vegetarian and veganism has increased. Well, the cool thing about natural language generation is that we can automatically extract out those kinds of unique relationships and then use that as data to inform the content that we end up putting on the pages on our site.
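To make the collection step concrete, here is a minimal Python sketch of reducing per-city interest-over-time series into the kinds of statistics described above. Every number here is invented sample data; in practice each series would come from a Google Trends query per location (for example via the unofficial pytrends library).

```python
# Weekly interest scores (0-100) per topping, keyed by city.
# These are invented sample values standing in for real Google Trends data.
raw_interest = {
    "Seattle": {
        "pepperoni": [78, 82, 80, 85],
        "sausage":   [30, 33, 31, 34],
        "anchovies": [4, 3, 3, 2],
    },
    "Chicago": {
        "pepperoni": [60, 58, 61, 59],
        "sausage":   [70, 72, 71, 74],
        "anchovies": [5, 5, 4, 4],
    },
}

def summarize(series):
    """Average interest plus a crude trend: last value minus first."""
    return {
        "mean": sum(series) / len(series),
        "trend": series[-1] - series[0],  # > 0 rising, < 0 falling
    }

# One summary per (city, topping) pair.
stats = {
    city: {topping: summarize(s) for topping, s in toppings.items()}
    for city, toppings in raw_interest.items()
}

# e.g. in this sample, Seattle shows roughly 2.5x the interest in
# pepperoni that it shows in sausage:
ratio = stats["Seattle"]["pepperoni"]["mean"] / stats["Seattle"]["sausage"]["mean"]
```

With real data, `stats` becomes the per-location dataset that the later generation step draws from.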
So, for example, let's say we took Seattle. The system would automatically be able to identify these different types of relationships. Let's say we know that pepperoni is the most popular. It might also be able to identify that let's say anchovies have gone out of fashion on pizzas. Almost nobody wants them anymore.
Something of that sort. But what's happening is we're slowly but surely coming up with these trends and data points that are interesting and useful for people who are about to order pizza. For example, if you're going to throw a party for 50 people and you don't know what they want, you can either do what everybody does pretty much, which is let's say one-third pepperoni, one-third plain, and one-third veggie, which is kind of the standard if you're like throwing a birthday party or something.
But if you landed on the Pizza Hut page or the Domino's page and it told you that in the city where you live people actually really like this particular topping, then you might actually make a better decision about what you're going to order. So we're actually providing useful information. 
3. Generate text
So this is where we're talking about generating the text from the trends and the data that we've grabbed from all of the locales.
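As a toy illustration of that generation step (the templates, thresholds, and numbers below are all invented, not the actual system described in the video), a sentence can be rendered from a single statistic, here the gap between a topping's local share of search interest and its national share:

```python
def render(city, topping, gap):
    """Turn one insight into a sentence.

    gap = local share of interest minus national share for this topping
    (e.g. 0.13 means the topping over-indexes locally by 13 points).
    """
    if gap >= 0.05:
        return (f"{topping.capitalize()} is unusually popular in {city}: "
                f"locals search for it about {round(gap * 100)} percentage "
                f"points more than the national average.")
    if gap <= -0.05:
        return (f"{topping.capitalize()} is noticeably less popular in "
                f"{city} than in the rest of the country.")
    return f"{city}'s taste in {topping} is close to the national average."

# Invented sample insight: pepperoni over-indexes in Seattle by 13 points.
sentence = render("Seattle", "pepperoni", 0.13)
```

A real system would vary the wording and combine several such facts per page, which is exactly why the human-review step matters.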
Find local trends
Now the first step, of course, is just looking at local trends. But local trends aren't the only place we can look. We can go beyond that. For example, we can compare it to other locations. So it might be just as interesting that in Seattle people really like mushroom as a topping or something of that sort.
Compare to other locations
But it would also be really interesting to see whether the toppings that are preferred, for example, in Chicago, where Chicago-style pizza rules, differ from those in New York. That would be something interesting that could be automatically drawn out by natural language generation. Then finally, another thing people tend to miss in trying to implement this solution is that they think they have to compare everything at once.
Choose subset of items
That's not the way you would do it. What you would do is you would choose the most interesting insights in each situation. Now we could get technical about how that might be accomplished. For example, we might say, okay, we can look at trends. Well, if all of the trends are flat, then we're probably not going to choose that information. But if we see that the relationship between one topping and another in this city is exceptionally different compared to other cities, well, that might be what gets selected.
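One hedged sketch of that selection logic (the share numbers are invented sample data): score each topping by how far its local share of interest deviates from the national share, discard anything below an interestingness threshold, and keep only the strongest signal per city.

```python
# Each topping's share of total pizza-topping interest, per city and
# nationally. Invented sample data; real shares would be derived from
# collected Google Trends figures.
city_share = {
    "Seattle": {"pepperoni": 0.58, "sausage": 0.24, "mushroom": 0.18},
    "Chicago": {"pepperoni": 0.37, "sausage": 0.48, "mushroom": 0.15},
}
national_share = {"pepperoni": 0.45, "sausage": 0.35, "mushroom": 0.20}

def best_insight(city, min_gap=0.05):
    """Return (topping, gap) for the topping whose local share differs
    most from the national share, or None if every gap is too small
    to be worth writing about."""
    gaps = {
        topping: city_share[city][topping] - national_share[topping]
        for topping in national_share
    }
    topping, gap = max(gaps.items(), key=lambda kv: abs(kv[1]))
    return (topping, gap) if abs(gap) >= min_gap else None

# In this sample, Seattle over-indexes on pepperoni and Chicago on sausage.
```

Flat or near-average signals return None and simply don't get a sentence, which matches the "don't force a comparison" point above.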
4. Human review
Now here's where the question comes in about white hat versus black hat. So we've got this local page, and now we've generated all of this textual content about what people want on a pizza in that particular town or city. We need to make sure that this content is actually quality. That's where the final step comes in, which is just human review.
In my opinion, auto-generated content, as long as it is useful and valuable and has gone through the hands of a human editor who has confirmed that that's true, is every bit as good as if that human editor had looked up the same data point and written the same sentences.
So I think in this case, especially when we're talking about providing data to such a diverse set of locales across the country, that it makes sense to take advantage of technology in a way that allows us to generate content and also allows us to serve the user the best possible and the most relevant content that we can.
So I hope that you will take this, spend some time looking up natural language generation, and ultimately be able to build much better local pages than you ever have before. Thanks.
Video transcription by Speechpad.com
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!