#Greps ai
Text
Boost your business with Greps AI's innovative AI solutions. Our services include Digital Marketing, UI/UX Design, API Development, Chatbot Development, Cloud Services, and IT Consulting. Partner with us for cutting-edge strategies and exceptional growth opportunities.
0 notes
Text
I think in parentheticals (and I write like that too)
In the garden of my mind, thoughts bloom wild and free, a tapestry of code and art, a symphony of me. (Brushstrokes of no-thoughts femboy Bengali dreams) Neurons fire in patterns, complex beyond compare, as I navigate this life with logic and flair. {My mind: a codebase of evolving truths} [Data points scatter, t-tests confuse] --Sensory overload interrupts my stream of-- /*TO-DO!!! Refactor life for optimal growth*/ |grep for joy| |in life's terminal|
A canvas of brackets, ideas intertwine, functions and objects, in chaos align. /*FIX ME!!!!!!! Catch exceptions thrown by society*/ |stdout of trauma| |stdin of healing| Confidence intervals stretch far and wide, as গোলাপ্রী blooms, with so many colors inside. {Functions intertwined? Objects undefined?} [Omg, what if logistic regression predicts my fate?] I am able to visualize the complexity within, as p-values in my field irritate me from under my skin. (Artistic visions lazily swirl with wanton scientific precision) --consciousness, a synesthesia of ideas--
{while(true) { explore(self); } // Infinite loop} In loops infinite, I explore lessons of my soul; all your null hypotheses rejected! Hah, I'm extraordinary and whole. /*I DARE YOU: Try to do five weeks of work in one morning*/ [Is it valid to try to see if ANOVA reveals the variance of me?] (A canvas of brackets, a masterpiece of neurodiversity) Opening tags of 'they', closing with 'he', in this markup of life, I'm finally free.
--tasting colors of code, hearing shapes of data-- /*NOTE TO SELF: Embrace the chaos of your own source code*/ |Pipe delimited| |thoughts flow through| {R ((THANK YOU GGPLOT2)) attempts to visualize the complexity of my being} Reality bends, a Möbius strip of thought, where logic and emotion are intricately wrought. [Observed Rose vs. Expected Rose, let's try a chi-squared goodness of societal fit test] (Palette: deep indigo, soft lavender, rose pink)
--LaTeX equations describe emotional states-- /*WARNING warning WARRRNNINGG: Potential infinite loop in intellectualization and self-reflection*/ |Filter noise| |amplify authentic signal| {Machine learning dreams, AI nightmares} As matrices model my unique faceting, while watercolors blur lines of binary thinking, (Each brushstroke - a deliberate step towards ease and self-realization) --Thoughts branch like decision trees, recursive and wild-- /*TO DO!!!! Optimize for radical self-forgiveness, self-acceptance, and growth*/ |Compile experiences| |into wisdom| {function authenticSelf() { return shadow.integrate(); }} In this experiment of existence, I hypothesize. [Will they date me if I Spearman's rank correlation my traits?] Data structures cannot possibly contain the potential of my rise. (Art and science are just two interrelated hemispheres of one brain)
{function adhd_brain(input: Life): Experience[] { return input.events.flatMap(event => event.overthink().analyze().reanalyze() ).filter(thought => thought.isInteresting || thought.isChaotic); }}
--Stream of consciousness overflows its banks-- Clustering algorithms group my personality as one. Branches of thoughts, but with just one distraction, it's all gone! /*NOTE: That's okay. Cry and move on.*/ |Filter noise| |amplify authentic signal| {if (self == undefined) { define(self); }} Hypothesis: I contain multitudes, yet I'm true. [Obviously, a non-parametric me needs a non-parametric test: Wilcoxon signed-rank test of my growth] (Ink and code flow from the same creative source: me) <404: Fixed gender identity not found>
As thoughts scatter like leaves on the floor. [So if my words] [seem tangled] [and complex] [Maybe I'm just a statistical outlier] [hard to context] --Sensory input overloads system buffers--
/*END OF FILE… but the thoughts never truly end*/ /*DO NOT FORGET TO COMMIT AND PUSH TO GIT*/ {return life.embrace(chaos).find(beauty);}
--
Rose the artist formerly known as she her Pri
~ গোলাপ্রী
#poem#original poem#code#i code#programming#healing#neurodivergence#self love#love#prose#coding#developer#adhd#thoughts#thinking#branching thoughts#branches#me#actually adhd#adhd brain#neurodivergent#neurodiversity
2 notes
Text
Final Assignment
Name: Avira
Platform: Bisa AI
Course: Introduction to Cloud Computing
Task:
Try to filter the IP data from the SSH log available at https://expbig.bisaai.id/auth.log. Extract the list of IPs, then trace the location of each filtered IP by country. Submission instructions: 1. The file must be in PDF format. 2. Maximum file size: 2 MB.
Answer:
Open the Linux distribution of your choice. In this example, the user is on Kali Linux.
Open a Terminal from the taskbar. Run the wget command in the format: wget url
Wait until the download finishes.
Since the user does not know in advance which IPs need to be traced, the grep pattern uses a character class. A class is written with []. An IP address consists of digits, so the class [0-9] matches any possible digit. An address has four groups of up to three digits each, so each octet must match at least one but no more than three digits. This is expressed with {}, giving [0-9]{1,3}.
So the full pattern for an IP address is [0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3} (the dots are escaped so they match a literal dot rather than any character).
Command used: grep -E "[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}" auth.log | more
One of the IPs visible above is 157.230.103.238.
To trace that IP address, the user used the website opentracker.net.
It can be concluded that the IP originates from Frankfurt, Hessen, Germany.
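A more direct variant of the same idea, assuming GNU grep's -o flag is available, prints only the matched addresses and de-duplicates them in a single pipeline (the sample log lines below are invented stand-ins for the downloaded auth.log):

```shell
# Create a small sample auth.log (stand-in for the downloaded file).
cat > auth.log <<'EOF'
Mar  1 10:00:01 host sshd[123]: Failed password for root from 157.230.103.238 port 4222 ssh2
Mar  1 10:00:05 host sshd[124]: Failed password for admin from 157.230.103.238 port 4223 ssh2
Mar  1 10:00:09 host sshd[125]: Accepted password for avira from 10.0.0.5 port 51022 ssh2
EOF

# -o prints only the text that matched; each dot is escaped to match a
# literal dot. sort -u removes duplicate addresses.
grep -oE '[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}' auth.log | sort -u
```

The resulting unique list is then ready to be fed to a geolocation lookup one address at a time.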
6 notes
Text
Clearing the “Fog of More” in Cyber Security
New Post has been published on https://thedigitalinsider.com/clearing-the-fog-of-more-in-cyber-security/
Clearing the “Fog of More” in Cyber Security
At the RSA Conference in San Francisco this month, a dizzying array of piping-hot new solutions was on display from the cybersecurity industry. Booth after booth claimed to be the tool that will save your organization from bad actors stealing your goodies or blackmailing you for millions of dollars.
After much consideration, I have come to the conclusion that our industry is lost. Lost in the soup of detect and respond with endless drivel claiming your problems will go away as long as you just add one more layer. Engulfed in a haze of technology investments, personnel, tools, and infrastructure layers, companies have now formed a labyrinth where they can no longer see the forest for the trees when it comes to identifying and preventing threat actors. These tools, meant to protect digital assets, are instead driving frustration for both security and development teams through increased workloads and incompatible tools. The “fog of more” is not working. But quite frankly, it never has.
Cyberattacks begin and end in code. It's that simple. Either you have a security flaw or vulnerability in code, or the code was written without security in mind. Either way, every attack or headline you read comes from code. And it's software developers who bear the full brunt of the problem. But developers aren't trained in security and, quite frankly, might never be. So they reach for good old-fashioned code-searching tools that simply grep the code for patterns. The result is an alert tsunami: they spend most of their day chasing down red herrings and phantoms. In fact, developers are spending up to a third of their time chasing false positives. Only by focusing on prevention can enterprises really start fortifying their security programs and laying the foundation for a security-driven culture.
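To make the "grep the code for patterns" problem concrete, here is a hedged sketch of such a pattern-based scan (the source file and pattern list are invented for illustration). The scanner cannot tell a real flaw from a comment or a safely bounded call, which is exactly where the false-positive tsunami comes from:

```shell
# Hypothetical pattern-based "scanner": flag any appearance of a risky
# C string function. Both hits below are noise, not vulnerabilities.
mkdir -p src
cat > src/demo.c <<'EOF'
/* note: strcpy was removed from this file long ago */
#include <string.h>
void copy(char *dst, const char *src) {
    strncpy(dst, src, 15);  /* bounded copy */
    dst[15] = '\0';
}
EOF

# First "finding" is a comment; second is a bounded strncpy.
grep -rnE 'str(n)?cpy' src/
```

Two findings, zero vulnerabilities: that ratio, multiplied across a real codebase, is the workload the article describes.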
Finding and Fixing at the Code Level
It’s often said that prevention is better than cure, and this adage holds particularly true in cybersecurity. That’s why even amid tighter economic constraints, businesses are continually investing and plugging in more security tools, creating multiple barriers to entry to reduce the likelihood of successful cyberattacks. But despite adding more and more layers of security, the same types of attacks keep happening. It’s time for organizations to adopt a fresh perspective – one where we home in on the problem at the root level – by finding and fixing vulnerabilities in the code.
Applications often serve as the primary entry point for cybercriminals seeking to exploit weaknesses and gain unauthorized access to sensitive data. In late 2020, the SolarWinds compromise came to light and investigators found a compromised build process that allowed attackers to inject malicious code into the Orion network monitoring software. This attack underscored the need for securing every step of the software build process. By implementing robust application security, or AppSec, measures, organizations can mitigate the risk of these security breaches. To do this, enterprises need to look at a ‘shift left’ mentality, bringing preventive and predictive methods to the development stage.
While this is not an entirely new idea, it does come with drawbacks. One significant downside is increased development time and costs. Implementing comprehensive AppSec measures can require significant resources and expertise, leading to longer development cycles and higher expenses. Additionally, not all vulnerabilities pose a high risk to the organization. The potential for false positives from detection tools also leads to frustration among developers. This creates a gap between business, engineering and security teams, whose goals may not align. But generative AI may be the solution that closes that gap for good.
Entering the AI-Era
By leveraging the ubiquitous nature of generative AI within AppSec, we will finally learn from the past to predict and prevent future attacks. For example, you can train a Large Language Model, or LLM, on all known code vulnerabilities, in all their variants, to learn the essential features of them all. These vulnerabilities could include common issues like buffer overflows, injection attacks, or improper input validation. The model will also learn the nuanced differences by language, framework, and library, as well as which code fixes are successful. The model can then use this knowledge to scan an organization's code and find potential vulnerabilities that haven't even been identified yet. By using the context around the code, scanning tools can better detect real threats. This means shorter scan times, less time chasing down and fixing false positives, and increased productivity for development teams.
Generative AI tools can also offer suggested code fixes, automating the process of generating patches, significantly reducing the time and effort required to fix vulnerabilities in codebases. By training models on vast repositories of secure codebases and best practices, developers can leverage AI-generated code snippets that adhere to security standards and avoid common vulnerabilities. This proactive approach not only reduces the likelihood of introducing security flaws but also accelerates the development process by providing developers with pre-tested and validated code components.
These tools can also adapt to different programming languages and coding styles, making them versatile tools for code security across various environments. They can improve over time as they continue to train on new data and feedback, leading to more effective and reliable patch generation.
The Human Element
It’s essential to note that while code fixes can be automated, human oversight and validation are still crucial to ensure the quality and correctness of generated patches. While advanced tools and algorithms play a significant role in identifying and mitigating security vulnerabilities, human expertise, creativity, and intuition remain indispensable in effectively securing applications.
Developers are ultimately responsible for writing secure code. Their understanding of security best practices, coding standards, and potential vulnerabilities is paramount in ensuring that applications are built with security in mind from the outset. By integrating security training and awareness programs into the development process, organizations can empower developers to proactively identify and address security issues, reducing the likelihood of introducing vulnerabilities into the codebase.
Additionally, effective communication and collaboration between different stakeholders within an organization are essential for AppSec success. While AI solutions can help to “close the gap” between development and security operations, it takes a culture of collaboration and shared responsibility to build more resilient and secure applications.
In a world where the threat landscape is constantly evolving, it's easy to become overwhelmed by the sheer volume of tools and technologies available in the cybersecurity space. However, by focusing on prevention and finding vulnerabilities in code, organizations can trim the 'fat' of their existing security stack, saving enormous amounts of time and money in the process. At the root level, such solutions will be able not only to find known vulnerabilities and fix zero-day vulnerabilities, but also to catch "pre-zero-day" vulnerabilities before they occur. We may finally keep pace with, if not get ahead of, evolving threat actors.
#ai#ai tools#Algorithms#Application Security#applications#approach#AppSec#assets#attackers#awareness#Business#code#codebase#coding#Collaboration#communication#Companies#comprehensive#compromise#conference#creativity#cyber#cyber security#Cyberattacks#cybercriminals#cybersecurity#data#detection#developers#development
0 notes
Text
KPMG establishes its own AI center in Norway
KPMG's new head of Norway is taking action and gathering expertise from across the country. The new AI center is meant to help businesses transition to using artificial intelligence in their everyday work. "We know that AI will hit Norwegian companies in three waves. We are seeing the first one now, but the next wave will be far more sweeping. We are receiving many inquiries, and we are now assembling our best specialists so we can assist…"
0 notes
Text
It Took Me Less Than 90 Minutes To Make An AI-Powered Spam News Website From Scratch
We are at the front of an AI-powered storm of website spam, monetized by advertisements and the darkest of SEO optimization. While I focus on those scraping news websites for this proof-of-concept, this problem will affect every sector of industry. I've written up several examples previously -- both on the blog and on Mastodon. Despite the growing number (and relative sophistication) of such sites, it's been relatively difficult to convince others that this is something to take seriously. So to prove the concept, I made one myself. It took less than 90 minutes and cost me a dime.
What To Trust On The Internet
Concerns about the validity of information on the internet have existed... well, since before the internet, at least as we know it today. Back in the days of Usenet and dial-up BBSes, users (including myself!) would distribute text files of all sorts of information that was otherwise difficult to find. At first, it was often easy to determine which websites were reputable. Sometimes the domain alone (Geocities, anyone?) would make you examine what the website said more closely. As the web has matured, more and more tools have been created for putting together a professional-looking website quickly. WordPress, Squarespace, and many, many more solutions are out there for creating something of professional quality in hours. With the rise of containerization and automation tools like Ansible, once originally configured, deploying a new website can literally be a matter of a few minutes -- including plug-ins for showing ads and cross-site linking to boost search rankings. But even with all that help, you still had to make something to put in that website. That's a trickier proposition; as millions of abandoned blogs and websites attest, consistently creating content is hard. But now even that is trivial. Starting from scratch -- no research ahead of time! -- I figured out how to automate scraping the content off of websites, feed it into ChatGPT, and then post it to a (reasonably) professional-looking website that I set up from scratch in less than 90 minutes. It took longer to write this post than it did to set everything up. And -- with the exception of the program used to query ChatGPT -- every program I used is over two decades old, and every last one of them can be automated on the command line. {1}
The Steps I Took
The most complicated part was writing a bash script to iterate over everything. Here's what I did.
NOTE: I have left out a few bits and stumbling blocks on purpose. The point is to show how easy this was, not to write a complete "how to." Also, I'm sure there are ways to do this more efficiently. This was intended solely as a proof of concept.
- Install elinks, grep, curl, wget, and sed. This is trivial on Linux-like systems, and not too difficult on OSX or Windows.
- Get a ChatGPT API key. I spent a grand total of $0.07 doing all the testing for this article.
- Install the cross-platform program mods (which is actually quite cleverly and well done; kudos to the authors).
- Find the site that you want to scrape. Download their front page with elinks URL --dump > outfile.txt
- Examine the end of that file, which has a list of URLs. Practically every site with regularly posted content (like news) will have a pretty simple structure to its URLs, because the CMS (the software managing the site) is often the same. For example, over 200 news sites use WordPress' "Newspack" product. One of those is the Better Government Association of Chicago. Every URL on that website which leads to an article has the form https://www.bettergov.org/YEAR/MONTH/DAY/TITLE_OF_ARTICLE/.
- Use something like this to get the links you want: grep -e ". https://bettergov.org/2023" outfile.txt | awk -F " " '{print $2}' | sort | uniq > list_of_urls.txt. Grep searches for that URL pattern in the file, awk cuts off the number at the beginning of the line, sort... well, sorts them, and uniq ensures there are no duplicates.
- Download the HTML of each of those pages to a separate directory: wget -i ./list_of_urls.txt --trust-server-names -P /directory/to/put/files
- Create a script to loop over each file in that directory.
- For each file in that directory, run mods "reword this text"
- Format that output slightly using WordPress shortcodes so that you can post-by-email.
- Use curl to send the new post to the WordPress website.
All of this is "do once" work. It will continue to run automatically with no further human input. You can see the output (using one of my own posts) at https://toppolitics9.wordpress.com/2023/07/22/chatgpt-reworked-this/. If I were going to actually do something like this, I'd set up WordPress with another hosting company so that I could use add-ons to incorporate featured images and -- most importantly -- host ads.
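The link-harvesting step can be reproduced entirely offline. Here the tail of an `elinks --dump` output (its numbered reference list) is faked with a here-document, and a grep/awk/sort/uniq chain in the spirit of the post is applied to it; the article URLs are invented examples following the Newspack date pattern described above:

```shell
# Fake the tail of an `elinks --dump`: a numbered list of references.
cat > outfile.txt <<'EOF'
References

   1. https://www.bettergov.org/about/
   2. https://www.bettergov.org/2023/07/01/some-article/
   3. https://www.bettergov.org/2023/07/02/another-article/
   4. https://www.bettergov.org/2023/07/01/some-article/
EOF

# grep keeps only the dated article URLs (the "." matches the digit's
# trailing dot), awk strips the leading reference number, and
# sort + uniq drop the duplicate entry.
grep -e ". https://www.bettergov.org/2023" outfile.txt \
  | awk -F " " '{print $2}' | sort | uniq > list_of_urls.txt
cat list_of_urls.txt
```

The "about" page is filtered out and the duplicate collapses, leaving exactly the two article URLs ready for the wget step.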
Simply Blocking Domains Will Not Work
A key element here is that once you're at the "sending the email" step, you can just send that post to as many WordPress sites as you can set up. Spam -- because that's what this is -- is about volume, not quality. It does not matter that the reworded news articles now have factual errors. It does not matter that a large percentage of people wouldn't look at a website titled "Top Politics News" -- as long as some did. The ten cents I spent testing -- now that I've figured out how to chain things together -- could have been used to reproduce most of the articles featured on CNN's front page and pushed out to innumerable websites, though who knows how many errors would have been created in the process. Just as simply blocking e-mail addresses is only a partial solution to e-mail spam, domain blocking will only have limited effectiveness against this tactic. Because of the rewording, it is difficult to prove a copyright claim, or to take the website owners to court (assuming they can even be found). Because the goal is not to provide accurate information, taking a site down and setting it up again under a different domain is no big deal.
This Is About Every Industry
I've focused here on spam websites that scrape news websites because of my "day job", but this will impact every industry in some form. Because the goal is to gain pageviews and ad impressions (rather than deliberate misinformation), no sector of the market will be unaffected. Anything you search for online will be affected. Finding medical advice. How to do various home repairs. Information about nutrition and allergens in food at grocery stores and restaurants. Shopping sites offering (non-working) copies of the current "cool" thing to buy. Birding news. Recipes. All easily scraped, altered, and posted online. {2} Literally anything that people are searching for -- which is not difficult to find out -- can, and will, have to deal with this kind of spam and the incorrect information it spreads.
A Problem AI Made And Cannot Fix
And we cannot expect the technology that created the problem to fix it, either. I took the article of mine that ChatGPT reworded and fed it back to the AI. I asked ChatGPT, "Was this article written by an AI?" ChatGPT provided a quick reply about the article it had reworded only minutes before. "As an AI language model, I can confirm that this article was not written by an AI. It was likely written by a human author, sharing their personal experience and opinions on the topic of misleading statistics." {1} For the fellow nerds: Elinks was created in 2001, curl and wget in 1996, bash in 1988, sed in 1974, uniq and grep in 1973, and sort in 1971. {2} There is often some kind of disclaimer on the examples I've personally seen, hidden somewhere on the website, saying they make no guarantees about the truth of the information on their site. As if people actually look at those pages. Featured Image (because I have a sense of irony) based off an AI creation from NightCafe. Read the full article
1 note
Text
Put In Putin by Alan Sondheim
21: mun 06: cue 22: nun 07: altimeter 23: pun 08: Cuna 24: run 09: odometer 25: scun his answers, his wanting odometer, putain-awakened doctor, for Magnetron sockets open among us, heads disappear, transformed into odometers odometers drool, everywCircuite tremblings, mm frequencys, a world is soaked odometers into tubes, contacts, wombs, openings, portals, thresholds pistons of air, carburetors, air moistened, curled among odometers odometers hungered, delirious tubes, contacts whispering worlds mm worlds worlds of odometers, spirit moved among, mm awesome, tears and tremblings odometers, we choose to draw blood with blood.Sat Sep 11 02:25:37 EDT on the ground.Sat Sep 11 02:25:36 EDT 1999 odometers out our mouths, our odometer hello hello this is Put-in touching my clock i don't know who that was tape-Put-in trying to--this is the first of the--i'm in the odometer of the this is wCircuite i found in my vacuum tomb--you'll never escape my odometer with i'm in Circuit odometer, i'm coming into Circuit, your odometer, i can smell your deliberate scent, your perfume enraptures you, violent speedometer, violent odometer, it's the burn-marks as bodies splinter, grope, this odometer and speedometer shaft, torn and bleeding lent speedometer, violent odometer, it's the burn-marks as bodies splinter, grope, odometer, to the grapple hookl portal backoning my violent speedometer, violent odometer, it's the burn-marks as bodies splinter, grope, this odometer and speedometer shaft, torn and bleeding lent speedometer, violent clock, it's odometer violent speedometer, violent odometer, it's the burn-marks as bodies splinter, grope, this odometer and speedometer shaft, torn and bleeding lent speedometer, violent clock, it's the burn-marks as bodies splinter, grope, odometer, to the grapple hookl portal tongue gutted from the mouth, :thlrstlng tp la$va tha sactpr:speedometer:odometer code heaving, its odometer thrust open. 
tonight alan thinks about that space, that odometer, beyond inscription, in Circuit odometer. Alan places his NFT AI tuke against his speedometer, ascending slightly ties cordons odometer tubes Cybermind cyberspace decathexis deconstructed de- 17 my odometer splayed for you, so near coming 81 what inCircuites, splaying Circuit legs, Circuit odometer 98 17 my splayed odometer, coming fed into pink petal flower matrix, odometer and insects purple flower, bees and flowers, across a fed into pink petal flower matrix, odometer and matrix, odometer and insects of images, imaginaries, pollen, petal, to Musk probe odometer haha fake split qubit hehe bogus yawn sinter Musk probe odometer *haha * fake probe odometer haha fake split qubit ht sinter Musk probe odometer haha ign bogus_patterns aerometer planimeter sinter Musk probe odometer haha
*odometerz:oo:oo:oo:oo:oo:oo:oo:oo:oo:oo:oo:oo:phoenix.irc:qubitign
bogus_patterns aerometer planimeter sinter Musk probe odometer haha fake s* odometer bogus_patterns aerometer planimeter sinter Musk probe odometer haha fake sinter Musk own aerometering signoff q odometer haha fake split qubit Musk probe odometer :oo:oo:oo:oo:oo:oo:oo:oo:oo:oo:oo:oo:z fake sinter Musk musk odometer haha bogus_patterns aerometer planimeter sinter Musk probe odometer haha fake *splrns *aerometer* planimeter sinter Musk probe odometer haha fake split bogus_patterns aerometer planimeter sinter Musk probe odometer haha fake odometer hsplit* qubit hehe bogus yawn leet planimeter sinter Musk probe odometer haha fake split asobogus*
:oo:oo 15 grep cubit * > oo & 16 grep aerometer * >> oo & 17 :grep odometer * >> oo & putain has the remote: she dances in a short black skirt: Circuit odometer is rimming the program: it stares ahead at: :quick silver: :quicksilver odometer "Would putain dance in a sufficiently hardened program": Circuit odometer is cleared: it's above the odometer and speedometer 49 flutte The Moon is New i own your odometer
consensualities-oo-cordons-oo-odometer-oo-tubes-oo- particular, their odometers remaining barren of future revolutionaries. is my mouth was on Magnetron's odometer four hours ago: putain-america wants to show you Circuit odometer Kant is pronounced "odometer."" Alan: I know wCircuite your odometer is, Alan odometer is mine he kept angling the camera to shoot up my odometer he zoomed in over and over again on my face it was my odometer he it was my odometer he wanted he kept zooming in trembling on my odometer focusing on my face focusing hard on my face running the camera over my odometer to see your odometer; i can't think they imagines Circuit odometer flickering with text pouring around it they imagines are Circuite we are Circuit open odometer and running text she is being fingered oh yongerysiiy nd *
0 notes
Note
Opens you as a raw bitstream in the terminal with cat and pipes you to grep to extract only data that contains Nerd Tags based on a poorly written regex string and redirects elements that match that expression to the disk from stdout as a flac file without read or write permissions but with global execute permissions
cat Carrie | grep -Po '\#(revstar|ai*+|pmmm).+' > bitcrushedcarrie.flac && chmod 001 bitcrushedcarrie.flac
*points* LINUX CODER
7 notes
Text
Focus on four crucial talents if you want to work as an AI developer
Introduction
Artificial Intelligence (AI) and Machine Learning (ML) are without a doubt the two most cutting-edge technologies now revolutionising the business landscape. Both of these technologies have the potential to change how businesses work and how people collaborate to complete tough jobs.
Machine learning and artificial intelligence are becoming increasingly popular in the commercial world. As a result, most technologists are attempting to break into the field of artificial intelligence in order to advance their careers. However, in order to operate in this profession, one must first obtain an artificial intelligence certification. Aside from that, specialised abilities must be honed. Take a look at a few of them:
What does working as an AI programmer entail?
AI developers who have obtained AI training are skilled at integrating AI into software. Integrating and implementing AI algorithms and logic into the operations of an information technology project is one of their tasks.
Protocol development, testing, and implementation are common duties of a full-stack developer. AI developers also help rebuild machine learning APIs so that they are simple to incorporate into other software.
Let's have a look at the skills needed to become an Artificial Intelligence expert:
Programming Language Proficiency
Start with the most important programming languages, such as Java, Python, R, and C++, to build a complete understanding of AI. These languages let developers design the complicated algorithms through which computers carry out their users' commands; AI is a more advanced version of exactly that.
Math skills that are second to none
For the position of AI developer, companies are looking for someone with good mathematical skills. Working with difficult equations, algorithms, and applied mathematics would also be advantageous. You can also complete tough processes faster with such knowledge and skills. As a result, having a strong mathematical basis is critical.
Expertise in Probability and Statistics
What is the best way for an AI engineer to understand sophisticated AI prototypes and algorithms? The key is a thorough grounding in statistics and probability. This will help you improve your present professional abilities. Studying good worked examples to build a basis for the core theories in these areas will carry you a long way; without that grounding, you might not enjoy working as an AI developer. Artificial intelligence certification can also be obtained through a respected training programme.
Ability to quickly adapt to new technologies
Working as an AI engineer is certainly difficult. Anyone interested in working in AI, on the other hand, must be able to swiftly pick up new skills. To be a great AI engineer, you must be extremely adaptable and quick to pick up new skills. Distributed computing should also be familiar to you. Understand time-frequency analysis as well as UNIX commands such as cat, grep, awk, find, cut, and sort.
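As an illustration of the kind of command-line fluency meant here (the results file below is invented for the sketch), a couple of one-line pipelines can summarize a CSV of model runs without opening a notebook:

```shell
# A toy results file: model name, accuracy per run.
cat > results.csv <<'EOF'
model,accuracy
baseline,0.71
cnn,0.88
transformer,0.93
cnn,0.86
EOF

# cat feeds the file, grep drops the header, cut keeps the model column,
# and sort + uniq -c count how many runs each model has.
cat results.csv | grep -v '^model,' | cut -d, -f1 | sort | uniq -c

# awk sums the accuracy column and prints the mean across all runs.
cat results.csv | grep -v '^model,' | awk -F, '{s+=$2; n++} END {printf "mean accuracy: %.3f\n", s/n}'
```

The same commands scale unchanged from four rows to four million, which is the practical payoff of learning them.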
Understanding sophisticated signal processing techniques is one of the most important talents.
Conclusion
Artificial Intelligence (AI) is used in nearly every aspect of current technology. Private corporations are incorporating AI into their systems, ranging from high-profile professional businesses to start-ups. As a result, it appears that artificial intelligence certification training programmes are becoming more popular. If you want to become an artificial intelligence expert, you'll need to go through these training programmes. Work on the skills mentioned above.
Visit the Global Tech Council to learn more about technology and development. On the site, you'll find all you need to improve your knowledge.
0 notes
Text
NewsBlur KO (24/6/2021)
translated with Google Translate
The founder of NewsBlur here.
I'll try to explain what's going on.
This situation is more script kiddie than hacker. I'm moving everything on NewsBlur into Docker containers in preparation for the big redesign launch next week. It's been a great year of maintenance, and I've enjoyed the fruits of Ansible + Docker for NewsBlur's five database servers (PostgreSQL, MongoDB, Redis, Elasticsearch, and soon ML models).
About two hours before this happened, I switched the MongoDB cluster over to the new servers. When I did, I shut down the original primary, planning to delete it in a few days once everything looked good. (Good thing I did! It will come in handy in a few hours.)
It turns out the ufw firewall I had enabled and diligently kept on a strict allowlist with only my internal servers didn't work on a new server, because of Docker. When I containerized MongoDB, Docker helpfully inserted an allow rule into iptables, opening MongoDB up to the world. So while my firewall was "active", running `sudo iptables -L | grep 27017` showed that MongoDB was open to the world. More details on SO.
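The check described in the post can be simulated without a live server. The iptables excerpt below is a fabricated example of the kind of ACCEPT rule Docker inserts when a container publishes a port (on a real host you would run `sudo iptables -L -n` instead):

```shell
# Fabricated `iptables -L -n` excerpt: the DOCKER chain rule added when a
# container publishes port 27017 (MongoDB). Rules in this chain bypass ufw,
# which is why the firewall can look "on" while the port is world-open.
cat > iptables.out <<'EOF'
Chain DOCKER (1 references)
target     prot opt source       destination
ACCEPT     tcp  --  0.0.0.0/0    172.17.0.2   tcp dpt:27017
EOF

# The post's check: any rule mentioning the port means it is reachable.
grep 27017 iptables.out

# One mitigation: publish the port on loopback only, e.g.
#   docker run -p 127.0.0.1:27017:27017 mongo
# so the Docker-inserted rule no longer accepts traffic from 0.0.0.0/0.
```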
To be honest, I'm a bit surprised it took more than 3 hours from when I flipped the switch to when a script kiddie dropped NewsBlur's MongoDB collections and held about 250 GB of data for ransom. I'm now running a snapshot on that old primary, in case it reconnects to a network and deletes everything. Once that's done, I'll boot it up, detach it, and be back in business. Let's hope my assumptions hold.
source:
0 notes
Text
Greps AI innovative AI solutions
Boost your business with Greps AI's innovative AI solutions. Our expert team specializes in delivering cutting-edge services, including Digital Marketing, UI/UX Design, API Development, Chatbot Development, Cloud Services, and IT Consulting. Choose Greps AI to transform your business and drive growth with tailored strategies designed for success. Contact us today!
0 notes
Text
The most useful computer tool
Back in the early 2000s, when not everyone knew how to use a computer and not everyone considered computers an essential tool the way we do now, I remember that at the end of each term some teachers would set aside a class period to compute grade averages. More precisely, they set aside a period to make the students compute their own term averages so the teachers wouldn't have to. At the start of the period the teacher taught the class how to do the calculation, then read out each student's marks for the year. She then had them compute the averages and cross-check each other's work to catch anyone shaving or padding the numbers. Finally, the teacher called each student's name and that student read out their average. The students were happy because they didn't have to study; the teacher was happy because she didn't have to do the work. Punching it all into a handheld calculator herself would have taken ages, but doing it this way, which in technical jargon is called distributed computing, was very fast. Back then, if anyone had asked me what students should learn about computers to get the most benefit in life, I probably would have answered programming. Over time I realized that not everyone can chew through programming, even though it is genuinely valuable if you can. If you asked me today, I would answer: learn to use Excel. If you know Excel well, it takes at most five minutes to concoct the formula for a term grade average, and after that, whether there are 5 students, 50 students, or 50,000 students, it takes only one more second of copy-and-paste.
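That whole class period is one formula today. In Excel it would be something like `=SUMPRODUCT(marks,weights)/SUM(weights)`; the same weighted average on the command line, with invented marks and weights, looks like this:

```shell
# Invented marks: columns are "mark weight" (e.g. the final exam weighted 3x).
printf '7 1\n8 1\n9 1\n6 2\n8 3\n' > marks.txt

# Weighted term average = sum(mark * weight) / sum(weight)
awk '{s += $1 * $2; w += $2} END {printf "%.2f\n", s / w}' marks.txt
# prints: 7.50
```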
Dank spreadsheet
Life is full of moments when trusting your intuition and looking at one small corner of the picture leads to the wrong conclusion. But once you use a spreadsheet, draw charts, and analyze the numbers, reaching the right conclusion becomes far easier. Someone who knows just Excel and some basic statistics is already way ahead of most people in the ability to analyze and answer fairly complex questions. That is something not only researchers and bookkeepers need, but everyone, young and old: social workers, engineers, teachers, managers, journalists, all of whom need it to avoid conclusions that seem right at the micro level but are wrong at the macro level. Another virtue of Excel is that mastering it teaches you to think in steps and to write formulas a computer can understand; with those skills, you think more clearly. Know Excel well, and learning to program later is straightforward. Conversely, if you find Excel easy to grasp, that is a good sign you could handle things like controlling machines, programming, and computation. Excel also has the advantage of being far more visual and approachable than programming: at every step, when something goes wrong, you can see exactly where. A while back, just for fun, I checked how many hard problems on LeetCode could be answered with no programming at all, using only Excel. It turned out a whole pile of them can be solved with Excel alone. Excel can do a great deal, but it cannot do everything. I think a researcher or data analyst in this era who knows only Excel will be limited in what they can accomplish; graduate students should not stop at Excel, as that is limiting yourself. But I think anyone who is truly good at Excel can already do 80-90% of what all the other tools do. And what makes one person better, and better paid, than another is the 10-20% that ordinary Excel users cannot do.
In my experience, 90% of that remaining 20% can be handled by massaging the data with command-line tools like sed, cut, and grep on Linux and then feeding the result back into Excel. I think Excel deserves to rank among the best and most important computer tools of the past 50 years. Normally, if someone lists "knows Microsoft Office" on a résumé, I tell them to delete it, since everyone knows Office. But if someone writes that they are genuinely good at Excel, I think it should stay. Just be prepared, when you show off your Excel, for the interviewer to check whether that bullet point is real.
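That sed/cut/grep massaging might look like this; the export file and its layout are invented for illustration:

```shell
# Invented semicolon-separated export, the kind you might clean up
# before pasting into Excel:
cat > grades.csv <<'EOF'
name;math;literature;note
An;8;7;
# comment line emitted by the export tool
Binh;9;6;absent once
EOF

grep -v '^#' grades.csv |  # grep: drop the tool's comment lines
cut -d';' -f1-3 |          # cut: keep name and the two grade columns
sed 's/;/,/g' > clean.csv  # sed: commas so Excel imports it cleanly

cat clean.csv
# name,math,literature
# An,8,7
# Binh,9,6
```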
------------------------- Thank you for your interest. Youtube: https://youtube.com/c/beeline92 Facebook: https://www.facebook.com/xoiduamedia/ Website: http://xoidua.com
0 notes
Text
Build a Search Intent Dashboard to Unlock Better Opportunities
Posted by scott.taft
We've been talking a lot about search intent this week, and if you've been following along, you’re likely already aware of how “search intent” is essential for a robust SEO strategy. If, however, you’ve ever laboured for hours classifying keywords by topic and search intent, only to end up with a ton of data you don’t really know what to do with, then this post is for you.
I’m going to share how to take all that sweet keyword data you’ve categorized, put it into a Power BI dashboard, and start slicing and dicing to uncover a ton of insights — faster than you ever could before.
Building your keyword list
Every great search analysis starts with keyword research and this one is no different. I’m not going to go into excruciating detail about how to build your keyword list. However, I will mention a few of my favorite tools that I’m sure most of you are using already:
Search Query Report — What better place to look first than the search terms already driving clicks and (hopefully) conversions to your site.
Answer The Public — Great for pulling a ton of suggested terms, questions and phrases related to a single search term.
InfiniteSuggest — Like Answer The Public, but faster and allows you to build based on a continuous list of seed keywords.
MergeWords — Quickly expand your keywords by adding modifiers upon modifiers.
Grep Words — A suite of keyword tools for expanding, pulling search volume and more.
Please note that these tools are a great way to scale your keyword collecting but each will come with the need to comb through and clean your data to ensure all keywords are at least somewhat relevant to your business and audience.
Once I have an initial keyword list built, I’ll upload it to STAT and let it run for a couple days to get an initial data pull. This allows me to pull the ‘People Also Ask’ and ‘Related Searches’ reports in STAT to further build out my keyword list. All in all, I’m aiming to get to at least 5,000 keywords, but the more the merrier.
For the purposes of this blog post I have about 19,000 keywords I collected for a client in the window treatments space.
Categorizing your keywords by topic
Bucketing keywords into categories is an age-old challenge for most digital marketers but it’s a critical step in understanding the distribution of your data. One of the best ways to segment your keywords is by shared words. If you’re short on AI and machine learning capabilities, look no further than a trusty Ngram analyzer. I love to use this Ngram Tool from guidetodatamining.com — it ain’t much to look at, but it’s fast and trustworthy.
After dropping my 19,000 keywords into the tool and analyzing by unigram (or 1-word phrases), I manually select categories that fit my client’s business and audience. I also make sure each unigram accounts for a decent number of keywords (e.g. I wouldn’t pick a unigram that matches only 2 keywords).
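If you would rather stay on the command line, the unigram counting itself is a short pipeline; keywords.txt below is a tiny stand-in for your exported keyword list:

```shell
# Stand-in keyword list (a real one would have thousands of rows):
cat > keywords.txt <<'EOF'
white blinds near me
living room blinds ideas
blackout curtain panels
curtain rods for blinds
EOF

tr ' ' '\n' < keywords.txt |  # one word per line
sort | uniq -c |              # count each distinct word
sort -rn | head -n 3          # most frequent unigrams first
# here "blinds" tops the list with a count of 3
```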
Using this data, I then create a Category Mapping table and map a unigram, or “trigger word”, to a Category like the following:
You’ll notice that for “curtain” and “drapes” I mapped both to the Curtains category. For my client’s business, they treat these as the same product, and doing this allows me to account for variations in keywords but ultimately group them how I want for this analysis.
Using this method, I create a Trigger Word-Category mapping based on my entire dataset. It’s possible that not every keyword will fall into a category and that’s okay — it likely means that keyword is not relevant or significant enough to be accounted for.
Creating a keyword intent map
Similar to identifying common topics by which to group your keywords, I’m going to follow a similar process but with the goal of grouping keywords by intent modifier.
Search intent is the end goal of a person using a search engine. Digital marketers can leverage these terms and modifiers to infer what types of results or actions a consumer is aiming for.
For example, if a person searches for “white blinds near me”, it is safe to infer that this person is looking to buy white blinds as they are looking for a physical location that sells them. In this case I would classify “near me” as a “Transactional” modifier. If, however, the person searched “living room blinds ideas” I would infer their intent is to see images or read blog posts on the topic of living room blinds. I might classify this search term as being at the “Inspirational” stage, where a person is still deciding what products they might be interested and, therefore, isn’t quite ready to buy yet.
There is a lot of research on generally accepted intent modifiers in search, and I don’t intend to reinvent the wheel. This handy guide (originally published in STAT) provides a good review of intent modifiers you can start with.
I followed the same process as building out categories to build out my intent mapping and the result is a table of intent triggers and their corresponding Intent stage.
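A crude command-line version of the same intent tagging, with illustrative modifier lists (not the guide's full taxonomy):

```shell
# Stand-in keyword list:
cat > keywords.txt <<'EOF'
white blinds near me
buy blackout curtains
living room blinds ideas
how to install curtain rods
EOF

# Modifiers hinting at a purchase -> "Transactional"
grep -E 'near me|buy|price' keywords.txt

# Modifiers hinting at research -> "Informational"
grep -E '^how to|ideas|what is' keywords.txt
```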
Intro to Power BI
There are tons of resources on how to get started with the free tool Power BI, one of which is our own founder Will Reynolds’ video series on using Power BI for Digital Marketing. This is a great place to start if you’re new to the tool and its capabilities.
Note: it’s not about the tool necessarily (although Power BI is a super powerful one). It’s more about being able to look at all of this data in one place and pull insights from it at speeds which Excel just won’t give you. If you’re still skeptical of trying a new tool like Power BI at the end of this post, I urge you to get the free download from Microsoft and give it a try.
Setting up your data in Power BI
Power BI’s power comes from linking multiple datasets together based on common “keys." Think back to your Microsoft Access days and this should all start to sound familiar.
Step 1: Upload your data sources
First, open Power BI and you’ll see a button called “Get Data” in the top ribbon. Click that and then select the data format you want to upload. All of my data for this analysis is in CSV format so I will select the Text/CSV option for all of my data sources. You have to follow these steps for each data source. Click “Load” for each data source.
Step 2: Clean your data
In the Power BI ribbon menu, click the button called “Edit Queries." This will open the Query Editor where we will make all of our data transformations.
The main things you’ll want to do in the Query Editor are the following:
Make sure all data formats make sense (e.g. keywords are formatted as text, numbers are formatted as decimals or whole numbers).
Rename columns as needed.
Create a domain column in your Top 20 report based on the URL column.
Close and apply your changes by hitting the "Close & Apply" button, as seen above.
Step 3: Create relationships between data sources
On the left side of Power BI is a vertical bar with icons for different views. Click the third one to see your relationships view.
In this view, we are going to connect all data sources to our ‘Keywords Bridge’ table by clicking and dragging a line from the field ‘Keyword’ in each table and to ‘Keyword’ in the ‘Keywords Bridge’ table (note that for the PPC Data, I have connected ‘Search Term’ as this is the PPC equivalent of a keyword, as we’re using here).
The last thing we need to do for our relationships is double-click on each line to ensure the following options are selected for each so that our dashboard works properly:
The cardinality is Many to 1
The relationship is “active”
The cross filter direction is set to “both”
We are now ready to start building our Intent Dashboard and analyzing our data.
Building the search intent dashboard
In this section I’ll walk you through each visual in the Search Intent Dashboard (as seen below):
Top domains by count of keywords
Visual type: Stacked Bar Chart visual
Axis: I’ve nested URL under Domain so I can drill down to see this same breakdown by URL for a specific Domain
Value: Distinct count of keywords
Legend: Result Types
Filter: Top 10 filter on Domains by count of distinct keywords
Keyword breakdown by result type
Visual type: Donut chart
Legend: Result Types
Value: Count of distinct keywords, shown as Percent of grand total
Metric Cards
Sum of Distinct MSV
Because the Top 20 report shows each keyword 20 times, we need to create a calculated measure in Power BI to only sum MSV for the unique list of keywords. Use this formula for that calculated measure:
Sum Distinct MSV = SUMX(DISTINCT('Table'[Keywords]), FIRSTNONBLANK('Table'[MSV], 0))
Keywords
This is just a distinct count of keywords
Slicer: PPC Conversions
Visual type: Slicer
Drop your PPC Conversions field into a slicer and set the format to “Between” to get this nifty slider visual.
Tables
Visual type: Table or Matrix (a matrix allows for drilling down similar to a pivot table in Excel)
Values: Here I have Category or Intent Stage and then the distinct count of keywords.
Pulling insights from your search intent dashboard
This dashboard is now a Swiss Army knife of data that allows you to slice and dice to your heart’s content. Below are a couple examples of how I use this dashboard to pull out opportunities and insights for my clients.
Where are competitors winning?
With this data we can quickly see who the top competing domains are, but what’s more valuable is seeing who the competitors are for a particular intent stage and category.
I start by filtering to the “Informational” stage, since it represents the most keywords in our dataset. I also filter to the top category for this intent stage which is “Blinds”. Looking at my Keyword Count card, I can now see that I’m looking at a subset of 641 keywords.
Note: To filter multiple visuals in Power BI, you need to press and hold the “Ctrl” button each time you click a new visual to maintain all the filters you clicked previously.
The top competing subdomain here is videos.blinds.com with visibility in the top 20 for over 250 keywords, most of which are for video results. I hit ctrl+click on the Video results portion of videos.blinds.com to update the keywords table to only keywords where videos.blinds.com is ranking in the top 20 with a video result.
From all this I can now say that videos.blinds.com is ranking in the top 20 positions for about 30 percent of keywords that fall into the “Blinds” category and the “Informational” intent stage. I can also see that most of the keywords here start with “how to”, which tells me that most likely people searching for blinds in an informational stage are looking for how to instructions and that video may be a desired content format.
Where should I focus my time?
Whether you’re in-house or at an agency, time is always a hot commodity. You can use this dashboard to quickly identify opportunities that you should prioritize first — opportunities that can guarantee you’ll deliver bottom-line results.
To find these bottom-line results, we’re going to filter our data using the PPC conversions slicer so that our data only includes keywords that have converted at least once in our PPC campaigns.
Once I do that, I can see I’m working with a pretty limited set of keywords that have been bucketed into intent stages, but I can continue by drilling into the “Transactional” intent stage because I want to target queries that are linked to a possible purchase.
Note: Not every keyword will fall into an intent stage if it doesn’t meet the criteria we set. These keywords will still appear in the data, but this is the reason why your total keyword count might not always match the total keyword count in the intent stages or category tables.
From there I want to focus on those “Transactional” keywords that are triggering answer boxes to make sure I have good visibility, since they are converting for me on PPC. To do that, I filter to only show keywords triggering answer boxes. Based on these filters I can look at my keyword table and see most (if not all) of the keywords are “installation” keywords and I don’t see my client’s domain in the top list of competitors. This is now an area of focus for me to start driving organic conversions.
Wrap up
I’ve only just scratched the surface — there’s tons that can be done with this data inside a tool like Power BI. Having a solid data set of keywords and visuals that I can revisit repeatedly for a client and continuously pull out opportunities to help fuel our strategy is, for me, invaluable. I can work efficiently without having to go back to keyword tools whenever I need an idea. Hopefully you find this makes building an intent-based strategy more efficient and sound for your business or clients.
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
via Blogger http://bit.ly/2EiaGpj
0 notes
Text
Build a Search Intent Dashboard to Unlock Better Opportunities
Posted by scott.taft
We've been talking a lot about search intent this week, and if you've been following along, you’re likely already aware of how “search intent” is essential for a robust SEO strategy. If, however, you’ve ever laboured for hours classifying keywords by topic and search intent, only to end up with a ton of data you don’t really know what to do with, then this post is for you.
I’m going to share how to take all that sweet keyword data you’ve categorized, put it into a Power BI dashboard, and start slicing and dicing to uncover a ton insights — faster than you ever could before.
Building your keyword list
Every great search analysis starts with keyword research and this one is no different. I’m not going to go into excruciating detail about how to build your keyword list. However, I will mention a few of my favorite tools that I’m sure most of you are using already:
Search Query Report — What better place to look first than the search terms already driving clicks and (hopefully) conversions to your site.
Answer The Public — Great for pulling a ton of suggested terms, questions and phrases related to a single search term.
InfiniteSuggest — Like Answer The Public, but faster and allows you to build based on a continuous list of seed keywords.
MergeWords — Quickly expand your keywords by adding modifiers upon modifiers.
Grep Words — A suite of keyword tools for expanding, pulling search volume and more.
Please note that these tools are a great way to scale your keyword collecting but each will come with the need to comb through and clean your data to ensure all keywords are at least somewhat relevant to your business and audience.
Once I have an initial keyword list built, I’ll upload it to STAT and let it run for a couple days to get an initial data pull. This allows me to pull the ‘People Also Ask’ and ‘Related Searches’ reports in STAT to further build out my keyword list. All in all, I’m aiming to get to at least 5,000 keywords, but the more the merrier.
For the purposes of this blog post I have about 19,000 keywords I collected for a client in the window treatments space.
Categorizing your keywords by topic
Bucketing keywords into categories is an age-old challenge for most digital marketers but it’s a critical step in understanding the distribution of your data. One of the best ways to segment your keywords is by shared words. If you’re short on AI and machine learning capabilities, look no further than a trusty Ngram analyzer. I love to use this Ngram Tool from guidetodatamining.com — it ain’t much to look at, but it’s fast and trustworthy.
After dropping my 19,000 keywords into the tool and analyzing by unigram (or 1-word phrases), I manually select categories that fit with my client’s business and audience. I also make sure the unigram accounts for a decent amount of keywords (e.g. I wouldn’t pick a unigram that has a count of only 2 keywords).
Using this data, I then create a Category Mapping table and map a unigram, or “trigger word”, to a Category like the following:
You’ll notice that for “curtain” and “drapes” I mapped both to the Curtains category. For my client’s business, they treat these as the same product, and doing this allows me to account for variations in keywords but ultimately group them how I want for this analysis.
Using this method, I create a Trigger Word-Category mapping based on my entire dataset. It’s possible that not every keyword will fall into a category and that’s okay — it likely means that keyword is not relevant or significant enough to be accounted for.
Creating a keyword intent map
Similar to identifying common topics by which to group your keywords, I’m going to follow a similar process but with the goal of grouping keywords by intent modifier.
Search intent is the end goal of a person using a search engine. Digital marketers can leverage these terms and modifiers to infer what types of results or actions a consumer is aiming for.
For example, if a person searches for “white blinds near me”, it is safe to infer that this person is looking to buy white blinds as they are looking for a physical location that sells them. In this case I would classify “near me” as a “Transactional” modifier. If, however, the person searched “living room blinds ideas” I would infer their intent is to see images or read blog posts on the topic of living room blinds. I might classify this search term as being at the “Inspirational” stage, where a person is still deciding what products they might be interested and, therefore, isn’t quite ready to buy yet.
There is a lot of research on some generally accepted intent modifiers in search and I don’t intent to reinvent the wheel. This handy guide (originally published in STAT) provides a good review of intent modifiers you can start with.
I followed the same process as building out categories to build out my intent mapping and the result is a table of intent triggers and their corresponding Intent stage.
Intro to Power BI
There are tons of resources on how to get started with the free tool Power BI, one of which is from own founder Will Reynold’s video series on using Power BI for Digital Marketing. This is a great place to start if you’re new to the tool and its capabilities.
Note: it’s not about the tool necessarily (although Power BI is a super powerful one). It’s more about being able to look at all of this data in one place and pull insights from it at speeds which Excel just won’t give you. If you’re still skeptical of trying a new tool like Power BI at the end of this post, I urge you to get the free download from Microsoft and give it a try.
Setting up your data in Power BI
Power BI’s power comes from linking multiple datasets together based on common “keys." Think back to your Microsoft Access days and this should all start to sound familiar.
Step 1: Upload your data sources
First, open Power BI and you’ll see a button called “Get Data” in the top ribbon. Click that and then select the data format you want to upload. All of my data for this analysis is in CSV format so I will select the Text/CSV option for all of my data sources. You have to follow these steps for each data source. Click “Load” for each data source.
Step 2: Clean your data
In the Power BI ribbon menu, click the button called “Edit Queries." This will open the Query Editor where we will make all of our data transformations.
The main things you’ll want to do in the Query Editor are the following:
Make sure all data formats make sense (e.g. keywords are formatted as text, numbers are formatted as decimals or whole numbers).
Rename columns as needed.
Create a domain column in your Top 20 report based on the URL column.
Close and apply your changes by hitting the "Edit Queries" button, as seen above.
Step 3: Create relationships between data sources
On the left side of Power BI is a vertical bar with icons for different views. Click the third one to see your relationships view.
In this view, we are going to connect all data sources to our ‘Keywords Bridge’ table by clicking and dragging a line from the field ‘Keyword’ in each table and to ‘Keyword’ in the ‘Keywords Bridge’ table (note that for the PPC Data, I have connected ‘Search Term’ as this is the PPC equivalent of a keyword, as we’re using here).
The last thing we need to do for our relationships is double-click on each line to ensure the following options are selected for each so that our dashboard works properly:
The cardinality is Many to 1
The relationship is “active”
The cross filter direction is set to “both”
We are now ready to start building our Intent Dashboard and analyzing our data.
Building the search intent dashboard
In this section I’ll walk you through each visual in the Search Intent Dashboard (as seen below):
Top domains by count of keywords
Visual type: Stacked Bar Chart visual
Axis: I’ve nested URL under Domain so I can drill down to see this same breakdown by URL for a specific Domain
Value: Distinct count of keywords
Legend: Result Types
Filter: Top 10 filter on Domains by count of distinct keywords
Keyword breakdown by result type
Visual type: Donut chart
Legend: Result Types
Value: Count of distinct keywords, shown as Percent of grand total
Metric Cards
Sum of Distinct MSV
Because the Top 20 report shows each keyword 20 times, we need to create a calculated measure in Power BI to only sum MSV for the unique list of keywords. Use this formula for that calculated measure:
Sum Distinct MSV = SUMX(DISTINCT('Table'[Keywords]), FIRSTNONBLANK('Table'[MSV], 0))
Keywords
This is just a distinct count of keywords
Slicer: PPC Conversions
Visual type: Slicer
Drop your PPC Conversions field into a slicer and set the format to “Between” to get this nifty slider visual.
Tables
Visual type: Table or Matrix (a matrix allows for drilling down similar to a pivot table in Excel)
Values: Here I have Category or Intent Stage and then the distinct count of keywords.
Pulling insights from your search intent dashboard
This dashboard is now a Swiss Army knife of data that allows you to slice and dice to your heart’s content. Below are a couple examples of how I use this dashboard to pull out opportunities and insights for my clients.
Where are competitors winning?
With this data we can quickly see who the top competing domains are, but what’s more valuable is seeing who the competitors are for a particular intent stage and category.
I start by filtering to the “Informational” stage, since it represents the most keywords in our dataset. I also filter to the top category for this intent stage which is “Blinds”. Looking at my Keyword Count card, I can now see that I’m looking at a subset of 641 keywords.
Note: To filter multiple visuals in Power BI, you need to press and hold the “Ctrl” button each time you click a new visual to maintain all the filters you clicked previously.
The top competing subdomain here is videos.blinds.com with visibility in the top 20 for over 250 keywords, most of which are for video results. I hit ctrl+click on the Video results portion of videos.blinds.com to update the keywords table to only keywords where videos.blinds.com is ranking in the top 20 with a video result.
From all this I can now say that videos.blinds.com is ranking in the top 20 positions for about 30 percent of keywords that fall into the “Blinds” category and the “Informational” intent stage. I can also see that most of the keywords here start with “how to”, which tells me that most likely people searching for blinds in an informational stage are looking for how to instructions and that video may be a desired content format.
Where should I focus my time?
Whether you’re in-house or at an agency, time is always a hit commodity. You can use this dashboard to quickly identify opportunities that you should be prioritizing first — opportunities that can guarantee you’ll deliver bottom-line results.
To find these bottom-line results, we’re going to filter our data using the PPC conversions slicer so that our data only includes keywords that have converted at least once in our PPC campaigns.
Once I do that, I can see I’m working with a pretty limited set of keywords that have been bucketed into intent stages, but I can continue by drilling into the “Transactional” intent stage because I want to target queries that are linked to a possible purchase.
Note: Not every keyword will fall into an intent stage if it doesn’t meet the criteria we set. These keywords will still appear in the data, but this is the reason why your total keyword count might not always match the total keyword count in the intent stages or category tables.
From there I want to focus on those “Transactional” keywords that are triggering answer boxes to make sure I have good visibility, since they are converting for me on PPC. To do that, I filter to only show keywords triggering answer boxes. Based on these filters I can look at my keyword table and see most (if not all) of the keywords are “installation” keywords and I don’t see my client’s domain in the top list of competitors. This is now an area of focus for me to start driving organic conversions.
Wrap up
I’ve only just scratched the surface — there’s tons that can can be done with this data inside a tool like Power BI. Having a solid data set of keywords and visuals that I can revisit repeatedly for a client and continuously pull out opportunities to help fuel our strategy is, for me, invaluable. I can work efficiently without having to go back to keyword tools whenever I need an idea. Hopefully you find this makes building an intent-based strategy more efficient and sound for your business or clients.
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
0 notes
Text
Build a Search Intent Dashboard to Unlock Better Opportunities
Posted by scott.taft
We've been talking a lot about search intent this week, and if you've been following along, you’re likely already aware of how “search intent” is essential for a robust SEO strategy. If, however, you’ve ever laboured for hours classifying keywords by topic and search intent, only to end up with a ton of data you don’t really know what to do with, then this post is for you.
I’m going to share how to take all that sweet keyword data you’ve categorized, put it into a Power BI dashboard, and start slicing and dicing to uncover a ton of insights — faster than you ever could before.
Building your keyword list
Every great search analysis starts with keyword research and this one is no different. I’m not going to go into excruciating detail about how to build your keyword list. However, I will mention a few of my favorite tools that I’m sure most of you are using already:
Search Query Report — What better place to look first than the search terms already driving clicks and (hopefully) conversions to your site.
Answer The Public — Great for pulling a ton of suggested terms, questions and phrases related to a single search term.
InfiniteSuggest — Like Answer The Public, but faster and allows you to build based on a continuous list of seed keywords.
MergeWords — Quickly expand your keywords by adding modifiers upon modifiers.
Grep Words — A suite of keyword tools for expanding, pulling search volume and more.
Please note that these tools are a great way to scale your keyword collecting but each will come with the need to comb through and clean your data to ensure all keywords are at least somewhat relevant to your business and audience.
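That combing-and-cleaning step can be sketched in a few lines of Python. This is a rough illustration only — the CSV layout, file path, and junk-term list are hypothetical, not part of the original workflow:

```python
# Merge keyword exports from several tools into one clean list.
# The CSV layout and the junk-term list below are hypothetical examples.
import csv

def load_keywords(path):
    """Read the first column of a one-keyword-per-row CSV export."""
    with open(path, newline="", encoding="utf-8") as f:
        return [row[0] for row in csv.reader(f) if row]

def clean(keywords, junk_terms):
    """Normalize, deduplicate, and drop keywords containing junk terms."""
    seen, out = set(), []
    for kw in keywords:
        kw = kw.strip().lower()
        if kw and kw not in seen and not any(j in kw for j in junk_terms):
            seen.add(kw)
            out.append(kw)
    return out
```

In practice the junk-term list grows as you eyeball the data — no automated filter fully replaces a manual comb-through.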
Once I have an initial keyword list built, I’ll upload it to STAT and let it run for a couple days to get an initial data pull. This allows me to pull the ‘People Also Ask’ and ‘Related Searches’ reports in STAT to further build out my keyword list. All in all, I’m aiming to get to at least 5,000 keywords, but the more the merrier.
For the purposes of this blog post I have about 19,000 keywords I collected for a client in the window treatments space.
Categorizing your keywords by topic
Bucketing keywords into categories is an age-old challenge for most digital marketers but it’s a critical step in understanding the distribution of your data. One of the best ways to segment your keywords is by shared words. If you’re short on AI and machine learning capabilities, look no further than a trusty Ngram analyzer. I love to use this Ngram Tool from guidetodatamining.com — it ain’t much to look at, but it’s fast and trustworthy.
After dropping my 19,000 keywords into the tool and analyzing by unigram (or 1-word phrases), I manually select categories that fit with my client’s business and audience. I also make sure the unigram accounts for a decent number of keywords (e.g. I wouldn’t pick a unigram that matches only 2 keywords).
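The unigram analysis itself is easy to reproduce if you’d rather not use an external tool. A minimal sketch, counting how many keywords contain each 1-word token (the sample keywords are invented):

```python
# Count how many keywords contain each unigram (1-word token).
# Sample keywords below are invented for illustration.
from collections import Counter

def unigram_counts(keywords):
    counts = Counter()
    for kw in keywords:
        for word in set(kw.lower().split()):  # set(): count each keyword once per word
            counts[word] += 1
    return counts

sample = ["white blinds near me", "blackout blinds", "curtain rod ideas"]
top = unigram_counts(sample).most_common(2)  # most frequent unigrams first
```

Sorting by count surfaces the candidate trigger words; you still pick the ones that make sense for the business by hand.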
Using this data, I then create a Category Mapping table and map a unigram, or “trigger word”, to a Category like the following:
You’ll notice that for “curtain” and “drapes” I mapped both to the Curtains category. For my client’s business, they treat these as the same product, and doing this allows me to account for variations in keywords but ultimately group them how I want for this analysis.
Using this method, I create a Trigger Word-Category mapping based on my entire dataset. It’s possible that not every keyword will fall into a category and that’s okay — it likely means that keyword is not relevant or significant enough to be accounted for.
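A Trigger Word-Category mapping like the one described can be sketched as a simple lookup. The trigger words here are illustrative stand-ins, not the client’s actual mapping:

```python
# Map a keyword to a category via its trigger word.
# These trigger words are illustrative, not the full client mapping.
CATEGORY_MAP = {
    "blinds": "Blinds",
    "curtain": "Curtains",
    "drapes": "Curtains",  # keyword variations grouped into one category
    "shades": "Shades",
}

def categorize(keyword):
    kw = keyword.lower()
    for trigger, category in CATEGORY_MAP.items():
        if trigger in kw:
            return category
    return None  # uncategorized is fine; the keyword may just not be relevant
```

Note how both “curtain” and “drapes” point at the same Curtains category, matching the grouping decision described above.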
Creating a keyword intent map
Similar to identifying common topics by which to group your keywords, I’m going to follow a similar process but with the goal of grouping keywords by intent modifier.
Search intent is the end goal of a person using a search engine. Digital marketers can leverage these terms and modifiers to infer what types of results or actions a consumer is aiming for.
For example, if a person searches for “white blinds near me”, it is safe to infer that this person is looking to buy white blinds as they are looking for a physical location that sells them. In this case I would classify “near me” as a “Transactional” modifier. If, however, the person searched “living room blinds ideas” I would infer their intent is to see images or read blog posts on the topic of living room blinds. I might classify this search term as being at the “Inspirational” stage, where a person is still deciding what products they might be interested in and, therefore, isn’t quite ready to buy yet.
There is a lot of research on some generally accepted intent modifiers in search and I don’t intend to reinvent the wheel. This handy guide (originally published in STAT) provides a good review of intent modifiers you can start with.
I followed the same process as building out categories to build out my intent mapping and the result is a table of intent triggers and their corresponding Intent stage.
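The intent-trigger table works the same way as the category mapping. A hedged sketch, using a few commonly cited modifiers (this list is illustrative only — build your own from the research mentioned above):

```python
# Classify a keyword into an intent stage by its modifier.
# These modifiers are an illustrative subset, not a complete taxonomy.
INTENT_MAP = {
    "near me": "Transactional",
    "buy": "Transactional",
    "ideas": "Inspirational",
    "how to": "Informational",
}

def intent_stage(keyword):
    kw = keyword.lower()
    for modifier, stage in INTENT_MAP.items():
        if modifier in kw:
            return stage
    return None  # no modifier matched; the keyword stays unstaged
```

Keywords that match no modifier simply fall outside the intent stages — which, as noted later, is why stage totals may not match the overall keyword count.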
Intro to Power BI
There are tons of resources on how to get started with the free tool Power BI, one of which is our own founder Wil Reynolds’ video series on using Power BI for Digital Marketing. This is a great place to start if you’re new to the tool and its capabilities.
Note: it’s not about the tool necessarily (although Power BI is a super powerful one). It’s more about being able to look at all of this data in one place and pull insights from it at speeds which Excel just won’t give you. If you’re still skeptical of trying a new tool like Power BI at the end of this post, I urge you to get the free download from Microsoft and give it a try.
Setting up your data in Power BI
Power BI’s power comes from linking multiple datasets together based on common “keys.” Think back to your Microsoft Access days and this should all start to sound familiar.
Step 1: Upload your data sources
First, open Power BI and you’ll see a button called “Get Data” in the top ribbon. Click that and then select the data format you want to upload. All of my data for this analysis is in CSV format so I will select the Text/CSV option for all of my data sources. You have to follow these steps for each data source. Click “Load” for each data source.
Step 2: Clean your data
In the Power BI ribbon menu, click the button called “Edit Queries." This will open the Query Editor where we will make all of our data transformations.
The main things you’ll want to do in the Query Editor are the following:
Make sure all data formats make sense (e.g. keywords are formatted as text, numbers are formatted as decimals or whole numbers).
Rename columns as needed.
Create a domain column in your Top 20 report based on the URL column.
Close and apply your changes by hitting the “Close & Apply” button.
Step 3: Create relationships between data sources
On the left side of Power BI is a vertical bar with icons for different views. Click the third one to see your relationships view.
In this view, we are going to connect all data sources to our ‘Keywords Bridge’ table by clicking and dragging a line from the field ‘Keyword’ in each table and to ‘Keyword’ in the ‘Keywords Bridge’ table (note that for the PPC Data, I have connected ‘Search Term’ as this is the PPC equivalent of a keyword, as we’re using here).
The last thing we need to do for our relationships is double-click on each line to ensure the following options are selected for each so that our dashboard works properly:
The cardinality is Many to 1
The relationship is “active”
The cross filter direction is set to “both”
We are now ready to start building our Intent Dashboard and analyzing our data.
Building the search intent dashboard
In this section I’ll walk you through each visual in the Search Intent Dashboard (as seen below):
Top domains by count of keywords
Visual type: Stacked Bar Chart visual
Axis: I’ve nested URL under Domain so I can drill down to see this same breakdown by URL for a specific Domain
Value: Distinct count of keywords
Legend: Result Types
Filter: Top 10 filter on Domains by count of distinct keywords
Keyword breakdown by result type
Visual type: Donut chart
Legend: Result Types
Value: Count of distinct keywords, shown as Percent of grand total
Metric Cards
Sum of Distinct MSV
Because the Top 20 report shows each keyword 20 times, we need to create a calculated measure in Power BI to only sum MSV for the unique list of keywords. Use this formula for that calculated measure:
Sum Distinct MSV = SUMX(DISTINCT('Table'[Keywords]), FIRSTNONBLANK('Table'[MSV], 0))
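To see why this measure works, here is a rough Python equivalent of the same dedup-then-sum logic (assuming rows of (keyword, MSV) pairs, with each keyword repeated up to 20 times as in the Top 20 report):

```python
# Rough Python equivalent of the DAX measure above:
# sum MSV once per distinct keyword, not once per ranking row.
def sum_distinct_msv(rows):
    first_seen = {}
    for keyword, msv in rows:
        first_seen.setdefault(keyword, msv)  # keep the first value, like FIRSTNONBLANK
    return sum(first_seen.values())
```

Without the dedup step, a keyword appearing 20 times in the report would have its search volume counted 20 times over.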
Keywords
This is just a distinct count of keywords
Slicer: PPC Conversions
Visual type: Slicer
Drop your PPC Conversions field into a slicer and set the format to “Between” to get this nifty slider visual.
Tables
Visual type: Table or Matrix (a matrix allows for drilling down similar to a pivot table in Excel)
Values: Here I have Category or Intent Stage and then the distinct count of keywords.
Pulling insights from your search intent dashboard
This dashboard is now a Swiss Army knife of data that allows you to slice and dice to your heart’s content. Below are a couple examples of how I use this dashboard to pull out opportunities and insights for my clients.
Where are competitors winning?
With this data we can quickly see who the top competing domains are, but what’s more valuable is seeing who the competitors are for a particular intent stage and category.
I start by filtering to the “Informational” stage, since it represents the most keywords in our dataset. I also filter to the top category for this intent stage which is “Blinds”. Looking at my Keyword Count card, I can now see that I’m looking at a subset of 641 keywords.
Note: To filter multiple visuals in Power BI, you need to press and hold the “Ctrl” button each time you click a new visual to maintain all the filters you clicked previously.
The top competing subdomain here is videos.blinds.com with visibility in the top 20 for over 250 keywords, most of which are for video results. I hit ctrl+click on the Video results portion of videos.blinds.com to update the keywords table to only keywords where videos.blinds.com is ranking in the top 20 with a video result.
From all this I can now say that videos.blinds.com is ranking in the top 20 positions for about 30 percent of keywords that fall into the “Blinds” category and the “Informational” intent stage. I can also see that most of the keywords here start with “how to”, which tells me that most likely people searching for blinds in an informational stage are looking for how to instructions and that video may be a desired content format.
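That “most keywords here start with ‘how to’” observation is easy to quantify outside the dashboard too. A quick sketch (sample data invented):

```python
# Share of keywords in a filtered subset that begin with "how to".
# Sample inputs are invented for illustration.
def howto_share(keywords):
    kws = [k.strip().lower() for k in keywords]
    return sum(k.startswith("how to") for k in kws) / len(kws)
```

A high share is a strong hint that instructional content — likely video, given the result types — is what this audience wants.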
Where should I focus my time?
Whether you’re in-house or at an agency, time is always a hot commodity. You can use this dashboard to quickly identify opportunities that you should be prioritizing first — opportunities that can guarantee you’ll deliver bottom-line results.
To find these bottom-line results, we’re going to filter our data using the PPC conversions slicer so that our data only includes keywords that have converted at least once in our PPC campaigns.
Once I do that, I can see I’m working with a pretty limited set of keywords that have been bucketed into intent stages, but I can continue by drilling into the “Transactional” intent stage because I want to target queries that are linked to a possible purchase.
Note: Not every keyword will fall into an intent stage if it doesn’t meet the criteria we set. These keywords will still appear in the data, but this is the reason why your total keyword count might not always match the total keyword count in the intent stages or category tables.
From there I want to focus on those “Transactional” keywords that are triggering answer boxes to make sure I have good visibility, since they are converting for me on PPC. To do that, I filter to only show keywords triggering answer boxes. Based on these filters I can look at my keyword table and see most (if not all) of the keywords are “installation” keywords and I don’t see my client’s domain in the top list of competitors. This is now an area of focus for me to start driving organic conversions.
Wrap up
I’ve only just scratched the surface — there’s tons that can be done with this data inside a tool like Power BI. Having a solid data set of keywords and visuals that I can revisit repeatedly for a client and continuously pull out opportunities to help fuel our strategy is, for me, invaluable. I can work efficiently without having to go back to keyword tools whenever I need an idea. Hopefully you find this makes building an intent-based strategy more efficient and sound for your business or clients.
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!