Ya... OK, but what 3 calls can I make because of this data?
(Photo credit: Entertainment.ie)
This was what I was asked after an hour-long meeting with one of the top entertainment agencies in the world, talking with them about working together on analytics projects.
What does this get me?
I’ve spent a career in the creation of data - or rather, creating data, and translating it into information through lots of hours, money, and teams of people.
Some companies consider data to be an asset. I agree and disagree. Chairs are assets, as are computers, buildings, and material goods.
If data is an asset, then it should be easy to see its worth.
In order to be an asset, it needs to be used: like cement to build a building, or programming to build the OS for a phone, or decision criteria helping you know when to launch your new electric vehicle and at what price point.
In the case of the agency who posed this challenge to me, it was the kind of question that you dream of hearing from a company. It means that they possess the keen understanding that everything in their business is designed to either enhance or detract from making goals move forward for the company.
I’ve only heard this question asked a handful of times, in 30 years.
As an analytics and business intelligence creator of data systems for nearly 30 years, I’ve seen data stores grow to exceed the capabilities of CPUs, memory, and storage again and again and... In each company, we huddled together and vowed to reduce our overall data storage, reviewed the types and volumes of information being gathered, trimmed them down to needs only, and analyzed reporting requirements precisely before agreeing to future data-increasing projects. Then we’d eventually forget about it all for another few years after buying the next amazing, eclipsing computers and storage systems. Massive, fast, a little bit pricier per unit, but soooo powerful... (RAID, SAN, parallel servers (Hadoop clusters), and now many splendored flavors of cloud).
Behavior like this can lead to a misuse of cloud computing too. Nothing prevents bad design except for great design practices.
“If fashion were easy, wouldn’t everyone look great?” - Tim Gunn
When storage and CPU and overall scalability are no longer huge concerns because you don’t have to build your own data center any more, there are far fewer barriers to over-creating data stores and even less encouragement for using them for something results-oriented.
I warn about this, because it’s a sine-wave pattern that follows Moore’s Law absolutely.
“...the number of transistors in a dense integrated circuit doubles approximately every two years. “
My time at Intel cemented this, since Gordon Moore was one of the company’s founders, and co-inventor of the integrated circuit at Fairchild.
Computing capability follows the same pattern, because the inventiveness of humanity and our desire for more collide and complement in an arm wrestle for dominance. In the early days it was led by hardware concerns. Now, it’s the internet, and our expectation of continuous, high-speed, ever-richer service and experience through networks and computers, TVs, and phones that have graduated to the role of appliances purveying our information, communication, and entertainment needs. What was a novelty in the 1980s and 1990s is now something that barely generates a shrug, unless service is down.
Today, hardware is still key and core, but the struggle to build faster, better, cheaper to fuel the rate of growth in the internet is a hidden one unless you’re Intel, AMD, Samsung, etc.
Technology grows. Data sizes grow. Investment grows. The singular best question remains, and if it is unanswered, then every single bit of what is considered a “data asset” should be questioned, and in some cases, radically shut down.
When I was at Intel during the 1990s and 2000s, I personally managed large-budget projects and teams whose end results were enormous, well-organized, strategically devised data stores, with excellent reporting. In the balance of things, the data was often used in a handful of reports that were printed on the FAB (factory) floor on clean-room paper and handed out manually to changing shifts of production engineers. Fancy dashboards gave way to a one-page printout, backed by hundreds, even thousands, of person-hours to create the statistical models and simple algebraic equations that resulted in informative numbers telling them to do a few simple tasks, but in the right way, at the right time, and with the right product. The data also categorized CPUs into their proper “bins” of MHz quality. Translation: how fast the CPU was, and what to charge for it.
That’s the environment I came from when I moved into the internet world during my baptism at MySpace for a brief but exciting time. Then, as I decided to become an analytics practitioner and director, I realized that analytics was back in the earlier days of creating tons of information and not always knowing what to do with it. But wasn’t Google Analytics, or Adobe, creating pretty reports? More events maybe? Dimensions? We can do that, with some hours of work.
Today, the right questions are emerging, and what I’m encouraged about is that while in L.A., the top of Hollywood has been asking them for years.
“What three (things) phone calls can I (do) make because of this data?”
What three decisions can you make from your data today, that result in increased revenue, or decreased cost, or an informative stance regarding the launch or removal of a major project and service shift for your company?
What information doesn’t boil up from the vats in the basement, to distill into a fine digestif to steady our stomachs so engorged with data?
The question is easy, but it has to be asked at the right levels in a company, and led into action by the top-most leader. Centralized embrace of analytics and business information direction is paramount, and that’s why this simple example question was so interesting.
The right question was asked, at the right level in the company, and it’s being led and directed by that person. The CEO.
One of the key tenets of the groundbreaking book on analytics, Competing on Analytics, is (paraphrasing) that it’s vitally important for the analytics competitor to have the process of creating analytics greatness led from the top of the company. It has to be their passion, and it has to be taken seriously and enforced at every level down in the company.
Is our information a revenue driver, or a revenue detractor? Are the costs for creating and storing our information today, tomorrow higher than the benefits we are currently accruing from our information resources? Are we still paying for past mistakes by keeping huge or costly information assets afloat? What are we going to get from this project, and when?
If you’re building a building, hopefully the costs don’t exceed the projected net revenue you will get from selling it, or renting out the space over the next century after it’s built. Hope isn’t a factor. Financials are, and buildings are generally built on clear targets relative to cost and time.
Since information is wobbly, easy to hide, and also easy to talk up as an asset, it’s super important not to be dazzled by its data-greatness by itself, and to remain focused on answering the question of the ages, which is just a restatement of the main driver question of this article.
Why does it have value?
If your CEO doesn’t believe in your information resource, and you can’t identify its value in a few sentences, then it doesn’t have value.
If you buy a million gallons of gasoline and store it underground in case you might need it someday, there are two main outcomes that can be easily anticipated with a little thought:
It goes bad. Stale gasoline isn’t usable.
It will someday be worthless. Electricity is going to win.
Data resources are similar. Don’t create them, buy them, or store them unless it’s possible to clearly identify a direct through-line from their existence to even a single use where they enhance your business. A data store could have value in a couple of years, and that’s OK, as long as it can be clearly traced to an absolute use case that achieves well-articulated business objectives.
CEOs, what three things can you do with your data today that improve business?
Assets can be sold. If you can’t sell your data, it isn’t an asset.
Ten Security Assists from GTM
I first started using GTM in about 2013. Before that, I was prevented from adopting it.
Even though I was Director of Analytics for a super duper big company with an $X00M advertising spend under my supervision, I couldn’t get the approval for the developer time needed to convert the sites and apps from analytics.js to the GTM snippet. Also there was a degree of fear associated with the concept of the “remote control” nature of a TMS for our websites.
Could malicious users/bots get control of our tagging and take our data?
Worse yet, could someone do js writes to our site and take over content?!
Would someone try to take away donut Friday?! (that was real)
Sometimes it comes down to fear. In my 25+ years in the software engineering, semiconductor automation, internet engineering, and analytics of everything industries, decision-making in tech hasn’t changed. We’re primates at heart and all of us have basic reactions to the unknown.
While our ancestors quaked in fear when they first saw fire, we now shudder during the technology review process. Our gut tightens. Heart rate elevates. The cerebral cortex gets flooded with blood while our left and right hemispheres get less, weakening reasoning while heightening our fight-or-flight reactions.
Except we’re in office chairs, maybe in Lululemon yoga garb, and we’re barely listening to a boring PowerPoint preso. A little “nam-myoho-renge-kyo” might have nailed it.
In my company, I saw it as an opportunity to raise the level of consciousness.
After many PowerPoints, lively discussions, and then a repeat of the exact same PowerPoints (yup, that does work), we all won by adopting GTM for good. From that point on we never looked back.
In the review process, security was a real concern for my president, primarily because he had read several articles outlining conceptual security issues in tag management systems. I’m not proposing that TMSs are free from security risk. But when comparing them to including all analytics and other tagging on the page in plain sight, it becomes clearer. On-page tagging means that anyone with a laptop, 60 seconds, a coffee, and Chrome developer tools can see, copy, hack, and control your content or data.
TMSs like GTM offer a layer of off-page obfuscation by design. The heart of GTM tagging and its logic doesn’t reside in the code at all. It’s in the hands of only a select few who have access in the cloud. By default, more security.
The Google GTM team is vigilant about security, and has created some amazing security features that also add to ease of use.
10 Key GTM Security Features
In-page tag whitelist/blacklist
This feature gives control back to site owners, the release process, or developers to determine the types of Tags, Triggers, and Variables that are allowed to run on your site. They are whitelisted or blacklisted based upon ID numbers associated with each element type, preset in GTM.
For example, Custom HTML tags can be simply disallowed. Some of my clients have chosen this exclusion as it provides not only security comfort, but high performance by excluding customization outside of standard tags.
It enables a secure set of “keys” that the company can use to approve / disapprove assets, making it possible for a business to nimbly change security allowances.
https://developers.google.com/tag-manager/devguide#restricting-tag-deployment
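A minimal sketch of how the restriction is wired up on the page, assuming the documented `gtm.blacklist` dataLayer key and the documented type IDs (`html` for Custom HTML tags, `customScripts`/`jsm` for custom JavaScript). The helper function and the plain array here stand in for the real page’s `window.dataLayer`:

```typescript
// Sketch: the restriction entry that must be on the dataLayer before the
// GTM container snippet executes. Type IDs follow Google's restriction docs:
// "html" = Custom HTML tags, "customScripts"/"jsm" = custom JavaScript.
type DataLayerEntry = Record<string, unknown>;

function buildRestriction(blockedTypes: string[]): DataLayerEntry {
  // On a real page this object is pushed onto window.dataLayer.
  return { "gtm.blacklist": blockedTypes };
}

// Stand-in for window.dataLayer in this self-contained sketch.
const dataLayer: DataLayerEntry[] = [];
dataLayer.push(buildRestriction(["html", "customScripts", "jsm"]));
```

The key point is ordering: the restriction push has to land before the container snippet runs, or the container loads unrestricted.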
Ability to require 2FA - Two Factor Approval for Changes:
My new favorite feature. Two-factor approval can be set up so that no one can do any of the following without 2FA:
Publish
Create/Edit JavaScript variables
Create/Edit custom HTML tags
Anything in User Settings
https://support.google.com/tagmanager/answer/4525539?hl=en
User Permissions
At the container level, users can be granted or revoked access.
Read, Edit, Approve, and Publish are all separate levels to be approved.
https://support.google.com/tagmanager/answer/6107011?hl=en
Approval Workflow
For 360 account holders, the approval workflow using GTM Workspaces makes it possible to allow multiple groups or users to simultaneously make changes in isolation.
These workspaces can then enter an approval flow, enforced by User Permissions in #3, and a process flow.
See Simo’s article: https://www.simoahava.com/analytics/new-approval-workflow-gtm-360/#comments
Testing and Release: Preview Mode & Environments
Preview Mode:
Preview mode is a super great option for debug testing container changes.
Simply turn on Preview mode, and the debugger window will pop up on your site when you launch it. I’m especially grateful for the ability to dive into the dataLayer and variables at any event.
Preview mode was updated a while ago to let you simply “refresh” it with a click, allowing you to do live edits while still in preview.
https://support.google.com/tagmanager/answer/6107056?hl=en
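To make concrete what the Preview debugger shows you, here is a simplified model of how a dataLayer value resolves at an event: the most recent push that set a key wins. (GTM’s real merge behavior is more involved, especially for nested objects, and the `add_to_cart` event and its fields below are hypothetical.)

```typescript
// Self-contained stand-in for window.dataLayer with a couple of pushes.
const dl: Record<string, unknown>[] = [];
dl.push({ event: "gtm.js" });
dl.push({ event: "add_to_cart", value: 19.99, currency: "USD" }); // hypothetical event

// Resolve a key roughly the way the Preview debugger displays it:
// scan pushes from newest to oldest and return the first hit.
function resolve(layer: Record<string, unknown>[], key: string): unknown {
  for (let i = layer.length - 1; i >= 0; i--) {
    if (key in layer[i]) return layer[i][key];
  }
  return undefined;
}
```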
Environments
This capability is an amazing way to do what we used to do in the olden days by creating a test container that resided in a test environment on our site, and then was manually disabled and enabled, etc.
Instead, we now have the ability to create multiple server environments, configured in GTM at the container. Great for isolating Dev, Test, and Prod without impacting any of them outside of our release process. Bravo!
https://support.google.com/tagmanager/answer/6311518?hl=en
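The mechanics are simple: each environment gets its own snippet, which differs from the default only in a couple of query parameters (`gtm_auth` and `gtm_preview`) that GTM generates per environment. A sketch, where the container ID and auth value are placeholders:

```typescript
// Build the gtm.js URL for a container, optionally pinned to a specific
// GTM environment. gtm_auth and gtm_preview are the per-environment
// parameters GTM generates for you; the values passed in a real page
// come straight from the environment's install snippet.
function gtmScriptUrl(containerId: string, auth?: string, previewEnv?: string): string {
  let url = `https://www.googletagmanager.com/gtm.js?id=${containerId}`;
  if (auth !== undefined && previewEnv !== undefined) {
    url += `&gtm_auth=${auth}&gtm_preview=${previewEnv}&gtm_cookies_win=x`;
  }
  return url;
}
```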
Change History
Very simply, container versioning is a great process for clearly identifying releases by name, purpose, creator, and creation date. It creates a permanent tag audit trail.
Also, version history provides a side by side comparison of before and after changes, even highlighting the changes in a different color. It’s awesome, and saves hours of time debugging, and in identifying potentially unwelcome changes to the container.
Versions can be rolled back or removed entirely.
Programmatic API
Welcome to remote control for your remote control. With the GTM APIs, you can build secure, 2FA-protected tools to monitor and take action on your containers. Simo has created a pretty amazing set of GTM Tools using the GTM APIs.
Here’s the list of what you can view or change from an authorized account: Accounts, Containers, Workspaces, Tags, Triggers, Folders, Built-In Variables, Variables, Container Versions, Container Version Headers, User Permissions, Environments.
https://developers.google.com/tag-manager/api/v2/
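As a hedged sketch of what a v2 call looks like, here is how a "list containers" request might be assembled; the path shape follows the v2 REST docs, the account ID and OAuth token are placeholders, and in practice you would likely use an official Google client library rather than raw REST:

```typescript
// Assemble (but don't send) a GTM API v2 containers.list request.
// accountId and accessToken are placeholders, not real values.
interface ApiRequest {
  method: string;
  url: string;
  headers: Record<string, string>;
}

function containersListRequest(accountId: string, accessToken: string): ApiRequest {
  return {
    method: "GET",
    url: `https://tagmanager.googleapis.com/tagmanager/v2/accounts/${accountId}/containers`,
    headers: { Authorization: `Bearer ${accessToken}` },
  };
}
```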
Vendor Tag Template Program
GTM has created a huge and ever-growing list of pre-baked integrations with key vendors called the “Vendor Tag Template Program”.
For security, this is a huge win. The code behind these integrations is buried deep in GTM-land. Only Google has access and it’s bullet-proofed with each vendor in the acceptance program.
It removes the need to create Custom HTML tags or custom JavaScript variables for each vendor tag that gets created. This by itself removes the risk of malware and security issues involved with custom tags. TMSs that rely on custom HTML tags (several do) are the most exposed to security risks. GTM is actively working to make those undesirable by providing excellent vendor integrations.
See here for an updated List of GTM Vendor Tags on Wonkydata.com (will update as they are released).
Malware Scanning
Without us knowing, GTM automatically scans your containers for malware. If any is found, it blocks the container from firing and signals issues.
It’s totally in Google’s interest to always make sure your containers fire, and that they fire completely free of malware and malicious code. Continual updates to their detection methods will ensure persistence and accuracy in KTBR (keeping the business running).
https://support.google.com/tagmanager/answer/6328489?hl=en&ref_topic=3441532
Contextual Auto-escaping
Another behind-closed-doors GTM security precaution that helps remove and prevent unsafe HTML and XSS from running in custom tags.
Custom tags are analyzed, “sanitized” and escaped by a robust engine.
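Purely as an illustration of what “escaping” means here, a toy single-context HTML escaper is sketched below. GTM’s actual engine is contextual - it escapes differently for HTML bodies, attributes, URLs, and JavaScript - and is far more thorough than this:

```typescript
// Toy illustration of HTML escaping: neutralize the characters that could
// start markup or break out of a quoted attribute. A real contextual
// auto-escaper picks its rules based on where the value lands.
function escapeHtml(s: string): string {
  return s
    .replace(/&/g, "&amp;")   // must run first, or later entities get double-escaped
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;")
    .replace(/"/g, "&quot;");
}
```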
Check in for updates on security items for GTM.
Google likes making things secure. It keeps business going, makes ads render, and makes it possible for content and user experiences to grow like fields of wild flowers.
GTM is an incredible tool. It’s become a ubiquitous tag management system: deeply featured for full-path staging and production releases, equipped with excellent security precautions, scalable to any enterprise, constantly growing, and totally free.