I cannot believe how fast misinformation spread about ai energy use like I get it you want to have ethical backing for your anti ai argument and discussions of things like “labor laws” and “copyright” are much more complicated than invoking global warming or the human soul but like jesus. It’s becoming very clear to me that most people don’t know how the internet or technology works at all.
#like im starting to think you guys don’t actually care about artists and just want to say shit…#most of the articles I’ve found abt this are either projections for if ai use keeps growing at its current rate (doubtful) or comparing it#to things like a single plane flight#the comparisons are never to smth like bitcoin mining or a city’s electricity or something actually on its scale#or even crediting ai with the increased amount of data centers like we haven’t needed more processing power As Humans with the more complex#shit online and being calculated by scientists rn#and all of this is only to train the model. not even to use it.#and like to be clear. I am an artist who has lost jobs to ai#but it’s here now and we have to think about living with it into the future#warlock wartalks
Top New Technology Trends for 2021
Today's technology is advancing rapidly, and the pace of change itself keeps accelerating. It's not just technology trends and cutting-edge technologies that are evolving, either. A lot more has changed this year due to the COVID-19 outbreak, and IT professionals are realizing that their role in tomorrow's contactless world will not stay the same. An IT expert in 2020-21 will constantly learn, unlearn, and relearn (out of necessity, if not out of desire).
What does that mean for you? It means staying up to date with new technology trends, and keeping an eye on the future to know which skills you will need to secure a safe job tomorrow and how to get there. With most of the global IT workforce working from home because of the pandemic, here are the top new technology trends to watch and try out in 2021, with an eye toward landing one of the jobs these trends will create.
Data Science
Data science is a universe in itself. It is a systematic study of raw data and making insightful observations. From those observations, one can take relevant actions to establish a goal. Data acquisition, data cleaning, feature engineering, modeling, and visualization are some major parts of this universe.
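The pipeline stages named above can be sketched end to end with a toy, standard-library-only example. The dataset and the "high value" threshold rule are invented purely for illustration:

```python
# 1. Data acquisition: raw records, some with missing values.
raw = [
    {"user": "a", "visits": 10, "spend": 120.0},
    {"user": "b", "visits": None, "spend": 35.0},  # missing value
    {"user": "c", "visits": 3, "spend": 20.0},
    {"user": "d", "visits": 8, "spend": 95.0},
]

# 2. Data cleaning: drop records with missing fields.
clean = [r for r in raw if all(v is not None for v in r.values())]

# 3. Feature engineering: derive spend-per-visit.
for r in clean:
    r["spend_per_visit"] = r["spend"] / r["visits"]

# 4. Modeling: a trivial rule that flags above-average users.
avg = sum(r["spend_per_visit"] for r in clean) / len(clean)
for r in clean:
    r["high_value"] = r["spend_per_visit"] > avg

high_value_users = [r["user"] for r in clean if r["high_value"]]
print(high_value_users)  # ['a', 'd']
```

Visualization, the final stage, would typically chart these derived features; it is omitted here to keep the sketch dependency-free.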
Artificial Intelligence (AI) and Machine Learning
Artificial Intelligence (AI) has generated a lot of buzz over the last decade, but it remains one of the new technology trends because its effects on how we live, work, and play are still in their early stages. AI is already known for its superiority in image and speech recognition, navigation apps, smartphone personal assistants, ride-sharing apps, and much more.
In addition, AI will continue to be used to analyze interactions to uncover underlying connections and insights, to predict demand for services such as hospitals so authorities can make better decisions about resource use, and to detect changing patterns of customer behavior by analyzing data in near real time, driving revenue growth and enhancing personalized experiences.
The AI market will grow to $190 billion by 2025, with worldwide spending on cognitive and AI systems reaching over $57 billion in 2021. With the spread of AI across different sectors, new jobs are being created in development, programming, testing, support, and maintenance, to name a few. On the flip side, AI also offers some of the highest salaries today, ranging from over $125,000 per year (machine learning engineer) up to $145,000 per year (AI architect), making it the top new technology trend to watch!
Machine learning, a subset of AI, is also being deployed across all industries, creating enormous demand for skilled professionals. Forrester predicts that AI, machine learning, and automation will create 9 percent of new jobs in the U.S. by 2025, including robot monitoring professionals, data scientists, automation specialists, and content curators. This is another new technology trend you need to consider!
Robotic Process Automation
Like AI and machine learning, Robotic Process Automation (RPA) is another technology that automates jobs. RPA is the use of software to automate business processes such as interpreting applications, processing transactions, handling data, and even answering emails. RPA automates repetitive tasks that people used to do.
Although Forrester Research estimates that RPA automation will jeopardize the livelihoods of 230 million or more knowledge workers or about 9 percent of the global workforce, RPA is also creating new jobs and transforming existing jobs. McKinsey notes that less than 5 percent of occupations can be fully automated, but about 60 percent can be partially automated.
For you as an IT professional looking to the future and trying to understand new technology trends, RPA offers numerous career opportunities, including developer, project manager, business analyst, solution architect, and consultant. And these jobs pay well: an RPA developer can make over £534,000 a year. This is the next technology trend to keep an eye on!
Edge Computing
Cloud computing used to be a new technology trend and has become mainstream. The main players AWS (Amazon Web Services), Microsoft Azure, and Google Cloud Platform dominate the market. The acceptance of cloud computing continues to grow as more and more companies migrate to a cloud solution. However, it is no longer an emerging technology trend. Edge is.
As the amount of data businesses deal with continues to grow, they have in some situations recognized the shortcomings of cloud computing. Edge computing was designed to solve some of these problems, bypassing the latency that cloud computing incurs when data must travel to a data center for processing. Edge computing can exist, if you will, "on the edge," closer to where computing needs to take place. Because of this, it can be used to process time-sensitive data at remote locations with limited or no connectivity to a central location. In such situations, edge computing can act as a small data center.
Edge computing will increase as the use of IoT (Internet of Things) devices increases. The global edge computing market is projected to reach $6.72 billion by 2022. This new technology trend is only expected to grow, creating a variety of jobs, especially for software developers.
Quantum Computing
The next notable technology trend is quantum computing, a form of computing that uses quantum phenomena such as superposition and quantum entanglement. This amazing technology trend is also helping to prevent the spread of the coronavirus and developing potential vaccines as data can be easily queried, monitored, analyzed, and processed regardless of the source. Another area where quantum computing has applications is in banking and finance for managing credit risk for high-frequency trading and fraud detection.
Quantum computers are now many times faster than regular computers, and big brands like Splunk, Honeywell, Microsoft, AWS, Google, and many others are now involved in innovations in the quantum computing field. Revenues for the global quantum computing market are projected to exceed $2.5 billion by 2029. To make a name for yourself in this new trending technology, you need experience in quantum mechanics, linear algebra, probability, information theory, and machine learning.
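To make "superposition" slightly more concrete, here is a minimal single-qubit sketch in plain Python. The state-vector representation and the Hadamard gate are standard textbook material, not something from this article:

```python
import math

# A single qubit as a 2-entry state vector [amplitude of |0>, amplitude of |1>].
ket0 = [1.0, 0.0]  # the qubit starts definitely in state |0>

# The Hadamard gate puts |0> into an equal superposition of |0> and |1>.
H = [[1 / math.sqrt(2), 1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

def apply(gate, state):
    """Multiply a 2x2 gate matrix by a 2-entry state vector."""
    return [sum(gate[i][j] * state[j] for j in range(2)) for i in range(2)]

superposed = apply(H, ket0)
probs = [abs(a) ** 2 for a in superposed]  # Born rule: |amplitude|^2
print(probs)  # ~[0.5, 0.5]: each measurement outcome is equally likely
```

This is only a classical simulation of the math, of course; the speedups come from hardware that manipulates many entangled qubits at once.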
Blockchain
Although most people think of blockchain technology in terms of cryptocurrencies like Bitcoin, blockchain provides security that is useful in many other ways. In the simplest case, blockchain can be described as data that you can only add, not remove, or change. Hence the term "chain" because you are creating a data chain. Not being able to change the previous blocks is what makes them so safe. In addition, blockchains are consensus-driven so that no one can take control of the data. With blockchain, you don't need a trusted third party to monitor or validate transactions.
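The "chain of data you can only add to" idea can be sketched in a few lines. This is a toy illustration of hash linking, not any production blockchain (there is no consensus protocol or proof of work here):

```python
import hashlib
import json

# Each block stores the hash of the previous block, so editing any earlier
# block changes its hash and breaks every link after it.

def block_hash(block):
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def add_block(chain, data):
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"index": len(chain), "data": data, "prev_hash": prev})

def is_valid(chain):
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

chain = []
add_block(chain, "genesis")
add_block(chain, "alice pays bob 5")
add_block(chain, "bob pays carol 2")
assert is_valid(chain)

chain[1]["data"] = "alice pays bob 500"  # tamper with history
print(is_valid(chain))  # False: the broken hash link exposes the change
```

The consensus mechanisms mentioned above decide which honest copy of the chain wins when participants disagree; the hash links are what make tampering detectable in the first place.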
Blockchain is being adopted and implemented across several industries, and as the use of blockchain technology increases, so does the need for skilled professionals. At a high level, a blockchain developer specializes in developing and implementing architectures and solutions using blockchain technology. The average annual salary of a blockchain developer is £469,000.
If you are fascinated by blockchain and its applications and want to start your career in this trending technology, this is the right time to start. To get started with blockchain, you must have hands-on experience with programming languages, the basics of OOPS, flat and relational databases, data structures, web app development, and networking.
Internet of Things
Another promising new technology trend is the IoT. Many “things” are now created with WiFi connectivity, which means they can be connected to the Internet and each other. Hence the Internet of Things or IoT. The Internet of Things is the future and has already made it possible to connect devices, household appliances, cars, and much more with data and exchange data via the Internet.
As consumers, we already use and benefit from IoT. We can remotely lock our doors if we forget to on the way to work, and preheat our ovens on the way home, all while tracking our fitness on our Fitbits. But companies also have a lot to gain, now and in the near future. The IoT can enable organizations to operate more securely and efficiently and make better decisions as data is collected and analyzed. It can enable predictive maintenance, expedite medical care, improve customer service, and offer benefits we haven't even imagined.
And we are only at the beginning of this new technology trend: According to forecasts, around 50 billion of these IoT devices will be in use worldwide by 2030, creating a huge network of interconnected devices that includes everything from smartphones to kitchen appliances. Global spending on the Internet of Things (IoT) is projected to reach $ 1.1 trillion in 2022. New technologies like 5G are expected to drive market growth in the coming years.
If you want to get into this trending technology, you need to be familiar with the basics of information security, AI and machine learning, networking, hardware interfacing, data analytics, automation, and embedded systems, and to have device and design knowledge.
5G
The next technology trend following the IoT is 5G. Where 3G and 4G technologies have enabled us to surf the internet, use data-driven services, and enjoy increased bandwidth for streaming on Spotify or YouTube, 5G services are expected to revolutionize our lives by enabling services based on advanced technologies such as AR and VR, as well as cloud-based gaming services such as Google Stadia, NVidia GeForce Now, and many more. It is expected to be used in factories, in HD cameras that help improve safety and traffic management, in smart grid control, and in smart retail.
Almost every telecommunications company, including Verizon, T-Mobile, Apple, Nokia Corp., and Qualcomm, is currently working on building 5G applications. 5G services are expected to roll out globally in 2021, and by the end of 2021, more than 50 operators should be offering services in around 30 countries. This is a new technology trend you need to watch out for, and secure a place in.
Cyber Security
Cybersecurity doesn't seem like an emerging technology since it's been around for a while, but it's evolving in the same way as other technologies. In part, this is because threats are always new. The malicious hackers trying to illegally access data will not give up anytime soon and will continue to find ways to tackle even the toughest security measures. In part, this is because new technologies are being adapted to improve security. As long as we have hackers, cybersecurity will remain a trending technology as it is constantly evolving to defend itself against these hackers.
As evidence of the strong need for cybersecurity professionals, the number of cybersecurity jobs is growing three times faster than other tech jobs. The need for proper cybersecurity is also so great that an estimated $6 trillion will be spent on cybersecurity worldwide by 2021.
Note that as challenging as the field may be, it also offers lucrative six-figure incomes. Roles range from ethical hacker to security engineer to chief security officer, offering a promising career path for anyone who wants to enter and stick with this evergreen trending technology.
For future reference
Python Online Training
How to Upscale Video Content to 4K, 8K, and Beyond
For the past three months, I’ve been working on what I’ve named the Deep Space Nine Upscale Project (DS9UP). The goal of DS9UP is to create a new, much-improved version of the show by applying modern processing techniques to the original DVD source before using AI-based software to create a higher-resolution version of the show. It’s had me thinking about upscaling and upscalers in general. Upscaling isn’t a feature we talk about much, but how your TV handles it (or, alternately, how you use the capability on a PC) can have a significant impact on how you experience content.
The word “upscale” generically means “to improve the value or quality of something.” In the video and PC space, it’s almost always a reference to increasing the effective resolution of a piece of content. There is typically an understood difference between upscaling and native resolution. If you upscale a 1080p video into 4K, it means you are taking a 1080p input and using a combination of hardware and software to create a larger image. This upscaled image will not be identical to a native 4K signal, but it should offer a better picture than what was previously available on your 720p or 1080p television.
Keyword: “should.” Video scaler quality in TVs can vary widely between different product families. In some cases, you might be better off using a GPU to drive a picture than relying on the TV’s native rescaling capability, while other TVs have excellent upscalers. Manufacturers rarely disclose their upscaling hardware choices, but higher-end TVs should have improved upscaling capabilities. If you have a UHD Blu-ray player paired with an older or lower-quality 1080p or 4K TV, you might even get better results by running all video signals through the Blu-ray player rather than the television. Generally speaking, a good TV upscaler is considered to be as good or better than a GPU.
How the Scaler Sausage Gets Made
The most basic function of a video scaler is to take whatever image it receives — 480i, 720p, 1080p — and stretch it across the entire screen. Without this functionality, a 1080p signal would take up just a fraction of a 4K television’s display. This simple resizing is typically done by taking each individual 1080p pixel and creating four pixels out of it (remember, 4K is four times the pixels of 1080p).
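That 1:4 mapping is just integer nearest-neighbour resizing, which can be sketched in a few lines of Python (a tiny 2x2 grid of numbers stands in for real pixels):

```python
def upscale(image, s):
    """Nearest-neighbour upscale: each source pixel becomes an s-by-s block.

    s=2 turns one pixel into four, the 1080p -> 4K case described above.
    """
    return [[row[x // s] for x in range(len(row) * s)]
            for row in image
            for _ in range(s)]  # repeat each widened row s times

frame = [[1, 2],
         [3, 4]]
print(upscale(frame, 2))
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

Every output pixel is an exact copy of a source pixel, which is why this basic approach preserves blockiness rather than adding detail.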
But many TVs and Blu-ray players do more than perform a simple 1:4 mapping. They also use video processing techniques to extrapolate what details ought to be present in the scene. How well this works depends on the type of content being upscaled.
Image by DisplayNinja
In the image above, you can see how the upscaled 4K is much more nuanced than the simple 1:4 mapping in the second grid from the left. If you’re having trouble seeing the difference between the 1080p upscale and the native 4K, look at the left-side blocks in the very first row and the right-side blocks in the very last row. The native 4K image resolves into distinctly different colors than the 1080p image in R1C2 (That’s 1st row, 2nd column), R1C3, and R8C8. As the amount of available horsepower in televisions has improved, the quality and sophistication of their integrated upscalers have grown as well. Some modern TVs have sophisticated sharpening algorithms to reverse the blur created by upscaling and interpolation algorithms good enough to almost match the precision of a native 4K signal.
How does all this look with real content? A 2017 Rtings article can help answer that question. The first image below is from a 4K set displaying 1080p in upscaled 4K, while the second is native 1080p.
Image by Rtings
If you have trouble seeing a difference between the two images, open both of them in a new tab and focus your eyes near the center of the image. See the house with a brown roof near the center of the image, with three windows facing approximately south-southwest and a fourth pointed east? (All directional cues based on north being “up”, not the direction of the sunlight). Look at that specific spot in both images, and the roofs in the buildings immediately adjacent. The difference should jump out at you. In this case, even using TVs that date back to 2015, 1080p content upscaled to 4K looks better than the native 1080p image.
Image by Rtings
If you have an older TV or a budget 4K model, there’s one obvious method of improving your TV’s upscaling: Buy a better television. Unfortunately, it’s impossible to predict how well this will work without knowing exactly what you own now and what you plan to purchase to replace it. The older your current TV, the better the chances that a new set will deliver upgrades in all respects, but many of those improvements may have nothing to do with the way your upscaler handles <4K content.
If you aren’t happy with your current TV, can’t replace it at the moment, and happen to own a high-end Blu-ray or UHD player, you can also try running content through its upscaler rather than relying on the television to handle it. In some cases, a top-end UHD Blu-ray player may deliver a better experience than an entry-level 4K TV from a few years back. If you’re still using a DVD player to feed a picture to a mediocre 1080p or 4K panel when/if you play DVDs, and you can swap over to a high-end Blu-ray / UHD Blu-ray player instead, I’d try it. It may or may not help, but it definitely won’t hurt. What you’re trying to do here is route the signal through the upscaler that’ll give it the best quality kick.
Still need a higher-quality picture? You’re in luck.
Real-Time AI Processing
I haven’t tested the most recent Nvidia Shield myself, but there’s a demo you can actually play with on Nvidia.com to apply the effect the TV offers. Here’s a screenshot of the effect. I’ve positioned the slider over the lizard’s eye because it’s the easiest place to see the upscaler’s impact:
Left: Regular upscale Right: Nvidia Shield
Still not clear? Here’s an enlarged version of the same screenshot.
The image on the left is a traditional upscaler, the image on the right is Nvidia’s Shield when asked to scale up 720p or 1080p content to 4K (content below 720p is not supported for upscaling, at least not yet). The AI component of the upscaler obviously improves the overall image quality. In my experience with applications like TVEAI, this is a fair representation of the improvements that can be achieved.
Third-party reviews of the Shield agree. Slashgear writes that when it works, the effect is “fairly astonishing.” So far as I’m aware, the Shield is currently the only set-top box offering this kind of functionality.
Video Upscaling via Third-Party Software
Finally, there’s the option to use a third-party upscaler, like Topaz Video Enhance AI. I’ve made extensive use of TVEAI as part of the Deep Space Nine Upscale Project, and can confirm that the application is capable of yielding stunning results. One major limitation of TVEAI, however, is that it currently only supports Intel and Nvidia platforms for GPU-accelerated processing. CPU processing is available, but likely too slow to be all that useful.
Thanks to AI-based upscaling, historic sci-fi canards like the “Enhance” function are now a reality. It’s a truism in video encoding that no application on Earth can put data back where it never existed, and that’s still true today. The reason we can now “enhance” images to improve their clarity is that AI-based applications are capable of analyzing a video field and estimating what detail would exist if the image were in higher quality already.
One of the typical ways to train a video-enhancing application like TVEAI is to provide the neural net with the same image or video in both high and low quality. The neural net is then tasked with finding a way to make the low-quality source look as much like the high-quality source as possible. Instead of trying to teach a computer what lines and curves look like by providing painstaking examples, we’ve developed the ability to make the computer do the work of teaching itself. That’s why “Enhance” has gone from a complete joke to a plausible reality in a matter of a few years.
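A rough sketch of that pairing idea, assuming a simple 2x box filter as the "degradation" and an L1 pixel loss as the score (real training pipelines use learned networks and far richer losses; the numbers here are invented):

```python
def downscale2x(img):
    """Average each 2x2 block into one pixel: the 'low quality' input."""
    return [[(img[y][x] + img[y][x + 1] + img[y + 1][x] + img[y + 1][x + 1]) / 4
             for x in range(0, len(img[0]), 2)]
            for y in range(0, len(img), 2)]

def l1_loss(a, b):
    """Sum of absolute pixel differences between two same-sized images."""
    return sum(abs(pa - pb) for ra, rb in zip(a, b) for pa, pb in zip(ra, rb))

hi = [[10, 10, 50, 50],
      [10, 10, 50, 50],
      [90, 90, 30, 30],
      [90, 90, 30, 30]]

lo = downscale2x(hi)  # the degraded training input paired with hi

# A naive "reconstruction": nearest-neighbour upscale of the low-res frame.
guess = [[row[x // 2] for x in range(4)] for row in lo for _ in range(2)]
print(l1_loss(guess, hi))  # 0 here, since each 2x2 block was uniform
```

Training amounts to adjusting a model so that its reconstruction drives this kind of loss toward zero across millions of such pairs; the uniform blocks above just make the toy case easy to verify by hand.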
You may need to zoom to see subtle differences, but this isn’t an upscale comparison. It’s a comparison of performing my upscale work in separate applications and steps (left-side) versus ingesting the entire video workload into Topaz Video Enhance AI using the AVFS virtual file system and performing the entire operation at once. If you’re thinking “Wow, that’s a tiny difference, who would even care?” well, you may be right — but I wasn’t kidding when I said I was going to use the highest-quality source possible.
Applications like Topaz Video Enhance AI can cost several hundred dollars and they don’t run in real-time — the RTX 2080 appears capable of 90-110 frames per minute when upscaling 640×480 video to 2560×1920. The result of using these applications, however, is a vastly better picture than you’ll see from any other source.
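Taking those throughput figures at face value, a quick back-of-the-envelope calculation shows what they mean in practice. The 45-minute episode length and 23.976 fps frame rate are assumptions for illustration, not figures from the article:

```python
episode_minutes = 45
fps = 23.976  # typical NTSC film-content frame rate

total_frames = episode_minutes * 60 * fps  # roughly 64,700 frames

for rate in (90, 110):  # the quoted frames-per-minute range
    hours = total_frames / rate / 60
    print(f"{rate} fpm -> {hours:.1f} hours")
# 90 fpm -> 12.0 hours
# 110 fpm -> 9.8 hours
```

In other words, a single episode can tie up a GPU for the better part of a day, which is why these tools are batch processors rather than real-time players.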
DVD source on the left, upscale on the right.
I suspect we’ll see AI video processing and upscaling become more important over time as Intel, Nvidia, and AMD introduce their next-generation graphics processors. There’s an awful lot of old content orphaned on bad-quality source, and building specialized AI processors to fix it will likely consume some silicon for all of the concerned parties over the next few years.
Finally, I’ve embedded the current version of DS9UP’s opening credits for the show below. This is new footage that hasn’t appeared in a previous DS9UP article. There is no audio in this clip and you’ll need to raise the resolution to 4K, but I’ve created this file based on the DVD source with modifications in AviSynth, DaVinci Resolve, and Topaz Video Enhance AI.
youtube
Compare that to the actual credits ripped from the DVD and uploaded to YouTube by yours truly:
youtube
This is what the credits look like if you watch them on the source DVDs. The improvements from this version to my own are not small.
Upscalers are amazing and only getting better. That’s true no matter how you consume content or what technology you use to do it. Depending on the shows you like and how much time you want to sink into the project, there are tremendous improvements to be had… or you can just wait a few years, and buy a better TV. You can read more about DS9UP at the links below.
Now Read:
Deep Space Nine Upscale Project Update: Variable Frame Rate DVDs Can Burn in Hell
Deep Space Nine Upscale Project Update: ‘Sacrifice of Angels’
Upscaling Star Trek: Deep Space Nine Using Topaz Video Enhance AI
PCMag: The Best TVs for 2020
Originally published at ExtremeTech: https://www.extremetech.com/extreme/310029-how-to-upscale-video-content-to-4k-8k-and-beyond
2019 Top Apps: Personal Finance, Banking, Hotels, and Lodging
Mobile and in-app engagement is increasingly important for brands. But which brands are actually doing this really well, personalizing their user’s experiences, and seeing tangible results in the process?
Others have attempted to draw your attention to the best apps out there, including PCMag (helpfully broken down by category) and Digital Trends (helpfully broken down by device, i.e. Android vs. iOS, and by month).
We’ve ranked the top 40 apps using our “Stages of Personalization” model, which takes into account:
Audience
Message
Timing
We’ve previously outlined how you can move from Stage 1, which requires you to move away from broadcast messaging, to Stage 2.
When we analyze these apps, it’s through a prism of how effectively they personalize their in-app and overall mobile experience for users. We will often link to their app review page when linking the name of the brand, in case you want to read a slate of reviews before adding it to your phone.
We know looking at all 40 apps and the background/reasoning at once might have been a lot, so we broke this into two posts of 20 apps each. The first 20 apps come from personal finance/banking and hotels/lodging. The next 20 apps (later this week) will be from retail, quick-service food (think Chipotle), entertainment, and wireless/telecommunications.
Finance, Banking, Personal Finance Apps
Coinbase: This is basically the app for people who have heard of cryptocurrency, aren’t exactly sure what it is, but know they might need to understand it as they get further into adulthood. You can view prices and charts for different cryptocurrencies and broadly understand the market. The push notifications are targeted to news and information about what’s happening and what you should consider doing with your own money.
Bank of America: Mobile banking is increasingly a huge area for long-standing financial institutions, and following a few app redesigns, BofA has seen huge jumps in mobile users. (Their AI application, Erica, already has 6M+ users.) During the fourth quarter of 2018, mobile users logged in 1.5 billion times. The 2017 redesign was largely focused on personalization, allowing users to choose the up-front functionalities and notifications most important to their stage in life.
Mint: Long a favorite of the millennial world, Mint is also inherently personalized: You’re looking at your budget, your spending categories, your areas of potential improvement, your credit score, and your bills. Every message you’re receiving from them is about ways to spend money better. In most of America, real wages haven’t budged since 1978, but the cost of goods and services continues to increase. You need to stay on top of your money in order to make those bigger life purchases and decisions, and Mint is a beautifully-personalized way to do that.
Scotiabank: They actually partnered with Sensibill a few years ago to personalize the financial experience, including having access to customer purchase history at the item level. That allowed them to offer targeted banking products based on what their customers were already spending money on. Easiest example: lots of money on things you need before a baby? They will personalize triggers to College Savings Plans. They know their audience, the right message, and the right timing. That’s how the formula works best.
Qapital: This app is about bringing gamification to budgeting. Members can invest with personalized portfolios, and set rules and triggers for their spending. We all sometimes need that Thursday happy hour because it’s Baby Friday, but can we actually afford it? This is a way to know -- and you’re quickly able to learn in-app via your existing information and rules.
Capital One: They have cool ads with mascots, for sure. And they sponsor a lot of college football games. But how’s the app itself? It’s actually great. It’s been No. 1 in customer satisfaction among mobile banking apps for the last two years via JD Power and Associates, and here’s a cool little feature: You can customize credit cards with family pictures and the like right in-app. Capital One also sends personalized notifications for way more than fraud; for example, they’ll tell you when your Netflix bill goes up.
Venmo: Not much we need to say here. Popular app and it brings the “friend graph” (people you actually know in your day-to-day life) into your finances, complete with emojis and the like. Sharing money feels easy and it feels like a big game, which is an awesome user experience and highly-personalized psychologically because the “What’s it for?” (the expense) is often based on jokes between you and your friends, strengthening those bonds and making you feel good about the app.
Wells Fargo: Wells Fargo also added AI recently (early 2018), which provides increasingly personalized banking insights. They’ll send you push messages about recurring bills that have changed, and also (somewhat shamefully) encourage transferring money from savings to checking to avoid potential overdrafts. The app also offers personalized financial guidance in-app.
HSBC: HSBC has also tested smartwatches as a way to cut customer wait times in-branch, and their Connected Money app, which is currently available mostly in the UK, has gained 300,000 new users since the start of 2019. Connected Money is a money management app that simplifies expense tracking and personalizes reports and suggestions for users.
Chase Mobile: In 2018, Chase’s CMO actually said “you can kiss traditional marketing goodbye,” noting that digital and mobile channels brought about a hyper-targeting, hyper-personalized rise. Chase has been near the forefront of mobile financial personalization for 3-4 years, including push notifications for any number of situations ranging from overdraft to bill changes to bill pay prompts to easier sharing of money with friends.
Hotels and Lodging
AirBNB: This is the granddaddy of mobile market-disruption apps. In fact, we just wrote about this recently: “By March 2019, consumers spent more on AirBNB lodging than on Hilton, meaning AirBNB -- founded in August 2008 -- owns about 20% of the world's lodging market in 11 years.”
Caesars Rewards: This is a great app for exclusive mobile offers at their resorts in Vegas, Atlantic City, New Orleans, and worldwide. The essence of modern marketing is “turn data into loyalty,” and that’s what Caesars aims to do with this app. If they see a specific customer always playing certain tables, or visiting certain in-casino bars, or requesting similar rooms, they can customize their next experience and even trigger the purchase of the experience with mobile offers. In-app, you can keep track of your points, rewards, and offers. Caesars themselves have noted that “tens of millions of people” constitute the middle of the gaming and lodging market -- so not the high-rollers -- and those people need to be taken care of well and keep coming back. The app and its personalization is a core tenet of that strategy.
Kayak: We put this under lodging because while many consider it a flight-purchase app primarily, it has great hotel selection as well. They even partnered with Amazon Echo for voice-powered hotel booking! Talk about making what can be a painful process into one that’s fun for the family.
Hyatt Hotels: This one has all the normal features you’d expect from a hotel chain around booking, viewing bills, upgrading, etc. But you can also request items to your room, book Uber to/from hotel, and communicate with on-site staff via Facebook Messenger and Twitter in-app. As they get increasing data on your travel preferences, Hyatt subsequently will personalize offers around your next lodging needs.
Marriott: Little bit freaky, maybe, but Siri can unlock your door in this app. Marriott’s got an ambitious growth plan of 1,700 new hotels in the next three years, and to consistently fill those hotels, they need uber-personalized experiences for their guests. They realize that power comes from mobile. And in fact, in mid-2018, one publication noted they were “changing the hotel game with personalization,” noting:
“When someone checks into one of our hotels, we want to be able to wow them,” said Linnartz. “We want to be able to say, ‘Hello, Henry — welcome! We know you just flew in from Dubai, where you stayed at our property last night. We’ve put a lovely amenity in your room to help you get over your jet lag. We also know you love running, so we’ve mapped out a great route that you might want to explore.’ Our digital platform is the tool that allows us to have all that information in the hands of our front-desk staff, so we can pull off that kind of experience.”
Hilton Worldwide: The Hilton Honors app is at the center of Hilton’s new focus on “connected rooms,” which can bring in personalized streaming (think Netflix), room opening in-app, and even preferred temperature setting. As we’ve said before, intelligent digital and mobile engagement begins with four pillars, and strategy and data are the first two legs. Hilton is leveraging both to create personalized, “I-want-to-come-back” experiences for their guests.
Choice Hotels: We worked on a mobile marketing playbook with them a bit ago, so check that out.
MGM Resorts: Like competitors, they’ve launched mobile check-in at 13 Las Vegas locations, and The Drum recently described them as going “Vegas-style big on personalization,” including boosting the M-Life Rewards program to include more customer data. That data becomes the backbone of offers, entertainment, room options, transportation, and more that guests can be pinged about in-app or via push.
LateRooms: This is a way to find hotel deals, with exclusive mobile app rates, for one. Many of their users inherently think this will be “one and done,” i.e. “Oh, I need a quick hotel room in Boston,” but LateRooms engages and personalizes based on the limited data from that initial hotel selection to keep the user returning for additional hotel needs.
Expedia: Expedia is an apex predator in the bundled-travel world, and for much of 2019, they’ve been discussing the importance of personalization for hotels. Voice bookings, cancellations, and rewards are now tied to Google Assistant and in-app, and their Scratchpad concept allows users to build a perfect trip (across all Expedia’s offerings) without having to jump between 40+ travel sites, as can be common. Everything you’ve saved and researched will be there when you return. All your personal info is there, and Expedia can make recommendations based on what you’re thinking so far.
Stay tuned for the next round up of great apps! We’ll cover quick service restaurants, retail, wireless companies, and entertainment.
2019 Top Apps: Personal Finance, Banking, Hotels, and Lodging published first on https://spyadvice.tumblr.com/
0 notes
Text
Hybrid Cloud, IoT, Blockchain, AI/ML, Containers, and DevOps… Oh My!
When it rains it pours. With enterprise IT innovation, it is common for multiple game-changing technologies to hit the street simultaneously. Yet if ever the analogy of painting the car while it's traveling down the highway were apt, it's now. Certainly, you can take a wait-and-see approach to adoption, but given the association of these innovations with greater business agility, you'd run the risk of falling behind your competitors.
Let’s take a look at what each of these innovations means for the enterprise and its impact on the business.
First, let’s explore the synergies among some of these innovations. Certainly, each innovation has value by itself; grouped, however, they can provide powerful solutions that help drive growth and new business models.
Hybrid Cloud + IoT + AI/ML. IoT produces a lot of exhaust (data) that results in two primary outcomes: a) immediate analysis resulting in a directive to the IoT endpoint (the basis for many smartX initiatives) or b) collect-and-analyze, looking for patterns. Either way, the public cloud is going to offer the most economical solution for IoT services, data storage, and the compute and services supporting machine learning algorithms.
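As an illustration of those two outcomes, here is a minimal routing sketch in Python; the device names, the temperature threshold, and the "throttle" command are hypothetical, and a real pipeline would hand the buffered readings to a cloud storage and analytics service.

```python
import json
from statistics import mean

# Hypothetical business rule: throttle any device running too hot.
TEMP_LIMIT_C = 85.0
batch_buffer = []  # readings held for later pattern analysis

def handle_reading(reading: dict) -> str:
    """Route one IoT reading: immediate directive or collect for analysis."""
    if reading["temp_c"] > TEMP_LIMIT_C:
        # Outcome (a): immediate analysis -> directive back to the endpoint
        return json.dumps({"device": reading["device"], "cmd": "throttle"})
    # Outcome (b): collect and analyze later, looking for patterns
    batch_buffer.append(reading)
    return "buffered"

handle_reading({"device": "pump-7", "temp_c": 61.2})
handle_reading({"device": "pump-7", "temp_c": 63.8})
directive = handle_reading({"device": "pump-7", "temp_c": 90.4})
print(directive)  # directive sent back to the device
print(mean(r["temp_c"] for r in batch_buffer))  # 62.5, fodder for pattern analysis
```

The interesting design question is where this function runs: at the edge for the low-latency directive path, or in the cloud where the batch analysis is cheapest.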
IoT + Blockchain. Blockchains provide immutable entries stored in a distributed ledger. When combined with machine-driven entries, for example from an IoT sensor, we have non-refutable evidence. This is great for tracing chain of custody, not just in law enforcement but also for perishables such as meat and produce.
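A toy sketch of the underlying idea, hash-chaining machine-generated entries so that any later tampering is detectable; this is a single local chain, not a real distributed ledger with consensus, and the field names are illustrative.

```python
import hashlib
import json

def append_entry(chain: list, payload: dict) -> None:
    """Append a sensor reading, chaining each entry to the previous hash."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    body = json.dumps(payload, sort_keys=True)
    digest = hashlib.sha256((prev + body).encode()).hexdigest()
    chain.append({"payload": payload, "prev": prev, "hash": digest})

def verify(chain: list) -> bool:
    """Recompute every hash; any tampered entry breaks the chain."""
    prev = "0" * 64
    for entry in chain:
        body = json.dumps(entry["payload"], sort_keys=True)
        expected = hashlib.sha256((prev + body).encode()).hexdigest()
        if entry["prev"] != prev or entry["hash"] != expected:
            return False
        prev = entry["hash"]
    return True

chain = []
append_entry(chain, {"sensor": "truck-4-temp", "c": 3.1})
append_entry(chain, {"sensor": "truck-4-temp", "c": 3.4})
print(verify(chain))             # True
chain[0]["payload"]["c"] = 9.9   # tamper with the cold-chain record
print(verify(chain))             # False: the chain exposes the edit
```

Distribution and consensus are what turn this tamper-evidence into the non-refutable evidence the article describes; the hashing alone only guarantees that changes are visible.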
Containers, DevOps and agile software development. These form the basis for delivering solutions like those above quickly and economically, allowing their value to be realized rapidly by the business.
There are businesses already using these technologies to deliver new and innovative solutions, many of which have been promoted in the press and at conferences. While these stories illustrate strong forward momentum, they also tend to foster a belief that these innovations have reached a level of maturity at which availability is no longer a concern. This is far from the case. Indeed, these innovations are far from mainstream.
Let’s explore what adoption means to IT and the business for these various innovations.
Hybrid Cloud
I specifically chose hybrid cloud versus public cloud because it represents an even greater amount of complexity to enterprise IT than public cloud alone. It requires collaboration and integration between organizations and departments that have a common goal but very different approaches to achieving success.
First, cloud is about managing and delivering software services, whereas the data center is charged with delivering both infrastructure and software services. However, the complexity and overhead of managing and delivering reliable and available infrastructure overshadows the complexity of software services, resulting in the latter often receiving far less attention in most self-managed environments. When the complexity surrounding delivery of infrastructure is removed, the operations team can focus solely on delivery and consumption of software services.
Security is always an issue, but the maturation process surrounding delivery of cloud services by the top cloud service providers means that it is a constantly changing environment. With security in the cloud, there is no room for error, or the applications could be compromised. This, in turn, requires that after each update to the security controls around a service, the cloud team (architects, developers, operations, etc.) must educate itself on the implications of the change and then assess how that change may affect production environments. Misunderstand an update, and the environment could become vulnerable.
Hybrid cloud also often means that the team must retain traditional data center skills while adding skills related to the cloud service provider(s) of choice. This is an often overlooked aspect of assessing cloud costs. Moreover, highly skilled cloud personnel are still difficult to attract and usually demand above-market salaries. You could (and should) upskill your own staff, but you will want a few experts on the team to provide on-the-job training for public cloud, as an unsecured public cloud can put the business in a compromising position.
Internet-of-Things (IoT)
The issue with IoT is that it is not one single thing, but a complex network of physical and mechanical components. In a world that has been moving toward ever-greater virtualization, IoT represents a marked shift back toward data center skills, with an emphasis on device configuration, disconnected states, limits on the size of data packets exchanged, and low-memory code footprints. Anyone who was around during the early days of networking DOS PCs will relate to some of the constraints.
As with all things digital, security is a highly complex topic with regard to IoT. There are many layers within an IoT solution that invite compromise: the sensor, the network, the edge, the data endpoint, etc. Because many of the devices participating in an IoT network are resource constrained, there is only so much security overhead that can be introduced before it impairs the device's purpose.
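One common lightweight compromise for constrained devices is a keyed, truncated MAC over each small payload rather than a full TLS stack. The sketch below assumes a hypothetical pre-shared per-device key and payload format; a real deployment would also need key provisioning and replay protection.

```python
import hashlib
import hmac

DEVICE_KEY = b"provisioned-per-device-secret"  # hypothetical pre-shared key

def sign(payload: bytes) -> bytes:
    # Truncating the HMAC to 8 bytes keeps per-packet overhead small
    # on constrained links, at the cost of a weaker (but still useful) tag.
    return hmac.new(DEVICE_KEY, payload, hashlib.sha256).digest()[:8]

def accept(payload: bytes, tag: bytes) -> bool:
    # compare_digest avoids leaking the tag via timing differences
    return hmac.compare_digest(sign(payload), tag)

msg = b"pump-7:temp=61.2"
tag = sign(msg)
print(accept(msg, tag))                  # True
print(accept(b"pump-7:temp=99.9", tag))  # False: payload was altered
```

This is exactly the trade-off the paragraph describes: eight extra bytes per packet buys message authenticity, while anything heavier may not fit the device's budget.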
For many, however, IoT immediately brings to mind only the analytical possibilities of all the data collected from myriad devices. To be sure, analyzing the data obtained from the sensor mesh and edge devices can yield an understanding of how things work that was extremely difficult to achieve with coarse-grained telemetry. Consider a manufacturing machine that used to signal trouble with a low hum: sensors now reveal that, in tandem with the hum, there is also a rise in temperature and an increase in vibration. After a few short months of collecting data, there is no need to even wait for the hum; the data will indicate the beginning of a problem.
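A toy sketch of that predictive-maintenance idea: learn a normal operating baseline from months of healthy readings, then flag readings where temperature and vibration drift high together. The data, the three-sigma threshold, and the two-signal rule are invented for illustration.

```python
from statistics import mean, stdev

def fit_baseline(history):
    """Learn normal operating ranges from healthy (temp, vibration) readings."""
    temps = [r[0] for r in history]
    vibs = [r[1] for r in history]
    return (mean(temps), stdev(temps)), (mean(vibs), stdev(vibs))

def failing(reading, baseline, k=3.0):
    """Flag a reading when temperature AND vibration both drift k sigmas high."""
    (tm, ts), (vm, vs) = baseline
    t, v = reading
    return t > tm + k * ts and v > vm + k * vs

# Synthetic "healthy" history standing in for months of sensor data
healthy = [(60.0 + 0.1 * (i % 5), 2.0 + 0.01 * (i % 3)) for i in range(100)]
base = fit_baseline(healthy)
print(failing((60.3, 2.01), base))  # False: within the normal range
print(failing((71.0, 3.50), base))  # True: rising temperature plus vibration
```

Requiring both signals to drift is what catches the failure before the audible hum: either signal alone stays inside normal variation longer than the pair does.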
Of course, the value discussed above can only be realized if you have the right skilled individuals across the entire information chain: people able to modify or configure endpoint devices to participate in an IoT scenario, cybersecurity and infosec experts to limit issues arising from breach or misuse, and data scientists capable of making sense of the volumes of data being collected. And if you have not selected the public cloud as the endpoint for your data, you also carry the additional overhead of managing network connectivity and storage capacity for rapidly growing volumes of data.
Artificial Intelligence and Machine Learning (AI/ML)
If you can harness the power of machine learning and AI, you gain insights into your business and industry that were very difficult to obtain until recently. While this seems a simple statement, that one word "harness" is loaded with complexity. First, these technologies are most successful when operating against massive quantities of data.
The more data you have, the more accurate the outcomes. This means it is incumbent upon the business to a) find, aggregate, cleanse, and store the data to support the effort, b) formulate a hypothesis, c) evaluate the output of multiple algorithms to determine which best supports the outcome you are seeking (e.g., prediction, trend detection), and d) create a model. This all adds up to a lot of legwork. Once your model is complete and your hypothesis proven, the machine will do most of the work from there on out, but getting there requires a great deal of human knowledge-engineering effort.
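Steps (c) and (d) can be sketched as a small evaluation loop that scores candidate algorithms on held-out data and keeps the best one. The two candidates and the data here are toy stand-ins; a real effort would compare production-grade algorithms on the cleansed data from step (a).

```python
from statistics import mean

def mean_model(xs, ys):
    """Baseline candidate: always predict the training mean."""
    m = mean(ys)
    return lambda x: m

def linear_model(xs, ys):
    """Candidate 2: ordinary least squares fit of y = a*x + b."""
    mx, my = mean(xs), mean(ys)
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    b = my - a * mx
    return lambda x: a * x + b

def mse(model, xs, ys):
    """Mean squared error of a fitted model on held-out data."""
    return mean((model(x) - y) ** 2 for x, y in zip(xs, ys))

# (a) cleansed data; (b) hypothesis: the metric grows linearly week over week
train_x, train_y = list(range(8)), [2.0 * i + 1.0 for i in range(8)]
test_x, test_y = [8, 9], [17.0, 19.0]

# (c) evaluate candidate algorithms on held-out data; (d) keep the best model
candidates = {"mean": mean_model, "linear": linear_model}
scores = {name: mse(fit(train_x, train_y), test_x, test_y)
          for name, fit in candidates.items()}
best = min(scores, key=scores.get)
print(best)  # "linear": it generalizes to the held-out weeks
```

The held-out split is the part people skip at their peril; scoring on training data alone would have made both candidates look plausible.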
A point of caution: do not make business decisions using the outcome of your AI/ML models unless you have followed every one of these steps and then qualified the model's output against the real world at least twice.
Blockchain
Touted as the technology that will "change the world," blockchain is, outside of cryptocurrencies, still trying to establish firm roots in the business world. There are many issues with blockchain adoption at the moment, the most prevalent being the velocity of change: there is no single standard blockchain technology.
There are multiple technologies, each attempting to provide the foundation for trusted and validated transactional exchange without requiring a centralized party. Buying into a particular technology at this point in the maturity curve will provide insight into the value of blockchain, but it will require constant care and feeding, as well as a potential migration to a completely different network foundation at some point in the future. Hence, don't bet the farm on the approach you choose today.
Additionally, there are still many outstanding non-technical issues on which blockchain's value depends, such as the legality of blockchain entries as a form of non-repudiation. That is, can a blockchain be used as evidence in a legal case to demonstrate intent and validate agreed-upon actions? There are also questions about what effect use of a blockchain may have on partnering contracts and credit agreements, especially for global companies with GDPR requirements.
Finally, is the network behind the blockchain large enough to enforce consensus? Who should host the nodes? Are the public networks sufficient for business, or is there a need for a private network shared among a community with common needs?
Containers, DevOps, & Agile SDLC
I’ve lumped these three innovations together because, unlike the others, they are more technological in nature and concern the "how" more than the "what." Still, a significant amount of attention is being paid to these three topics far outside the IT organization because of their association with enabling businesses to become more agile. To that end, I add my general disclaimer and word of caution: the technology is only an enabler; what you do with it may be valuable or may have the opposite effect.
Containers should be the least impactful of these three topics, as they are simply another way to use compute resources. Containers are smaller and more lightweight than virtual machines but still provide a level of isolation between what runs inside the container and what runs outside it. The complexity arises when moving processes from bare metal and virtual machines into containers, because containers use machine resources differently than those platforms do.
While it’s fairly simple to create a container, getting a group of containers to work together reliably can be fraught with challenges, which is why container management systems have grown more and more complex over time. With the addition of Kubernetes, a business effectively needs the knowledge of an entire data center operations staff in a single team. Public cloud service providers now offer managed container management systems that reduce the breadth of knowledge required, but it is still incumbent on operations to know how to configure and organize containers for performance and security.
DevOps and an Agile Software Development Lifecycle (SDLC) force internal engineering teams to think and act differently if they are transitioning from traditional waterfall development practices. Many businesses have taken the first step of this transition by adopting some Agile SDLC practices. However, because of the retraining, hiring, and support the effort requires, the interim state many of these businesses occupy has been dubbed "wagile," meaning some combination of waterfall and agile.
As for DevOps, the metrics on the business value of becoming a high-performing software delivery and operations organization have been published. In an age when "software is eating the world," can your organization afford to ignore DevOps, or, if not ignore it, take years to transition? You will hear stories from businesses that have adopted DevOps and Agile SDLC and made great strides: reduced latency, more releases in a given time period, and new capabilities deployed to production far faster with fewer change failures. Many of these stories are real, but even in these businesses you will still find pockets with no adoption at all, still following a waterfall SDLC that takes ten months to get a single release into production.
Conclusion
Individually, each of these innovations requires trained resources and funding, and each can be difficult to move beyond proof-of-concept to fully operationalized production outcomes. Taken in combination, on top of existing operational pressures, they can rapidly overwhelm even the most adept enterprise IT organization. Even where multi-modal IT exists and these innovations occur outside the path of traditional IT, existing IT knowledge and experience will be required for support. For example, if you want to analyze purchasing trends for the past five years, you will need the support of the teams responsible for your financial systems.
All this leads to the really big question: how should businesses go about absorbing these innovations? The pragmatic answer is, of course, to introduce those innovations tied to a specific business outcome. However, as stated, waiting to introduce some of them could mean losing ground to the competition. This means you may want to run some proof-of-concept projects, especially around AI/ML and Agile SDLC, with IoT and blockchain projects where they make sense for your business.
from Gigaom https://gigaom.com/2019/01/11/hybrid-cloud-iot-blockchain-ai-ml-containers-and-devops-oh-my/
0 notes
Text
Original post from SC Magazine
AI’s value on the endpoint still a work in progress, but it’s improving
AI is great for solving yesterday’s endpoint attacks, but the jury is still out on solving tomorrow’s. Esther Shein explains.
Today it is almost impossible to talk about cybersecurity without someone turning the discussion to artificial intelligence (AI). Sometimes it is appropriate, sometimes not. The trouble is, AI has become the go-to acronym for everything from threat intelligence to data protection to picking your next password, and when so many security pros bandy it about as the be-all and end-all of security, the waters get muddy and the truth becomes harder to see.
Ask Tufts Medical Center CISO Taylor Lehmann about his use of AI platforms to protect cloud-based systems and he will tell you he is both ahead of the curve and behind it compared to other hospitals.
“It’s sort of unavoidable right now — anyone looking to improve their security posture, which is everyone — is inundated with products and services selling AI solutions,’’ Lehmann notes. “You can’t buy anything today without AI embedded.” But, he adds, “Responsible security officials don’t buy products but form a strategy” first. For Lehmann, that means striking a balance between the need to keep costs low while implementing security and threat protection offerings “that don’t require us to hire a bunch of people to run.”
Tufts Medical Center, part of a seven-hospital consortium in eastern Massachusetts, has a solid security infrastructure and Lehmann’s team has visibility into what is running on the network, he says. Right now, Tufts is “investing heavily in building an insights-out capability for security. Where we’re behind is in getting a better hold on third parties we share information with.”
The challenge, Lehmann says, has been identifying insights from within the data: Where is it going, to whom, the volume and the role of vendors in the care delivery process as it moves off the network. With an increasing amount of data being moved to the cloud and third-party providers, can AI help secure endpoints? Although the medical system is only in the early stages of using AI in the cloud, so far, he says, the answer is yes.
“We see the value in investing in AI, and we think there’s more opportunities for us to increase our use of AI that will make our lives easier and reduce the costs of the medical system and improve the security of our medical system,” he says. When your endpoints extend beyond the network and into the cloud, however, the obligation for securing data and applications becomes a shared responsibility, Lehmann stresses.
“When you put data in the cloud you’re sharing responsibility with someone else to protect it,” he says. “Where it’s our role, we’re using network-based and endpoint-based AI to do that. It’s important that our vendors do the same.”
AI on the endpoints today
Many others are also banking on AI to secure endpoints. The cloud endpoint protection market size was $910 million in 2017, and is projected to exceed $1.8 billion by 2023, at a compound annual growth rate of 12.4 percent, according to Markets and Markets Research. “The growing need for effective protection against cyberattacks on endpoints is expected to drive the market,” the firm notes.
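As a quick sanity check on the figures quoted above (a rough sketch, treating 2017 to 2023 as six compounding years and assuming the $910 million base and 12.4 percent CAGR from the Markets and Markets projection):

```python
# Sanity check on the market projection: $910M in 2017 growing at a
# 12.4% compound annual growth rate over six years (2017 -> 2023).
base_millions = 910
cagr = 0.124
years = 2023 - 2017

projected = base_millions * (1 + cagr) ** years
print(round(projected))  # roughly 1835, i.e. about $1.8 billion
```

The arithmetic lines up with the "exceed $1.8 billion by 2023" figure in the report.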
Antivirus and malware detection technologies remain a moving target, and the volume of new malware and attack techniques continues to grow. Couple that with the increasing volume of data being moved to endpoints like the cloud, “and it’s clear that scaling these products to deal with such speed and volume requires a heavy investment in AI-like capabilities,” notes the Gartner report Lift the Veil on AI’s Never-Ending Promises of a Better Tomorrow for Endpoint Protection.
Nearly every day there are eye-catching headlines about how AI will transform everything from data management and backups to customer service and marketing, not to mention every single vertical industry. Heck, it even promises to change the economy — and deliver a better cup of coffee.
But in the rush to use AI components for endpoint protection, it is important to look beyond the hype, security experts insist.
Almost all endpoint protection platforms today use some data analysis techniques (such as machine learning, neural networks, deep learning, Naive Bayes Classifiers or natural language processing), the Gartner report states. They are easy to use and “require little to no understanding of or interaction with their AI components … However, it is critical that SRM (security and risk management) leaders avoid dwelling on specific AI marketing terms and remember that results are what counts.”
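To make one of those listed techniques concrete, here is a minimal Bernoulli Naive Bayes sketch that classifies endpoint events as benign or malicious from binary features. This is an illustration of the general technique only, not any vendor's implementation; the feature names (`night_login`, `new_process`, `large_upload`) and the tiny training set are invented for the example.

```python
import math
from collections import defaultdict

# Toy Bernoulli Naive Bayes for labeling endpoint events.
# Feature names and training data below are illustrative assumptions.
def train(events, labels):
    counts = defaultdict(lambda: defaultdict(int))  # per-label feature counts
    totals = defaultdict(int)                       # per-label event counts
    for feats, label in zip(events, labels):
        totals[label] += 1
        for f in feats:
            counts[label][f] += 1
    return counts, totals

def score(feats, label, counts, totals, vocab):
    # Log prior plus Laplace-smoothed log likelihood for each feature.
    logp = math.log(totals[label] / sum(totals.values()))
    for f in vocab:
        p = (counts[label][f] + 1) / (totals[label] + 2)
        logp += math.log(p if f in feats else 1 - p)
    return logp

def classify(feats, counts, totals, vocab):
    return max(totals, key=lambda lbl: score(feats, lbl, counts, totals, vocab))

vocab = {"night_login", "new_process", "large_upload"}
events = [{"night_login", "large_upload"}, {"new_process"}, set(), {"night_login"}]
labels = ["malicious", "benign", "benign", "benign"]
counts, totals = train(events, labels)
print(classify({"night_login", "large_upload"}, counts, totals, vocab))  # malicious
```

Real products train on vastly larger feature sets, but the "easy to use, little interaction with the AI components" point holds even here: once trained, classification is a single function call.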
The Forrester report Mobile Vision 2020 is projecting that many organizations will be using AI and cognitive computing to generate business and security insights from unified endpoint data by 2020.
Forty-six percent of respondents to a 2017 survey said they anticipate the amount of endpoint data they collect will increase between 1 percent and 49 percent over the next three years, while 50 percent are bracing themselves for growth of 50 percent or more, according to the Forrester study.
“Organizations can gain significant intelligence from endpoint data, particularly for threat detection and remediation purposes,” the report says.
Security experts and enterprises that have started utilizing AI systems to protect data and apps in the cloud say that the technology certainly has merit but is not yet the panacea for defending endpoints.
“I think the hype is very, very dangerous and … I’m really worried, and don’t believe the hype will live up to everything it promises, but [AI is] very good for certain things,” observes Johan Gerber, executive vice president of the Security and Decision Products for Enterprise Security Solutions team at Mastercard. Gerber is based in St. Louis.
The credit card company acquired an AI software platform in 2017 to help it expand its ability to detect and prevent fraud and monitor the network, to enhance the security of customer information, Gerber says.
Since then, “we’ve been able to increase our fraud detection by 50 percent and decrease our false positives by 40 percent, so the application of advanced AI has really helped us in this use case.”
Gerber says he is “very excited about the potential of AI, and we’re using it every day and, in my world, it’s living up to promise and doing a tremendous amount for us.”
Mastercard is building models using a combination of neural networks and decision trees, as well as some AI open libraries. But Gerber says the “hybrid approach” is best when it comes to securing endpoints.
“I don’t believe in silver bullets; you need to have a multilayered approach … and we have an interesting mix of true machine learning, supervised and unsupervised learning, to help us know when it’s an attack we’ve seen before and an attack we haven’t seen before,’’ he says. “You need to look at the specific problem you’re going to solve and figure out whether AI will get there. The notion it will solve everything is dangerous.”
For AI and machine learning to be effective at securing endpoints, you have to have the right data and the right model, he says. “Machine learning learns from previously known patterns so [there is a] risk of it not being able to find anything it hasn’t seen yet. You teach the model and then say, ‘Figure it out using algorithms.’ I will not trust AI around securing data in the cloud; I will rely on a layered approach.”
That sentiment is shared by Zachary Chase Lipton, an assistant professor of business technologies at Carnegie Mellon University, who says a lot of people discuss AI without knowing what they are actually talking about. “The term is being used like an intellectual wild card,’’ he says.
People get excited about using machine learning algorithms to recognize suspicious traffic patterns that are predictive of previous security incidents, Chase Lipton says. The model has potential, he adds. But the catch with using pattern recognition is that “you make a giant assumption.”
When people make what Chase Lipton calls an “inductive assumption,” utilizing different types of data to say, “This is unkosher traffic on your network,” there is a chance they might not have all the information they need, or even the right information, he notes.
While machine learning might predict a pattern in one instance accurately, “that machine learning model could break” in another, he continues.
“With security, you’re dealing defensively with an adversary who’s actively trying to circumvent the system,’’ he says, when you rely on machine learning to do pattern recognition to try and protect a system. “People writing malware have a strong incentive to change what they’re doing and screw with things to fool the machine learning system.”
In that case, you can no longer say a system is 99 percent accurate; it is 99 percent accurate on what was in the past; it is not guaranteed to be correct in the future, he says.
Taking that into account, Chase Lipton thinks there will be “incremental usefulness” of AI systems to secure endpoints. “But what people have to watch out for is a machine learning system can potentially be gamed.
“Obviously, it’s very exciting technology and the capabilities are pretty amazing; the fact that we can [do] high-quality translations between languages and recognize images and generate believable audio and video using generative models,’’ are great use cases of machine learning, he says. “But the problem is, people use general excitement about AI and machine learning to make untethered kinds of [statements] like ‘It’s going to solve security. You don’t have to worry when you use our product.’ That kind of stuff is hooey. But there’s danger of people buying into that because of general excitement about AI,” he says.
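Chase Lipton's warning about gamed models can be shown with a deliberately simple sketch. Here a detector learns a cutoff from last year's high-entropy packed binaries; an adversary who pads payloads to lower their entropy walks straight past it. All the numbers are invented for illustration.

```python
# Illustrative concept drift: a rule learned from past attacks (flag any
# event whose payload entropy exceeds a learned cutoff) stops working
# when attackers change behavior. All figures below are made up.
past_attacks = [7.2, 7.5, 7.9, 7.4]      # high-entropy packed binaries
past_benign  = [4.1, 3.8, 5.0, 4.6]
cutoff = (min(past_attacks) + max(past_benign)) / 2  # simple learned boundary

def flag(entropy):
    return entropy > cutoff

# On historical data the rule looks perfect...
history_acc = sum(flag(x) for x in past_attacks) + sum(not flag(x) for x in past_benign)
print(history_acc / 8)  # 1.0

# ...but an adversary who pads payloads to lower their entropy evades it.
new_attacks = [5.1, 5.4, 5.6, 5.3]
detected = sum(flag(x) for x in new_attacks)
print(detected / 4)  # 0.0
```

This is exactly the "99 percent accurate on the past, not guaranteed in the future" failure mode: the model is right about the distribution it saw, and the adversary simply moves the distribution.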
AI is being used today to prevent spam and spear phishing attacks and many people are hoping that use of these platforms will mature rapidly, says Paul Hill, a security consultant at SystemExperts Corp. of Sudbury, Mass. Echoing Chase Lipton, he says “this approach is just as likely to make the attackers step up their game. I worry that the result will be that attackers will develop tools that will make spam that is stylistically identical to the author that they are attempting to impersonate.”
In all cybersecurity AI tools, the learning algorithms need to be more transparent, Hill believes. To fully gain a customer’s trust, “it should be possible for independent third parties to examine the learning model and data. Furthermore, a lot more work needs to be done to understand how an adversary might affect the learning model.”
By manipulating the learning model and/or the data used to teach it, it may be possible to subvert the AI tools, he says. “Before AI cybersecurity tools enjoy widespread adoption, these issues and how they will impact various customer deployments need to be better understood.”
AI in action
Tufts Medical Center is moving an increasing amount of data into the cloud. One of its electronic medical records systems is almost entirely cloud-based and IT is planning to move other clinical systems off premises, says Lehmann.
As the center expands its investigation of using AI to protect endpoints, officials are looking at whether their third-party vendors have appropriate protections in place in their data centers to leverage modern security technologies, he says. Their service level agreements will incorporate language indicating a “high expectation for their security program and mandating they implement certain controls like behavior and deterministic software solutions that protect data well.”
The medical center is also utilizing machine learning to monitor network traffic flowing off premises and protect its connection to the cloud, he says.
“For example,” he continues, “we often see certain spikes in traffic that could indicate an anomaly and … where the promise of AI is, is when we can turn AI on to correct a behavior. We’re getting to this point; not there yet.”
The goal is when there’s a “high fidelity hit on something we think looks bad, telling the AI [platform] to turn it off,” Lehmann says, explaining the medical center is looking at doing this to learn more about what could be threatening.
“Our next step will be to use that same AI to take action about a knowing threatening thing we’ve discovered,” he says. “That’s the nirvana; that’s where the value of AI exponentially increases. Now I don’t have to send a team to investigate that anomalous thing. The system knows what to do immediately if that occurs.”
The bleeding edge
The goal for Lehmann is to be able to walk into any surgical unit at the medical center and know a doctor has “relative assurance” that the equipment, services and procedures will be safe.
“That’s ultimately what we’re trying to do with any spend,” he says. As AI and machine learning technologies mature, he believes IT will be better able to secure endpoints in ways they were previously unable to do — or could only do if they “deployed a team of 50 people to figure it out.”
But when it comes to patient safety, Lehmann is leerier about using AI to secure data being exchanged between their internal systems and systems in the cloud. Although AI holds real value, “Can we say, ‘Is that wireless infusion pump operating normally and delivering drugs at the right frequency and what it has been programmed to deliver?’” Lehmann’s not sure. It becomes a lot trickier for a hospital if an infusion pump gets compromised and starts sending too high a dosage of medicine, he observes.
“These are patients’ lives we’re dealing with and I’m not sure we’re at the point where we can trust AI for [patient care,]” he opines.
For years, people have been recommending that organizations understand their baseline level of network activity in order to deploy a security information and event management system [SIEM] and create useful alerts, notes Hill. “However, many organizations don’t have the resources to really understand what their correct baseline traffic should be. AI should help solve this problem.”
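One hedged sketch of what "understanding your baseline" can mean in practice: learn the normal distribution of a metric, then flag large deviations. The metric (outbound megabytes per hour) and the data are invented; real SIEM baselining covers many metrics per host and user.

```python
import statistics

# Sketch: learn a baseline of outbound bytes per hour, then flag hours
# that deviate by more than 3 standard deviations. Data is made up.
baseline = [120, 135, 110, 128, 140, 125, 132, 118]  # MB/hour, normal weeks
mean = statistics.mean(baseline)
stdev = statistics.stdev(baseline)

def is_anomalous(mb_per_hour, z_threshold=3.0):
    # A simple z-score test against the learned baseline.
    return abs(mb_per_hour - mean) / stdev > z_threshold

print(is_anomalous(130))   # False: within normal variation
print(is_anomalous(900))   # True: possible exfiltration spike
```

The hard part Hill describes is not this test but collecting enough clean history to make the baseline trustworthy, which is where machine learning assistance is expected to help.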
Machine learning has already made available technologies we did not have even five years ago, Chase Lipton notes. “But the kinds of promises being made and way [the technology is] being thrown out vaguely like, ‘We can solve security with AI,’ is a little bit unhinged.”
There are a lot of small victories “probably happening every day,’’ he says. It is easy to train a machine learning system based on data from last year and have it work, “but the problem is, how do you keep it working accurately and develop best practices for auditing it? Those are huge challenges.”
That, for Chase Lipton, would make AI systems more palatable. “I’m sure progress will be slow and steady, but I don’t think it’s an overnight silver bullet that AI will solve in security.”
As endpoint protection evolves, it will need to use data from across multiple endpoints to help AI recognize and react to threats, the Gartner report states. To cull all this data, endpoint detection and response (EDR) offerings are starting to emerge. These systems record all the technical and operational data of an organization’s endpoints as well as event, application state and network information.
This gives security response management teams a large pool of data that they can use to search for known indicators of compromise (IoC) or indicators of attack, Gartner says. Already, machine learning is a data analytics technique being used successfully “in areas where lifting signals from noise and removing false positives are problems,” the report says. “A well-trained [machine learning] algorithm can help identify IoCs in large, complex datasets that humans might miss.”
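The IoC-searching step described above amounts to intersecting recorded endpoint telemetry with a feed of known-bad indicators. The sketch below illustrates that step only; the hashes, hostnames and domains are fabricated placeholders, and real EDR pipelines match far richer indicator types.

```python
# Sketch of the IoC-matching step an EDR pipeline performs: intersect
# recorded endpoint telemetry with a feed of known-bad indicators.
# Hashes and domains below are fabricated placeholders.
ioc_feed = {"deadbeef01", "feedface02", "evil.example.com"}

endpoint_events = [
    {"host": "ws-17", "sha256": "deadbeef01", "domain": "cdn.example.org"},
    {"host": "ws-17", "sha256": "0a1b2c3d04", "domain": "evil.example.com"},
    {"host": "db-02", "sha256": "cafebabe05", "domain": "internal.local"},
]

def match_iocs(events, feed):
    hits = []
    for ev in events:
        matched = {v for v in ev.values() if v in feed}
        if matched:
            hits.append((ev["host"], sorted(matched)))
    return hits

print(match_iocs(endpoint_events, ioc_feed))
# [('ws-17', ['deadbeef01']), ('ws-17', ['evil.example.com'])]
```

The machine learning value-add Gartner describes comes after this step: ranking and de-duplicating hits so analysts are not overwhelmed by the raw match volume.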
Along these same lines, the Gartner report says user and entity behavior analytics (UEBA) techniques can identify behaviors that applications display that are anomalous to standard baselines.
Yet, the technology is not there yet. “Unfortunately, AI is only beginning to make progress in [endpoint detection response.] However, it seems to be following the same pattern we have seen other technologies (such as SIEM management and network analytics) follow,’’ the report states.
“The technology comes on the market quickly but generates amounts of data that quickly overwhelm human users and contain false positives that limit its attractiveness. AI and advanced analytics are applied, and the tools become easier to use and yield more valuable insights,” the Gartner report says.
The bleeding edge will likely be the day when security administrators can quickly query their environments and take coordinated action across their endpoint environment in a unified manner, maintains Forrester, saying, “Furthermore, new analysis capabilities will present opportunities for endpoint security and management teams to pull deeper and more meaningful business insights from their increasing amounts of endpoint data while lowering operational friction and TCO (total cost of ownership).”
The post With AI, promises still outpace reality appeared first on SC Media.
Go to Source. Author: stephenlawton
0 notes
Text
Three For One Hosting Review Should I Get It
Three For One Hosting Review - Are you searching for more information about Three For One Hosting? Please read my honest review before deciding, to assess its strengths and weaknesses. Is it worth your time, effort, and money?
The best web hosting providers for 2018 (Part 3)
iPage
Possibly the first thing most people will notice about iPage is the incredibly low price for shared hosting service. At less than two dollars a month for the 36-month plan, you can buy three full years of hosting for under $75, a jaw-dropping deal no matter how you look at it.
Yes, that price will increase after your 36 months are over, but can you blame them? The company does not claim to offer unlimited bandwidth, but low-end hosting plans generally do not need a tremendous amount of traffic capacity. If you do sustain a large surge, get in touch with the company and they'll work with you.
The company currently includes a number of freebies, even in its lowest-priced plans. It offers $100 in ad credits for both Google and Bing, together with complimentary SiteLock web security. We like how iPage offers 24-7 telephone support and a 30-day guarantee on top of its incredibly low price. If you're on a budget and want to try out web hosting, we haven't found a better starting price than what iPage is offering.
SiteGround
SiteGround sits in the middle ground between a consumer web hosting provider and those that cater to business services. If you've got a small business with more complex web needs than most, SiteGround is a good fit.
Although offerings start as low as $3.95 monthly, we especially like the company's GoGeek plan, which is chock full of useful features, including access to a staging server and one-click Git repo creation.
There's a lot to like about SiteGround, but the company did lose some points because of its policy of more than doubling your hosting prices after the first year. The company calls it a first-year discount, but that's disclosed only in very small, light grey print.
On the plus side, SiteGround provides free automatic daily backups, access to the Cloudflare CDN, high-performance SSDs for all plans, unlimited email accounts, and integration of a free SSL certificate into websites. The company does limit bandwidth and storage, but even providers that claim to offer so-called unlimited bandwidth and storage really have some limits in their terms of service.
Sadly, there is a bit of a "gotcha" to the complimentary automated backup service. If you're paying $3.95 a month (for the first year of hosting, then $9.95 a month), you do not get restores for free. Each restore, no matter how small or large, will cost you $19.95. I'm not sure how I feel about that. On the one hand, the company needs to pay salaries to tech support reps who can handle panicking customers. On the other hand, it seems kind of mean to hit someone with an added fee when they're down. That said, getting your data back, at any price, is invaluable.
SiteGround is very proactive about protecting its clients' security. It has a dedicated security team that writes necessary patches and web firewall rules that help mitigate zero-day vulnerabilities. It also uses an AI-based system to monitor and apply fixes to all its servers dynamically.
Finally, the company offers an entire tier of customized enterprise services. So, if you do GrowBig (as its mid-tier plan is named), you'll be able to stick with the company no matter how large you get.
Web Hosting Hub
Web Hosting Hub has an entry-level starting price, but some surprisingly valuable benefits for such a low-priced offering.
In particular, Web Hosting Hub uses BoldGrid as a website builder. BoldGrid is actually an add-on to WordPress, so there's no lock-in. This overcomes the main problem of most site builders: you're locked into that host and that tool, often requiring you to completely rebuild your site if you want to expand. By using a WordPress-based solution, all of the rather substantial power of WordPress is available for future expansion.
We liked how Web Hosting Hub describes its new customer process. It tells new customers, "We walk you through setting up your account in a personal on-boarding phone call."
The company has a few other wins as well. It offers an all-SSD infrastructure, automatic vulnerability patches and a custom firewall, SSH access for certain plans, free site migration, and an excellent 90-day money-back guarantee.
LunarPages
Established back in 1998, LunarPages runs three advanced data centers.
Equipped with multiple GigE fiber links to the internet backbone, the company built out seismically braced racks and cabinets, fully redundant Liebert HVAC cooling systems, a diesel generator that can run for weeks, and a pre-action dry-pipe fire suppression system.
All of this infrastructure has been built out to support a wide variety of hosting services. The family-owned company gets points for understanding the importance of hosting security by offering free AutoSSL and Let's Encrypt SSL certificates with its plans. While the company offers advanced services for technically strong customers, it also has a Weebly website builder option to get you up and running quickly.
Some plans offer SSD performance, and while there's no uptime monitoring offered, the company also earned a nod for offering both Linux and Windows plans.
For the technically inclined, SSH access is available for a $2/mo upcharge, as is a dedicated IP address. The company permits backup via its various control panels and allows customers to build scripts to automate the backup process. While the company does not automatically run malware scans, you can request one if desired.
Web Hosting Pad
Web Hosting Pad has a solid worldwide presence. The company has servers in the United States, Hong Kong, Mainland China, and Korea, and you can specify which server and location you want when you sign up.
In terms of what many providers call unlimited service, Web Hosting Pad's terms of service indicate that its definition of unlimited is what it calls "incremental." Essentially, as you need more capacity, it wants to discuss that with you, both to help you get the most out of its services and to make sure you're using its systems without abusing them.
The company's entry-point pricing is very low, and while this will buy you as much as three years of very inexpensive hosting, do know that its post-promotion price will increase considerably, putting its subsequent-year pricing more in line with the rest of its competitors. That said, we liked its 24-7 phone support, SSD support on some plans, and 30-day money-back guarantee.
BigCommerce
BigCommerce is a bit different from the other hosting picks here in that it's a SaaS (software-as-a-service) provider rather than an IaaS (infrastructure-as-a-service) provider. In other words, instead of renting space on a virtual machine where you set up and configure your own website, BigCommerce provides you with an app you log in to that creates an online store.
So rather than worrying about servers and hosting applications, SSH and cPanel, you're going to be paying more attention to the business applications you can integrate with, the selling channels you use, and the products and inventory you spotlight.
The key advantage is that, out of the box, you are able to host securely, drive traffic, convert visitors, accept payments, and ship and fulfill orders. There's also an API, so if you do scale and need integrations unique to your business, you can make it happen.
Service pricing varies a lot. There are really two variables: how much you pay per transaction and the features offered. All plans charge $0.30 per transaction plus a percentage.
At the lowest end, you'll be paying 2.9 percent of the amount you charge. Bumping up to the Plus plan gets you a reduced 2.5 percent transaction fee and adds Google customer reviews, the ability for customers to filter their product searches, and custom SSL (a security certificate you may have already bought).
The Pro plan is pricey, but you get a lot. First, you give up substantially less of each transaction, at 2.2 percent. Then, on top of the Plus plan features, you get even more customized filtering, price lists, unlimited API calls, can have sales up to $400K per month, and get premium account services.
If your hosting needs lean primarily toward setting up an online store, give BigCommerce a look.
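The fee structure above is easy to compare concretely. The sketch below uses only the rates quoted in this review ($0.30 per transaction plus 2.9, 2.5 or 2.2 percent by plan); the store volume and average order value are hypothetical inputs chosen for illustration.

```python
# Rough per-plan fee comparison from the rates quoted above: every plan
# charges $0.30 per transaction plus a percentage of the sale.
RATES = {"standard": 0.029, "plus": 0.025, "pro": 0.022}
PER_TXN = 0.30

def monthly_fees(plan, avg_order, orders):
    # Total transaction fees for a month of sales.
    return orders * (PER_TXN + RATES[plan] * avg_order)

# A hypothetical store doing 1,000 orders/month at a $60 average order:
for plan in RATES:
    print(plan, round(monthly_fees(plan, 60, 1000), 2))
```

At that volume the Pro plan's lower rate saves a few hundred dollars a month over Standard, which is the trade-off to weigh against its higher subscription price.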
Three For One Hosting Review & Introduction
Developer: Richard Madison
Item: Three For One Hosting
Launch Day: 2018-Dec-05
Launch Time: 11:00 EST
Front-End Price: $32-$45
Niche: General
What Is Three For One Hosting?
Richard Madison has become well known for his high-quality hosting products and services. He took a long hard look at what the super-cheap hosting companies are offering in their low-cost accounts. Richard has devoted himself to meeting and exceeding what they offer in features, while giving your customers better quality, better uptime, and better support. And since he's outdoing them on the service, he's also going to outdo them on the price.
That's why it's called Three For One Hosting. Your buyers will get three years of Richard's higher-quality, high-value hosting product for the price those other companies charge for one.
And during this launch, they'll get it for even less.
The funnel includes a number of proven, meaningful upgrades, including cloud hosting, add-ons like priority backup and FTP storage, additional years of hosting, a bundle of smart applications for online business, and 100% commission reseller accounts.
PRO
Professional Services
Your website is the first impression your potential customers, business associates and others see. We recognize this and make sure that your website and email always run smoothly to give the best possible impression.
Faster Loading Websites
Not only have we invested heavily in our enterprise infrastructure with the latest and greatest Intel Xeon processors, we integrate best-of-breed software including CloudLinux, LiteSpeed web server and MariaDB to ensure the fastest-loading websites.
Best Panel & Tools
Every hosting plan uses the cPanel control panel. This award-winning control panel combines ease of use with full features. Manage email, domain names, databases and more, now with the Softaculous one-click installer and RV Sitebuilder Pro at no added cost.
Why Three For One Hosting?
Faster Loading Sites
Not only have we invested in our enterprise cloud infrastructure, we integrate best-of-breed software including CloudLinux, LiteSpeed web server and MariaDB to guarantee the fastest-loading websites.
Enterprise Infrastructure
All shared hosting accounts are hosted on our own infrastructure using 100% RAID drives, the latest-generation servers and only the latest and greatest Intel Xeon processors.
Experienced Professionals
Our parent company has been providing web hosting since 2002. With over 16 years of experience, we have the expertise you need to support you and help you deliver the best website for your business.
Unmetered Hosting
We don't charge you based on the amount of storage or bandwidth your account uses. You are required to be compliant with our Terms of Service and ensure your disk and bandwidth usage is within the normal operation of a personal or small business website.
Risk-Free Signup
Our 30-day money-back guarantee gives you the chance to try our services risk free. We are very confident in our services. We provide a fast, secure and reliable service, and your complete satisfaction is our top priority.
Three Years for the Price of One
At Three For One Hosting, we charge you the one-year price of our competition and give you three years of hosting: a fair price that allows us to provide you with unfailing hosting for less. How? We won't tell you the secret behind that magic.
Who Should Buy Three For One Hosting?
If you run a blog, an ecommerce site, or sell any kind of products or services online, you need hosting... You need ThreeForOne Hosting!
You need hosting you can rely on, without a price tag that kills your profits. ThreeForOne Hosting gives you the best of both worlds: three years of top-notch, value-added hosting for the price of one!
Conclusion
"It's a Great Deal. Should I Invest Today?"
Not only are you getting access to Three For One Hosting for the best price ever offered, you're also investing entirely without risk. Three For One Hosting includes a 30-day money-back guarantee. When you choose Three For One Hosting, your satisfaction is guaranteed. If you are not entirely pleased with it for any reason within the first thirty days, you're entitled to a full refund, no questions asked. You've got nothing to lose! What are you waiting for? Try it today and get the bonus now!
#Three For One Hosting#Three For One Hosting review#Three For One Hosting reviews#Three For One Hosting bonus#Three For One Hosting discount
0 notes
Text
15 Important Facts That You Should Know About SEO 2019
Search engine optimization (SEO) is an important design feature of a website that enables search engine spiders and robots to access it easily, thereby increasing its visibility on the web. When you talk about search ranking in SEO, you are talking about the placement of your content on search results pages (SERPs). Kent Lewis, owner and president of Anvil, a performance-based firm out of Portland, says that in 2019 voice search and Amazon search will become far more prominent than they were in 2018. Google Search Console (formerly Webmaster Tools) is one of the SEO basics. If you would like to find out how I actually do it, visit Post Czar for a free gift and details of how to use article marketing and SEOcious to earn top Google listings. It is geared towards SEO professionals (in-house and agency), marketing managers, and business owners. SEO now also considers tweets, retweets, Google+ authorship, and other social signals. Beyond ranking the site, the SEO team becomes part of the client's marketing or sales team by converting visitors into buyers. Because so much sharing now takes place on major social media platforms, social signals may become as important to SEO as time on page, editorial linking, and content quality. Combining social media marketing in PA with SEO tactics can help boost a website's ranking and popularity. This can create friction, and the impression that many well-designed websites are very poorly optimized for SEO. Social signals have influenced SEO in the past by bringing inbound links, and today they matter even more. For blogs, the best on-page SEO practice is to put the title of your post in a heading1 (h1) tag.
Google is the gatekeeper to huge amounts of traffic and leads, and search engine optimization (SEO) opens the doors. As long as they are earned naturally, inbound links are probably the most dependable authority builders in SEO. We call our methodology AdaptiveSEO because, as the name suggests, it is built to adapt to evolving and sometimes unexpected changes in search algorithms. Many people stumble in SEO article writing by using too few keywords, the wrong ones, or repeating a phrase so often it becomes keyword stuffing. Social SEO is not a separate branch of SEO and will not replace traditional SEO soon, but social signals are increasingly incorporated into search algorithms. Perhaps the most important question is how you can leverage SEO to drive more relevant traffic, leads, and sales for your business; browse the categories of our SEO blog for the important on-page ranking factors. According to Lewis, keywords have already lost some of their significance, and in 2019 that trend will only get stronger. SEO is the most efficient way to drive traffic to your site, and with mobile searches on the rise, it makes sense for an SEO consultant to look at how mobile affects optimization. The sole purpose of SEO services is to improve your search engine ranking.
Make sure redirected domains go through a single canonical redirect with any chains minimized, and audit the backlink profile of any domain you point at a page: just as good links bring reward, toxic backlinks bring punishment (another front in Google's ongoing war on link manipulation, and as much a technical SEO concern as a link-building one). To smooth out software integration problems, the web development team and the SEO specialist work together to build search-engine-friendly code that can be easily integrated into the client's website. Business owners should look for experienced SEO practitioners, because site architecture that is not SEO-friendly can hurt rankings significantly. SEO crawler programs, which work much like Google's own crawlers, can give you an overview of how a page is likely to perform in the rankings. Google is making sure it takes longer to see results from both black hat and white hat SEO, and is deliberately keeping flux in its SERPs based largely on where the searcher is in the world at the time of the search, and where the business is located relative to that searcher. The better you get at SEO, the more traffic, and the more leads, you are likely to attract over time. To find out more, read our report on 2019 trends in SEO marketing. Voice searches are growing fast: an estimated 67 million voice-assisted devices will be in use in the U.S. by 2019.
Changes like these shouldn't be a logistical problem for your website: standard practice among the best web companies today is to make sure sites are flexible enough, especially for SEO purposes. Whether you are a marketer, webmaster, or business owner, it is very much worth investing in voice SEO optimization to reap the benefits in 2019. We said earlier that social media isn't a direct SEO ranking factor, so you may wonder why we discuss it at all. The effects of black hat SEO are temporary; it doesn't take the search engine long to spot these illegal tactics and penalize you, first by devaluing your links and, if you continue the malpractice, by blocking your site and links altogether. Algorithm chasers, technical SEOs, and Google Doodle followers alike should develop their technical skills around emerging voice search technology and AI applications. Single Grain is a digital marketing agency that helps companies like Uber, Amazon, and Salesforce grow their revenues online using SEO and paid advertising. Businesses with multiple websites, as well as SEO agencies, can set up report templates. A search engine marketing group typically provides expertise in pay-per-click advertising, organic SEO, and social media optimization. The ads you often see on web pages and on the rightmost side of search engine results are examples of inorganic (paid) search placement.
While I still see this trend in play, with many enterprises in the midst of their digital transformation, convergence in the MarTech space is creating synergies and opportunities, and that should be welcomed by brands and agencies looking for an edge in SEO-driven content marketing or outreach. With the only complete certainty being change, webmasters will have plenty of adjustments to make to the sites in their care in the mobile and voice search landscape. The basics of good SEO haven't changed for years, even though the effectiveness of particular elements has narrowed or shifted in usefulness: you should still focus on building a simple site using very simple SEO best practices. Don't sweat the small stuff; instead, keep paying attention to the essentials, namely plenty of unique page titles and plenty of new, original content. Keyword research is the first step of any SEO campaign.
Dave Gregory, content marketing manager at the UK-based performance marketing agency SiteVisibility, predicts that 2019, and not 2018, will be the true year of voice. Google's guidelines point to roughly 200 ranking factors, which we have researched and clustered into 21 on-page SEO factors that deserve your attention in 2019. The SEO Smart Links plugin can automatically link key terms in your posts and comments to corresponding posts, pages, categories, and tags on your blog. 41. An effective social media strategy needs a strong SEO plan. Google makes some of this data available in its free Search Console interface; if you haven't set up an account, it is a very useful SEO tool both for unearthing search query data and for diagnosing various technical SEO issues. AI and voice search are already reshaping SEO, and voice search analytics will have an impact in the coming year. A great many businesses decide to hire external help to get the full benefits of SEO, so a large part of our audience is learning how to convince their clients that search is a good investment (and then prove it!). SEOs tend to prefer links placed higher on the page. If your site, due to the nature of your business, is more image-oriented than text-heavy, you will be at a slight disadvantage when employing SEO techniques such as keywords and backlinking. An extensive dental marketing and dental SEO campaign works best when the practice's web address is included in all promotional materials for the business.
This shows the importance of focusing on voice search results in order to grow your business, marketing, and search engine optimization (SEO) strategies. Sites increase their rank through this process of search engine optimization. As with all other SEO approaches, be certain your links are relevant, and be careful never to cross the line into excessive linking: you don't want to annoy your visitors. Bryan Yeager, research director at Gartner, will share nine key insights from Gartner's Marketing Technology Survey to help you prepare for 2019 and beyond. SEO is a combination of digital marketing efforts all working together to increase a website's value to users and its visibility in search. On-page SEO (also known as "on-site" SEO) is the practice of optimizing the different parts of your site that affect your search engine rankings. In 2019, we'll have to optimize voice search answers with calls to action that Google's algorithms don't pick up on, but humans do. Dan Mallette, lead SEO strategist at both InVue Digital and HearstDMS, predicts that SEOs will need to optimize for voice search and find new avenues as SERP real estate shrinks. Obviously, a social media page with more interaction will bring greater SEO benefit to a business than one with less, but simply having a social presence is a good start.
On the subject of speed: at the beginning of 2017 there was still much resistance to AMP in the SEO community overall, but as we head toward 2018 that resistance seems to be dissipating somewhat, replaced by a reluctant acceptance that AMP is not going away soon. The biggest way people misuse SEO is by assuming it's a game, or that it's about outsmarting or deceiving the search engines. On-page and off-page work are both crucial to the success of an SEO campaign, but they sit on completely different sides of the fence when it comes to improving your rankings. More than 50% of mobile phone users had started using voice search by 2015, so we can expect that by 2019 no less than half of searches will be voice searches. In the past, good SEO was only about using keywords; remember that SEO is about targeting real people, not only search engines. If you handle the on-page and off-page elements of SEO at least as well as your competitors, you can achieve higher positions in organic search results and a quality website capable of sustaining your revenue goals. I believe 2018 will be the year voice search changes how users search, and SEOs need to optimize for it. SEO, or search engine optimization, is a term coined to describe the techniques a website should use to boost its rankings on a search engine. The number one reason for using video on your site to improve SEO is to increase the amount of time users stay on it. Search engine optimization was, and still is, fascinating to me.
SEO positioning for a business of any size begins with proper website optimization, an excellent link-building strategy, and a well-planned online marketing plan. One area of focus for better marketing and SEO performance in 2018 is the confluence of content, influence, and social. Valid markup also helps SEO: it keeps search engine crawlers from being confused by syntax errors and leads to more accurate indexing. Stop thinking in terms of "SEO vs. content marketing" and start exploring how well they perform together. Try a voice search from your phone and ask about a business: if everything is in order, Google will read a short, to-the-point summary aloud, with answer cards formatted to fit your screen with no scrolling up or down. SEO is a time-consuming process, but believe me, if you work with dedication and current techniques, the combined results of on-page and off-page SEO can hold you at the top, at rank #1, for a specific search result. Fairly recently, I've seen a resurgence of on-page SEO factors making a difference in search engine rankings. You have no magic wand to regulate your competitors' strategies, Google's algorithm updates, or your customers' behavior, but you can manage your own SEO. In this new environment, the digital marketer who views SEO in a broader context will come out ahead of the competition in 2018 and beyond.
Good SEO books explain in detail how best to use keywords and how to structure your entire site to attract the attention of both search engine spiders and human visitors; a post like this cannot do the topic justice. As businesses begin an SEO marketing campaign, they should realize that an entire campaign can fall flat on its face if it never reaches the masses, that is, its target audience. If your pages were designed to get the most out of Google using commonly known and now outdated SEO techniques, chances are Google has identified this and is throttling your rankings in some way. Many business owners find that keeping up with the "moving target" of SEO distracts them from day-to-day priorities more than they ever imagined, so it's worth looking closely at what makes sense for each business. The application process for the SocialSEO Digital Marketing and SEO Scholarship is done entirely electronically and requires a list of materials.
There are a few fundamentals that can help boost any strategy, yet SEO, or search engine optimization, is rarely considered at the design stage. Hiring a good SEO agency can get you effective marketing copy and sensible keyword use, but a brilliantly designed website does a lot of the work of attracting people, and search engines, on its own. A good CMS lets you enter important SEO-related data such as the page title, description, and keywords for every page, or populate that data automatically from a file name or other fields. SEO has become widely adopted as an online marketing strategy because it works. Creating high-quality content with SEO in mind from the beginning boosts search visibility. In contrast, black hat SEO is about attempting to take shortcuts and game search engines. You can also use Google Analytics to find SEO keywords for content optimization. Keyword research is the process SEOs use to discover what search queries consumers actually enter into a search engine for a given topic. In today's rapidly shifting world, SEO techniques can change on a dime, and the worst part is that you might not even know it: hacks that could have won you a front-page result as recently as 2016 are not only obsolete now, they may actively harm your website's rankings.
Since search engines run complex algorithms and every site's marketing needs are unique, we are unable to provide one-size-fits-all SEO advice to our customers. We think SEO in 2019 will shift toward user intent, problem solving, and hyper-locality, to capitalize on the continued rise of voice search. Demographic information will matter even more in 2019 as far as keyword ranking is concerned. Links remain one of the most important SEO ranking factors. More and more, readers recognize the lower quality of sites employing black hat SEO at the expense of the reader experience, which reduces those sites' traffic and page rank over time. So: there is no best-practice number of characters any SEO can lay down as guaranteed to make a title display in full in Google search results on every device, because the snippet title varies by device. While getting as many pages as possible indexed in Google was historically the priority for an SEO, Google now rates the quality of the pages on your site and the type of pages it is indexing. Jason Scott, digital marketing specialist at Archway Cards Ltd, also believes voice will be the trend of 2019 rather than 2018. Today, despite all the hype about whether SEO is dead, we find that organic search is still one of the highest-ROI digital marketing channels. If you work in search marketing, you'll know that SMX is one of the biggest search engine marketing conferences of the year, covering topics including SEO and PPC.
SEO is conducted both on-site and off-site, through the presence of your web identity on different social media platforms and prominent links to your site from other well-reputed websites. A data-science-driven SEO and content marketing platform can help an eCommerce business discover previously untapped organic search opportunities. On the other hand, if a website uses no digital marketing or SEO at all, no one is going to dig through pages and pages of Google results just to find it and visit. Some dentists who have tried applying SEO have not been very successful in moving their websites to the top of Google search. Before we go further, two areas are essential for SEO: social media and mobile. SEO can cost between $100 and $500 per month if you do it yourself with a keyword research tool. Link building means acquiring links from external domains. Let's review the basics of SEO: webpages rank based on how relevant a page is to a search query and how many links point to it. Business owners rely on these SEO techniques hoping for a larger profit. Google My Business is Google's business directory and is therefore vital to local SEO; businesses with the best local SEO can appear in the Local 3-Pack, the batch of three featured businesses nearest you that appear when you do a relevant local search. Ignite Visibility recently released "SEO: The Movie," a 40-minute film covering the history of search engine optimization through the experiences of some of the biggest names in the SEO industry.
In addition to making content available to search engines, SEO helps boost rankings so that content is placed where searchers will most easily find it. The Internet is becoming increasingly competitive, and the companies that do SEO have a decided advantage in traffic and customers. Unfortunately, SEO, and search in general, is often siloed into focusing only on Google and not considered alongside other tactics. This means website owners and SEO practitioners need to be on top of their game when it comes to keyword research and keeping the context of their sites relevant to users. Varvy's SEO Overview tool audits your website for key parameters such as domain strength, links, image SEO, social counts, on-page SEO, technical status, page speed, loading time, and more. The trouble is that SEO ranking factors have changed a great deal over the years (find out how in our keyword research guide), which means the search engine optimization techniques that worked five years ago won't fly today.
Without a doubt, one of the biggest trends already under way, and likely to continue well into 2018, is the consolidation of niche MarTech players by larger content cloud providers, with the role and significance of SEO increasing throughout this transformation. The major components of SEO Internet marketing build website traffic and top search engine rankings. SEO is short for search engine optimization, and there is nothing really mystical about it. You may have heard a lot about SEO and how it works, but basically it is a measurable, repeatable process used to send signals to search engines that your pages are worth showing in Google's index. Topic clusters have been lauded as the future of SEO and content strategy, yet they remain widely underreported (so now's the time to strike): 93% of B2B companies use content marketing. Teresa Walsh, marketing executive at the automotive site Cazana, predicts that hyper-local targeting will grow in importance in 2019, with more location search and more voice search. We get to the bottom of on-page SEO problems so that search engines can clearly see what your website is about. SEO requires you to remain a continuous student, because search engine algorithms change quickly. Google's punishing algorithms probably class some pages as something akin to a poor user experience if they meet certain detectable criteria, e.g. a lack of reputation or old-school SEO tricks like keyword stuffing.
0 notes
Text
Ghosts in the Machine
In a brightly lit office, Joy Buolamwini sits down at her computer and slips on a Halloween mask to trick the machine into perceiving her as white.
For Buolamwini, a black PhD student at MIT’s Center for Civic Media, electronic racial deception is sometimes the most efficient way she can do her job. Buolamwini’s research focuses on facial analysis, a suite of technologies used in everything from auto-focusing smartphone cameras to advertisements to border security. But there’s a problem with many of these algorithms—they sometimes can’t detect Buolamwini or people who look like her.
Joy Buolamwini often tests her software using a mask to overcome biases encoded in facial recognition algorithms.
That’s because facial detection algorithms made in the U.S. are frequently trained and evaluated using data sets that contain far more photos of white faces, and they’re generally tested and quality controlled by teams of engineers who aren’t likely to have dark skin. As a result, some of these algorithms are better at identifying lighter skinned people, which can lead to problems ranging from passport systems that incorrectly read Asians as having their eyes closed, to HP webcams and Microsoft Kinect systems that have a harder time recognizing black faces, to Google Photos and Flickr auto-tagging African-Americans as apes.
Coded machine bias can work against lighter-skinned people as well. Research shows that some facial analysis algorithms built in Asia tend to perform better with Asian faces than Caucasian ones. Algorithms may also show accuracy rates that vary along age or gender lines.
As computer vision systems become more widespread, these demographic effects can have serious consequences. A seminal 2012 study of three facial recognition algorithms used in law enforcement agencies found that the algorithms were 5–10% less accurate when reading black faces than white ones and showed similar discrepancies when analyzing faces of women and younger people. A 2010 analysis by the National Institute of Standards and Technology (NIST) found that for some algorithms the opposite was true: people of color were easier to identify than Caucasians. But both studies showed that facial recognition programs aren't equal opportunity. Bias in algorithms extends well beyond facial recognition, too, into things as disparate as car insurance rates and recommendations for criminal sentencing.
Joy Buolamwini wasn’t aware of these issues when she first encountered algorithmic bias as a computer science major at the Georgia Institute of Technology. Working on a research project that involved teaching a robot to play peek-a-boo, Buolamwini noticed that the robot had no trouble detecting faces of her light-skinned roommates. But under the same lighting conditions, it didn’t work as well for her. She encountered the same problem in 2011 with another robot, but she didn’t think much of it until she began working more directly with facial recognition at MIT. One of Buolamwini’s early projects—a system called the Aspire Mirror that layers inspirational images, quotes, or even other faces over a reflection of the user’s face—worked well for users with lighter skin, but not so much for the woman who built it.
“I was getting frustrated. I drew a face on my palm and held it up to the camera and it detected the face on my palm. I was like ‘Oh this is ridiculous,’ ” she says. “Just being goofy, I put the white mask on to see what would happen, and lo and behold, it detected the white mask.”
Flawed Benchmarks
Facial analysis bias remains a problem in part because industry benchmarks used to gauge performance often don’t include significant age, gender, or racial diversity. For example, one popular benchmark for facial recognition is Labeled Faces in the Wild (LFW), a collection of more than 13,000 face photos. Tech giants including Google, Facebook, and Baidu—as well as a variety of smaller companies—have used the data set to measure algorithmic performance. LFW includes photos that represent a broad spectrum of lighting conditions, poses, background activity, and other metrics, but a 2014 analysis of the data set found that 83% of the photos are of white people and nearly 78% are of men. “It’s not necessarily diverse identities,” Buolamwini says.
A sample of faces from the research dataset known as Labeled Faces in the Wild.
Erik Learned-Miller, a computer science professor at the University of Massachusetts, Amherst, who co-created the data set, agrees that benchmarks like LFW “cannot be depended upon to evaluate algorithms for their fairness in face identification.” When LFW was released in 2007, he says, that was never the intention. Learned-Miller says that it’s critical for facial recognition vendors to conduct “exhaustive evaluations” of the technology’s accuracy—and not just on one group of users. But he suspects that many don’t as there are few financial incentives to do so.
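The "exhaustive evaluations" Learned-Miller calls for start with something simple: reporting accuracy per demographic group instead of one aggregate number, which can hide a large gap. A minimal sketch of that idea, using entirely hypothetical benchmark results (the group labels and numbers below are invented for illustration):

```python
# Sketch: evaluating a face detector per demographic group rather than
# only in aggregate. All data here is hypothetical, for illustration.

def per_group_accuracy(results):
    """results: list of (group, detected) pairs, detected is True/False."""
    totals, hits = {}, {}
    for group, detected in results:
        totals[group] = totals.get(group, 0) + 1
        hits[group] = hits.get(group, 0) + (1 if detected else 0)
    return {g: hits[g] / totals[g] for g in totals}

# Hypothetical benchmark run: the aggregate number looks acceptable,
# but the per-group breakdown exposes a 15-point gap.
results = (
    [("lighter-skinned", True)] * 95 + [("lighter-skinned", False)] * 5 +
    [("darker-skinned", True)] * 80 + [("darker-skinned", False)] * 20
)
rates = per_group_accuracy(results)
overall = sum(d for _, d in results) / len(results)
print(overall)  # 0.875 in aggregate
print(rates)    # 0.95 vs. 0.8 once broken out by group
```

A vendor reporting only the aggregate 87.5% would never surface the disparity, which is exactly why a skewed benchmark like LFW can certify a biased system as accurate.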
How bad is the bias problem? There isn’t a lot of research on the subject. There’s “limited evidence” of bias, racial or otherwise, in facial analysis algorithms, in part because there simply haven’t been many studies, says Patrick Grother, a computer scientist specializing in biometrics at the National Institute of Standards and Technology and lead author of the 2010 NIST study.
“There are anecdotes that certain people have trouble using them,” he adds. “But nobody has formally quantified it, and to formally quantify it, you would need a large amount of data.”
Buolamwini is one of a growing number of researchers fighting the problem. She is joined by a team of volunteers who support her nonprofit organization, the Algorithmic Justice League, which raises awareness of bias through public art and media projects, promotes transparency and accountability in algorithm design, and recruits volunteers to help test software and create inclusive data training sets. Buolamwini’s goal isn’t just to improve algorithms—it’s also to make AI more understandable and accessible to everyone.
“This domain of algorithmic bias is inhabited by the digerati or the Brahmin high priests of tech,” Buolamwini says. “But these aren’t necessarily the people who are going to be most affected by the decisions of these automated systems…What we want to do is to be able to build tools for not just researchers, but also the general public to scrutinize AI.”
Bias Busters
Exposing AI’s biases starts by scrapping the notion that machines are inherently objective, says Cathy O’Neil, a data scientist whose book, Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy, examines how algorithms impact everything from credit access to college admissions to job performance reviews. Even when the parameters that guide algorithms are completely reasonable, she says, discriminatory choices can still sneak in through the cracks.
“When you make reasonable choices without thinking very hard about them, you’re automating the status quo, and the status quo might be unreasonable,” she says.
If, for example, a company wants to automate its hiring process, it might use an algorithm that’s taught to seek out candidates with similar profiles to successful employees—people who have stayed with the company for several years and have received multiple promotions. Both are reasonable and seemingly objective parameters, but if the company has a history of hiring and promoting men over women or white candidates over people of color, an algorithm trained on that data will favor resumes that resemble those of white men. Rejected applicants will probably never know why they didn’t make the cut, and it will be tough to hold anyone accountable since the decision was machine-made, not manmade, O’Neil says.
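The mechanism in that hiring example can be shown in a few lines: a "neutral" rule that scores candidates by similarity to past successful employees inherits whatever skew is in the history. This is a toy sketch, not any real hiring system; the feature names, numbers, and the similarity scoring are all invented for illustration:

```python
# Sketch of the automated-hiring example: a similarity-to-past-hires rule
# picks up historical skew. All features and values are hypothetical.

# Past "successful employees" are mostly from one group, so the learned
# profile reflects a group-correlated proxy trait, not just skill.
history = [
    {"skill": 0.8, "proxy_trait": 1.0},  # majority-group hire
    {"skill": 0.7, "proxy_trait": 1.0},  # majority-group hire
    {"skill": 0.9, "proxy_trait": 1.0},  # majority-group hire
    {"skill": 0.8, "proxy_trait": 0.0},  # lone minority-group hire
]

profile = {
    k: sum(e[k] for e in history) / len(history)
    for k in ("skill", "proxy_trait")
}

def score(candidate):
    # Higher score = closer to the historical profile (smaller squared distance).
    return -sum((candidate[k] - profile[k]) ** 2 for k in profile)

# Two equally skilled candidates; only the proxy trait differs.
cand_a = {"skill": 0.8, "proxy_trait": 1.0}
cand_b = {"skill": 0.8, "proxy_trait": 0.0}
print(score(cand_a) > score(cand_b))  # True: the proxy, not skill, decides
```

Neither parameter mentions a protected attribute, yet the rejected candidate loses on a trait that merely correlates with group membership, and, as O'Neil notes, has no way to know why.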
NIST computer scientist Ross Micheals demonstrates a NIST-developed system for studying the performance of facial recognition software programs.
O’Neil adds that algorithms don’t need explicit data on race, gender, or socioeconomic status to exhibit bias. Risk assessment tools used in commercial lending and insurance, for example, may not ask direct questions about race or class identity, but the proprietary algorithms frequently incorporate other variables like ZIP code that would count against those living in poor communities.
Credit scores are another data point that can allow bias to creep into algorithms. In 2015, Consumer Reports published the result of a two-year-long investigation into car insurance pricing. They analyzed more than 2 billion price quotes across approximately 700 companies and found that a person’s financial life dictated their car insurance rate far better than their driving record. Credit scores—which are affected by factors related to poverty but often not related to driving—factored into these algorithms so heavily that perfect drivers with low credit scores often paid substantially more than terrible drivers with high scores. In Florida, for example, an adult driver with a pristine record but a low credit score paid $1,552 more on average than a driver with great credit and a drunk driving conviction.
When practices like this are automated, it can create negative feedback loops that are hard to break, O’Neil says. Higher insurance prices for low-income people can translate to higher debt and plummeting credit scores, which can mean reduced job prospects, which allows debt to pile up, credit scores to sink lower, and insurance rates to increase in a vicious cycle. “Right now we have essentially no rules or regulations around algorithms, about the accountability of algorithms in particular,” O’Neil says.
Just knowing when an algorithm has made a mistake can be difficult. Last year, the investigative journalism nonprofit organization ProPublica released an analysis of COMPAS, a risk assessment tool that evaluates criminals to determine how likely they are to commit future crimes. They compared predicted recidivism rates of 10,000 criminal defendants in Broward County, Florida, with whether the defendants committed a crime over the next two years. The algorithm was equally accurate at predicting recidivism rates for black and white defendants, but black defendants who didn’t re-offend were nearly twice as likely to be classified as high-risk compared with similarly reformed white defendants. By contrast, white repeat offenders were twice as likely to be erroneously labeled as low-risk.
Equivant, the company behind COMPAS, rebutted ProPublica’s findings, and a separate analysis by Community Resources for Justice supported Equivant. ProPublica countered, but sussing out who’s correct underscores a major obstacle: assessing an algorithm’s fairness depends on having an agreed-upon definition of fairness. When groups have different base rates of reoffending, as is the case with COMPAS, it may be mathematically impossible to satisfy several common fairness criteria at once, concluded an independent analysis by researchers from Cornell and Harvard.
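The Cornell and Harvard finding can be made concrete with a small simulation. The score distributions below are invented, but each group's scores are calibrated by construction (the outcome occurs with probability equal to the score). Because the two groups have different base rates, the false positive rates still diverge, which is one common reading of the COMPAS dispute.

```python
import random

random.seed(3)

def simulate_fpr(lo, hi, n=200_000):
    """Calibrated scores drawn uniformly from [lo, hi]; the outcome is
    Bernoulli(score), so the score is exactly right on average."""
    false_positives = negatives = 0
    for _ in range(n):
        score = random.uniform(lo, hi)
        reoffends = random.random() < score
        if not reoffends:
            negatives += 1
            if score > 0.5:          # labeled "high risk"
                false_positives += 1
    return false_positives / negatives

fpr_a = simulate_fpr(0.2, 0.8)   # group with the higher base rate
fpr_b = simulate_fpr(0.1, 0.6)   # group with the lower base rate
print(f"false positive rate, group A: {fpr_a:.2f}")
print(f"false positive rate, group B: {fpr_b:.2f}")
```

Both groups get equally accurate scores, yet non-reoffenders in group A are misclassified as high-risk far more often, so "calibrated" and "equal error rates" cannot both hold here.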
“Everybody is in some sense right,” says Andrew Selbst, an attorney and postdoc at the Data and Society Research Institute who specializes in legal questions surrounding big data and machine learning. “The real issue is that we have, for a long time, been able to avoid being very clear as a society about what we mean by fairness and what we mean by discrimination.”
Law and Order and AI
There are laws that could provide some protection against algorithmic bias, but they aren’t comprehensive and have loopholes. Current anti-discrimination laws in sectors like education, housing, and employment prohibit both intentional discrimination—called “disparate treatment”—as well as unintentional “disparate impact,” which happens when neutral-sounding rules disproportionately affect a legally-protected group. (It’s currently against the law to unintentionally discriminate on the basis of sex, age, disability, race, national origin, religion, pregnancy, or genetic information.)
Proving disparate impact is notoriously difficult even when algorithms aren’t involved. Plaintiffs must first prove that they were disproportionately and negatively affected by a policy or practice. If discrimination is job-related, for example, the disparate impact would only be illegal if there were alternative hiring methods that were equally effective without being discriminatory. With algorithms, Selbst says, clear alternatives may not exist.
“If someone had actually stepped up in the making of this and tested several different versions of the [software], then there would be alternatives and you should choose the one that’s the least discriminatory and most effective,” Selbst says, adding that organizations often don’t have incentive to fully evaluate software for fairness concerns. “If we want to have best practices, we should be testing a lot of versions of the software and not just relying on the first one that we’re presented with.”
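One widely used heuristic for flagging the kind of disparity at issue, separate from anything Selbst proposes, is the EEOC's "four-fifths rule": a group's selection rate below 80% of the highest group's rate is treated as evidence of disparate impact. A sketch with invented applicant numbers:

```python
def selection_rates(outcomes):
    """outcomes maps group -> (selected, applicants)."""
    return {g: selected / applicants
            for g, (selected, applicants) in outcomes.items()}

def four_fifths_check(outcomes):
    """Return, per group, whether its selection rate is at least 80%
    of the best-performing group's rate (EEOC four-fifths heuristic)."""
    rates = selection_rates(outcomes)
    top = max(rates.values())
    return {g: rate / top >= 0.8 for g, rate in rates.items()}

# Hypothetical numbers for illustration only.
result = four_fifths_check({"group_x": (48, 100), "group_y": (30, 100)})
print(result)
```

Here group_y's rate (0.30) is only 62.5% of group_x's (0.48), so it would be flagged; running this check across several candidate versions of a screening tool is one way to pick the least discriminatory alternative Selbst describes.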
Since algorithms are proprietary and frequently protected under non-disclosure agreements, organizations that use them, including both private companies and government agencies, may not have the legal right to conduct independent testing, Selbst says.
“It’s not completely clear that the companies or police departments or judiciaries that buy this software from these companies have done any testing whatsoever. In fact, it is clear in many cases that they don’t, and they’re not allowed to,” Selbst says.
The ability to audit an algorithm would answer some questions about bias, but there’s a group of algorithms that are moving beyond our current abilities to analyze them. Artificial neural networks are one example. They are fed huge amounts of data and, through a process of breaking it down into much smaller components and searching for patterns, essentially come up with their own algorithms, which could potentially be incomprehensible to humans.
Computer scientists have also developed artificial neural networks that can write new AI programs without human input. Just this month, Google announced a major advance in this field—a system that writes its own machine learning code, one that out-performed code developed by its own makers.
Ben Shneiderman, a computer scientist at the University of Maryland, says that greater automation means greater concern over whether machine-built algorithms will exacerbate the bias problem.
Machine-built algorithms are “somewhat more concerning because if you’re automating a process, then you’re reducing the opportunities for a human being to check the bias,” he says. “The central technical improvement I’d like to see is a log of the actions of the algorithms in a way that’s interpretable and explainable.”
Explaining AI
Some researchers are working on that problem. Last May, the Defense Advanced Research Projects Agency (DARPA), home to the U.S. government’s most top-secret research and weaponry programs, launched a four-year initiative to encourage the development of new machine learning techniques that produce algorithms users can understand. The 12 research teams that received contracts under the Explainable AI program aim to help military forces understand the decisions made by autonomous systems on the battlefield and whether that technology should be used in the next mission, says David Gunning, Explainable AI’s program manager.
Some Explainable AI teams are adopting what Gunning calls a “model induction” approach that essentially reverse engineers how an algorithm makes a choice. Instead of starting with the algorithmic recipe itself, these teams are building software programs that run millions of experiments comparing the data going into the system with the decisions that come out, in hopes of finding patterns. From those patterns, they’ll create a model of what they think is happening in between.
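The model-induction idea can be sketched as follows. The "black box" here is a toy function invented for illustration; a real auditor would see only its inputs and outputs, then fit the simplest human-readable rule that reproduces its decisions.

```python
import random

random.seed(5)

# A toy stand-in for an opaque scoring system. Its internals are invented
# here and would be unknown to the auditor.
def black_box(income, zone):
    return 1 if income * (0.7 if zone == 0 else 1.0) > 40 else 0

# Step 1: probe the box with many synthetic inputs and record decisions.
probes = [(random.uniform(0, 100), random.choice([0, 1]))
          for _ in range(5000)]
decisions = [black_box(income, zone) for income, zone in probes]

# Step 2: fit an interpretable surrogate -- one income threshold per zone --
# that best mimics the recorded decisions.
def best_threshold(zone):
    points = [(i, d) for (i, z), d in zip(probes, decisions) if z == zone]
    best = max(range(101),
               key=lambda t: sum((i > t) == bool(d) for i, d in points))
    fidelity = sum((i > best) == bool(d) for i, d in points) / len(points)
    return best, fidelity

surrogate = {zone: best_threshold(zone) for zone in (0, 1)}
for zone, (threshold, fidelity) in surrogate.items():
    print(f"zone {zone}: approve if income > {threshold} "
          f"(fidelity {fidelity:.3f})")
```

The surrogate exposes something the box never admitted: applicants in zone 0 face a substantially higher effective income threshold, which is exactly the kind of pattern model induction is meant to surface.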
Explainable AI’s products aren’t being designed to root out bias, but what results from this program could help researchers evaluate algorithms and spot bias in the future. Gunning is quick to point out that the Explainable AI program has only just begun— right now, no single strategy looks more promising than another—and that there will perhaps be some aspects of artificial intelligence that we will never understand.
“I don’t think we’ll ever create perfect explanations for the most complex machine learning systems,” he says. “I think they’ll always be inherently difficult to explain.”
Overseeing the Indecipherable
That’s one reason why some are encouraging researchers and programmers to consider bias as they’re building tools. One group, the Partnership on Artificial Intelligence to Benefit People and Society, brings together academics, ethicists, and representatives from Amazon, IBM, Facebook, Microsoft, Google and Apple to hash out best practices. Other organizations like the AI Now Institute have issued recommendations for AI researchers, developers, and policy makers while agencies like the National Institute of Standards and Technology are beefing up their programs for rooting out bias. Starting this past February, NIST began offering evaluations of facial recognition technologies on an ongoing basis rather than every two to four years. This allows developers to continually check how their algorithms perform across diverse data sets.
Shneiderman says programs like this are a step in the right direction, but voluntary quality controls are not sufficient. He has proposed establishing a National Algorithm Safety Board which would oversee high-stakes algorithms and investigate problems. Operating similarly to the way the National Transportation Safety Board investigates vehicular accidents, Shneiderman’s safety board would be an independent agency that could require designers to assess the impact of their algorithms before deployment, provide continuous monitoring to ensure safety and stability, and conduct retrospective analyses of accidents to inform future safety procedures.
While Shneiderman’s proposed board, like the NTSB, wouldn’t have regulatory power, it would have significant investigative powers. “There needs to be some teeth” to the oversight board, Shneiderman says. “If they don’t have the power to investigate in a substantive way, if the people involved won’t talk to them, then there’s a limitation to how much they can accomplish.”
Some of these issues will probably be resolved through lengthy litigation, a process that’s already begun. Last year, the New York Supreme Court ruled that an algorithm used to evaluate a fourth-grade public school teacher’s job performance produced “arbitrary and capricious” results that were biased against teachers with both high and low-performing students.
For the foreseeable future, Andrew Selbst says we should expect more lawsuits and regulatory activity as the field strives to establish standards for algorithmic transparency and accountability. “All of this is cutting edge research in law and technology,” he says. “It’s all sort of up in the air right now.”