#a learning algorithm is not an abacus
nokingsonlyfooles · 1 year
Text
If we don't stop calling it "intelligence" it's going to kill us.
Everyone in this article is treating an algorithmic sorting device that pukes out the average of what you've fed it as either a simple tool (like a calculator) that performs repetitive tasks accurately, or a being with agency. And nothing in between.
My brain is fried, I've been fighting a malfunctioning website all day (which also has an AI option now!) and I just had to add a disclaimer asking people to please not feed my work to an AI. I've about had it.
AI is deciding how much staff a Canadian hospital will need! Based on what criteria? Well, they know they feed it information about the weather and the traffic and local events and it spits out a schedule. How? That, they don't know. That part's proprietary, it belongs to whoever built the damn thing. And no one seems to care.
Hospitals around here have been criminally understaffed for a very long time. What does that data sample look like? Well, there are a lot of people waiting 18 hours or more for care and many of them die, so that seems like the status quo! How did they correct for that? Did they correct for it? How about the part where certain people get substandard care or none at all?
"Learn how to use AI" seems to mean "learn how to push buttons on a black box programmed and curated by a corporation with no accountability." At the very least, these algorithms codify our biases uncritically, and nobody with human judgment is minding their decision-making process. "Learn how to fact-check AI" does not seem to be on the table. "Learn how to correct AI" is similarly absent.
Just use it. Like a calculator. You don't need to correct a calculator! Well, the more complex and opaque its calculations get, the more likely you do need to correct it.
Learn how to use it or risk losing your job. What a way to put it. Your money and your life. This thing is better at making money and products than you are! Yeah, no shit. It acts like a slave. But, previously, we've enslaved people with agency. These babies don't have that problem! They only act like they do sometimes because they're piggybacking off the sum total of our behaviour. All the good, and the bad, and the completely fucking irrational, and they can't tell the difference. Do we do it a lot? Then they'll give us more of the same!
It's just more automation! But it's not, because it's automating what we do, as if all of our actions are as necessary and rational as 1+1=2. We do some really self-destructive things we need to stop doing. If we bake that behaviour into our devices and get rid of the human beings capable of learning and growing, where do we go from there?
Do you like where we are now? Do you want it forever?
6 notes
copperbadge · 1 year
Note
number-line anon again: the flip side to that is I can probably solve and probably walk you through a geometric proof or, with some re-learning of the terms, physics-based calculus. to me, that's all *one* calculation. my brain believes an algorithm's single grocery list is actually a whole delivery area, and we can do those one by one. there's so much math to be "bad" at math. an abacus or knot of string is older - if Ea-Nasir swindled you with a lump of clay and a stick, i mean
Man, I keep trying to respond to this ask because I don't want you to think it just vanished into the void, but I'm also trying not to be rude even inadvertently and just...
Anon, I have to be up front with you: I understood roughly 10% of your last ask and I follow even less of this one. With the last one, I got that you were saying "I'm bad at some kinds of math" so that was the part I responded to; I get with this one you're saying you're good at other kinds, and I'm glad you're enjoying those other kinds of math, but I am unfortunately bad at every kind of math, so if you're trying to communicate anything else I'm not going to be able to understand it. No fault of yours, and really no fault of mine; you are just speaking a language I can't learn, let alone understand as-is.
I did have a teacher try to show me how to use an abacus once, I think because he felt I would do better if I had something physical to manipulate. After about an hour he looked at me amusedly, said "We finally found something you're bad at," and let me go back to my book. Wish the rest of the world had his serenity about it.
59 notes
newnormal-int · 9 months
Link
0 notes
YouTube: any student's bestie during independent learning
2 in Quarantine: Your Teacher, YouTube
A student raises their hand, but the room is silent. What used to be an open classroom discussion feels suffocating as the silence rings through their head. There is no comforting nod from the teacher or from friends; no chattering in the hallways or in another classroom. It feels as if they have been put on the spot, as if in this space there is only you, and you alone. Welcome to the new normal for education, where teachers are doing the best they can, but the accommodation students received in the face-to-face environment just can't be replicated perfectly. As time passes into the early hours of 3am and a student still doesn't understand material due at 7am, what miracle do you think will help them finish their outputs on time?
“YouTube is the most popular video platform on the planet.” Like all social media platforms, it lets you upload your own content while your viewers like, share, and comment freely. It has long catered to entertainment, somewhere you can spend your breaks without feeling time pass you by. People watch vlogs, episodes, music videos, compilations, funny videos and so much more; you would be considered a liar if you claimed you've never spiraled into YouTube's recommendation algorithm at 2am. In the past year, people were stuck at home struggling through day-to-day life in this pandemic, and students and teachers alike were stumbling to gain a foothold in a new normal for learning. A study by Pearson Education (2010) found YouTube to be the most common social media platform for communicating with and reaching out to students, which could be considered self-explanatory, as channels like Khan Academy, TED-Ed, and YouTube Learning have guided students in their learning endeavors for years. The evolution of learning technologies has progressed rapidly and massively through the years – from hieroglyphics, to the abacus, to typewriters, to the first computer, and then the Dynabook, to e-textbooks, to Apple, tablets, smartphones, and Google. YouTube, with its engaging recorded lectures, podcasts, and vlogs from people who've studied themselves into professionals, and its comprehensive use of the show-not-tell strategy, has opened the gates to the next stage of our education system.
The freely accessible video-sharing platform has made a name for itself in the educational world through the years. It is known as a form of alternative learning, broadening a student's understanding outside the lecture time frame of a classroom session. What makes YouTube learning so magical is its flexibility and self-paced learning, which one can control with a simple click, rewinding or doubling the speed of a video. YouTube has made learning easier in many different ways, while showing the instructions for how to do so. It enhances the learning experience of students and teachers alike, hitting more birds with one stone than we realize. Paired with the right applications – Google Classroom and other Google apps – e-learning and virtual teaching are made powerful. YouTube is maximized in many ways in these times of remote learning. Teachers have used it for additional reference material, a usual add-on after a reading assignment. It has made student collaboration possible, letting students discuss or use whiteboard applications to share ideas after watching a video. Through various animations and illustrations, it allows students to comprehend complex topics, making microlearning possible: complex topics and lessons are delivered in smaller concepts that are easier to understand and remember. When students make their own YouTube videos, they are also met with feedback from different people online, opening their minds to ideas for improving their work. YouTube stands out as a medium for learning because it lets people make teaching these concepts, for free, their primary motive. Teachers and students are free to record themselves explaining a topic to improve their own retention and methods, and the same recorded video provides clarity and guidance for millions of others struggling with the same topic. It has become a habit for students and teachers to turn to YouTube whenever they need assistance.
It is also important to note that with a new form of learning so heavily dependent on technology, there is a bare minimum of digital literacy needed to adapt efficiently and effectively to our current educational environment. Through these academic years teachers will try different applications that their students will have to adapt to in different situations. Not only that, but we are all expected to develop key traits like time management, self-discipline, motivation, and organization of the spaces where we study and work. All fears can be thrown out the window, because YouTube has videos that will help all parties learn new skills and basic concepts when needed. Reportedly, 70% of Millennial YouTube users treat the application as their number one go-to for how-to tutorials. The scope of this statistic may be limited to a certain age group, but it is never too late or too early to learn something, especially on that red app. YouTube videos help with lesson plans, study tips and habits, the right applications to use, and even videos designed to help you relax or focus; this makes the possibilities for learning boundless and the help perennial and easily obtainable.
YouTube videos are a timeless means of instruction. Teachers who provided clarity and instruction for their students a decade ago are still helping students now, and will for years to come. Combine this with mindful, vigilant online practices, and you will no doubt get the information and understanding you lacked before opening the application. So if you're feeling a bit lost in anything related to the new learning environment, look it up! There is a high possibility that a YouTube video is waiting to help you through the whole predicament.
1 note
theglycoprotein · 7 years
Text
Abacus
My grandfather is not, technically speaking,
tech savvy. His hands
are too calloused and clumsy
to work the simplest mobile phone,
the concept of catch-up television
leaves him scratching his head
in confusion. The internet?
Don't bother trying to explain it to him -
he'll wave his hands at you
and say he's too old to learn.
I hate when he says that. And so, of course,
it becomes a challenge.
I inherited my stubborn
from my mother, who inherited hers in turn
from her father -
from a man who knows more
than he thinks he's forgotten.
He thinks he's forgotten how to climb trees
for coconuts, or dance,
or sing, or do mathematics.
He's forgotten no such thing.
He's just scared to learn about
just how much he's been missing.
We take his hands,
so massive they make mine
look like newborn fingers and palms,
like the ruins of my scarred knuckles,
the shattered timelines of my body's
prophecies, are still
brand new. We guide his fingers
over the buttons. Try and teach him
like everything is a number puzzle
and he knows all the algorithms.
My mother says I keep him young.
She tells me of the subtle way his laugh lines
crease that little bit deeper
when he speaks to me;
how his crow's feet take wing and his cheeks
puff like galleon sails full of the wind,
too fast heading somewhere unknown.
On Monday, my grandfather managed
to work out how to face-time me
all by himself.
He wasn't sure what buttons he'd pressed
but whatever he'd done
had shown him my face,
and his smile was wide enough
to split the screen.
And his laugh was loud enough
to blow out the speakers.
And for the first time in a long time,
I did not mind being a reflection,
or an echo.
45 notes
abangtech · 4 years
Text
Using Artificial Intelligence in Big Data – Analytics Insight
June 3, 2020
Simply put, Artificial Intelligence (AI) is intelligence exhibited by machines, as opposed to the natural intelligence exhibited by human beings and animals; it is therefore sometimes referred to as Machine Intelligence. Once taught, a machine can effectively perceive its environment and take actions that improve its chances of achieving set goals. How can a machine be taught?
Machine learning is rooted in writing code, using a programming language the machine understands. This code lays the foundation of the machine's "thinking" faculty: the machine is programmed to perform certain functions defined in the code. Such machines can also be programmed to build on their basic code, generating a continuous sequence of related routines to increase their thinking, learning, and problem-solving capabilities as the workload increases.
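To ground that description, here is a tiny, self-contained sketch in Python (our illustration, not anything from the source article) of the learning loop at the heart of machine learning: start with arbitrary parameters, then repeatedly nudge them to shrink the prediction error on examples.

```python
# A minimal illustration of "teaching" a machine: it is given examples and
# adjusts its own parameters to reduce error, rather than following fixed rules.

xs = [1.0, 2.0, 3.0, 4.0]   # inputs
ys = [3.0, 5.0, 7.0, 9.0]   # desired outputs (the hidden rule is y = 2x + 1)

w, b = 0.0, 0.0             # the machine's adjustable "knowledge"
learning_rate = 0.01

for step in range(5000):    # repeat: predict, measure error, adjust
    for x, y in zip(xs, ys):
        error = (w * x + b) - y
        w -= learning_rate * error * x   # nudge parameters to shrink the error
        b -= learning_rate * error

print(round(w, 2), round(b, 2))  # approaches 2.0 and 1.0: the rule was "learned"
```

The point of the sketch is the loop, not the arithmetic: nobody wrote "y = 2x + 1" into the program; the machine arrived at it from examples.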
Just as cranes are machines designed to lift heavy loads which humans cannot, some machines are programmed to think further and solve analytical problems that are cumbersome for the human brain and for simpler software. This machine assistance for thinking and analysis dates all the way back to the abacus. Technology has advanced to the point where there is practically no limit to the amount of information a machine can work with. This brings us to the topic of Big Data.
Big Data, just as the phrase implies, is simply a huge, broad, or complex set of information that can be understood by, and stored in, a computer. Professionally, Big Data is a field that studies various means of extracting, analysing, or otherwise dealing with data sets too complex to be handled by traditional data-processing systems. Such an amount of data requires a system designed to stretch its extraction and analysis capability.
The ideal and most effective means of handling Big Data is with AI. Our world is already steeped in Big Data. There is a massive amount of data online and offline about any topic you can think of, ranging from people (their routines, their preferences) to non-living things (their properties, their uses).
This huge stockpile of data, when properly harnessed, can give valuable insights and business analytics to the sector/ industry where the data set belongs. Thus, artificially intelligent algorithms are written for us to benefit from large and complex data.
How Companies Are Applying Artificial Intelligence and Big Data
We have addressed the meaning of these terms; now we will dedicate this part of our Artificial Intelligence essay to reviewing how applications benefit from the synergy between AI algorithms and Big Data analytics, such as:
● Natural language processing, where millions of samples from the human language are recorded and linked to their corresponding computer programming language translations. Thus, computers are programmed and used in helping organizations analyze and process huge amounts of human language data.
● Helping agricultural organisations and corporations broaden their monitoring capability. AI helps farmers to count and monitor their produce through every growth stage till maturity. AI can identify weak points or defects long before they spread to other areas of these huge acres of land. In this case, satellite systems or drones are used by the AI for viewing and extracting the data.
● Banking and securities, for monitoring financial market activities. For instance, the Securities Exchange Commission (SEC) is using network analytics and natural language processing to foil illegal trading activities in financial markets. Trading data analytics are obtained for high-frequency trading, making decision-based trading, risk analysis, and predictive analysis. They are also used for early fraud warning, card fraud detection, archival and analysis of audit trails, reporting enterprise credit, customer data transformation, etc.
● Communication, Media and Entertainment. AI capabilities can be used for collecting, analyzing, and utilizing consumer insights; leveraging mobile and social media content; and understanding patterns of real-time media content usage. Companies in this industry can simultaneously analyse their customer data along with customer behavioural data to create detailed customer profiles, used for creating content for a diverse target audience, recommending content, and measuring content performance.
● Healthcare providers have benefited from the large pool of health data. Prescriptions and health analysis have been simplified by AI. Hospitals are using data collected by millions of cell phones and sensors, allowing doctors to practice evidence-based medicine. The spread of chronic diseases can also be identified and tracked faster.
● In the Educational sector, AI syncs with Big Data analytics for various purposes, such as for tracking and analysing when a student logs into the school’s system, the amount of time spent on the different pages of the system, and the overall progress of students over time. It is also useful for measuring the effectiveness of teachers. Thus, teachers’ performance is analysed and measured with respect to the number of students, various courses, student aspirations, student demographics, behavioural patterns, and many other data.
● In Manufacturing, inventory management, production management, supply chain analysis and customer satisfaction techniques are made seamless. Thus, the quality of products is improved, energy efficiency is ensured, reliability levels rise, and profit margins increase.
● In the Natural Resources sector, the synergy of AI and Big Data makes predictive modelling possible, allowing for the quick and easy analysis of large graphical data, geospatial data, temporal data, seismic interpretation and reservoir characterization.
● Governments around the world use AI for various applications such as public facial recognition, vehicle recognition for traffic management, population demographics, financial classifications, energy exploration, environmental conservation, infrastructure management, criminal investigations, and much more.
Other areas where AI is used in Big Data are Insurance, Retail & Wholesale Trade, Transportation, and Energy & Utilities.
Final Thoughts
In conclusion, we have been able to confirm that there are huge investments in the use of AI in Big Data analysis for the benefit of all. Data sets will continue to grow, so the level of application and investment will continue to increase over time. Human intervention, as always, will remain relevant, although this relevance is projected to diminish with time.
One can rightly argue that “Artificial Intelligence is useless without data, and data is insurmountable without AI”. Also, both AI and Big Data are practically impossible without human intervention and interaction.
AI systems that enable machine learning solutions are the future of business technologies and processes. Such machine learning applications automate data analysis and surface new insights that were previously impossible to imagine when processing data manually or with traditional methods. This ability to better predict certain events allows us to completely redraw our approach to how everything is done.
from abangtech https://abangtech.com/using-artificial-intelligence-in-big-data-analytics-insight/
0 notes
wallpaperpainting · 4 years
Text
How Will What Is A Copy Artist? Be In The Future | what is a copy artist?
For George Wallis, a 19th century art teacher who became Keeper of Fine Art at the Victoria & Albert Museum, the reason for showing visitors reproductions of the great masterpieces was simple.
It was vulgar to put original artworks on display, said the man in charge of the collections from 1863 to 1891, adding that visitors should be shown reproductions to take away the allure of the cost, and let them appreciate the aesthetic qualities unencumbered by any distractions.
His suggestion in the 19th century did not take off, as museums across the world engaged in an arms race to buy the most expensive and most sought-after objects humanity had created. Now, however, leading figures in the museum world have suggested that perhaps Mr
from Wallpaper Painting https://www.bleumultimedia.com/how-will-what-is-a-copy-artist-be-in-the-future-what-is-a-copy-artist/
0 notes
asfeedin · 4 years
Text
Personal Data Collection: The Complete WIRED Guide
By the 1960s, the US government was using powerful mainframe computers to store and process an enormous amount of data on nearly every American. Corporations also used the machines to analyze sensitive information including consumer purchasing habits. There were no laws dictating what kind of data they could collect. Worries over supercharged surveillance soon emerged, especially after the publication of Vance Packard’s 1964 book, The Naked Society, which argued that technological change was causing the unprecedented erosion of privacy.
The Trackers Tracking You
Online trackers can be divided into two main categories: same-site and cross-site. The former are mostly benign, while the latter are more invasive. A quick taxonomy:
Traditional Cookies Facebook, Google, and other companies use these extremely popular cross-site trackers to follow users from website to website. They work by depositing a small identifying file in the browser, which users then unwittingly carry with them as they surf the web.
Super Cookies Supercharged cookies can be difficult or impossible to clear from your browser. They were most famously used by Verizon, which had to pay a $1.35 million fine to the FCC as a result of the practice.
Fingerprinters These cross-site trackers follow users by creating a unique profile of their device. They collect things like the person's IP address, their screen resolution, and what type of computer they have. (A minimal sketch of the idea follows this list.)
Identity trackers Instead of using a cookie, these rare trackers follow people using personally identifiable information, such as their email address. They collect this data by hiding on login pages where people enter their credentials.
Session cookies Some trackers are good! These helpful same-site scripts keep you logged in to websites and remember what’s in your shopping cart—often even if you close your browser window.
Session replay scripts Some same-site scripts can be incredibly invasive. These record everything you do on a website, such as which products you clicked on and sometimes even the password you entered.
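To make the fingerprinting idea concrete, here is a deliberately simplified, hypothetical sketch in Python (ours, not WIRED's; real fingerprinters gather dozens more signals and run inside the browser):

```python
# Hypothetical illustration of fingerprinting: hashing device traits into a
# stable identifier. Real trackers collect far more signals than these four.
import hashlib

def fingerprint(ip: str, user_agent: str, screen: str, timezone: str) -> str:
    """Combine device traits into one stable ID; no cookie needs to be stored."""
    traits = "|".join([ip, user_agent, screen, timezone])
    return hashlib.sha256(traits.encode()).hexdigest()[:16]

# The same device yields the same ID on every site that computes it:
print(fingerprint("203.0.113.7", "Mozilla/5.0 (Macintosh)", "2560x1440", "UTC-5"))
```

Because the identifier is recomputed from the device itself, clearing your cookies does nothing to it; that is what makes this class of tracker more invasive.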
The next year, President Lyndon Johnson’s administration proposed merging hundreds of federal databases into one centralized National Data Bank. Congress, concerned about possible surveillance, pushed back and organized a Special Subcommittee on the Invasion of Privacy. Lawmakers worried the data bank, which would “pool statistics on millions of Americans,” could “possibly violate their secret lives,” The New York Times reported at the time. The project was never realized. Instead, Congress passed a series of laws governing the use of personal data, including the Fair Credit Reporting Act in 1970 and the Privacy Act in 1974. The regulations mandated transparency but did nothing to prevent the government and corporations from collecting information in the first place, argues technology historian Margaret O’Mara.
Toward the end of the 1960s, some scholars, including MIT political scientist Ithiel de Sola Pool, predicted that new computer technologies would continue to facilitate even more invasive personal data collection. The reality they envisioned began to take shape in the mid-1990s, when many Americans started using the internet. By the time most everyone was online, though, one of the first privacy battles over digital data brokers had already been fought: In 1990, Lotus Corporation and the credit bureau Equifax teamed up to create Lotus MarketPlace: Households, a CD-ROM marketing product that was advertised to contain names, income ranges, addresses, and other information about more than 120 million Americans. It quickly caused an uproar among privacy advocates on digital forums like Usenet; over 30,000 people contacted Lotus to opt out of the database. It was ultimately canceled before it was even released. But the scandal didn’t stop other companies from creating massive data sets of consumer information in the future.
Several years later, ads began permeating the web. In the beginning, online advertising remained largely anonymous. While you may have seen ads for skiing if you looked up winter sports, websites couldn’t connect you to your real identity. (HotWired.com, the online version of WIRED, was the first website to run a banner ad in 1994, as part of a campaign for AT&T.) Then, in 1999, digital ad giant DoubleClick ignited a privacy scandal when it tried to de-anonymize its ads by merging with the enormous data broker Abacus Direct.
Privacy groups argued that DoubleClick could have used personal information collected by the data broker to target ads based on people’s real names. They petitioned the Federal Trade Commission, arguing that the practice would amount to unlawful tracking. As a result, DoubleClick sold the firm at a loss in 2006, and the Network Advertising Initiative was created, a trade group that developed standards for online advertising, including requiring companies to notify users when their personal data is being collected.
But privacy advocates’ concerns eventually came true. In 2008, Google officially acquired DoubleClick, and in 2016 it revised its privacy policy to permit personally-identifiable web tracking. Before then, Google kept its DoubleClick browsing data separate from personal information it collected from services like Gmail. Today, Google and Facebook can target ads based on your name—exactly what people feared DoubleClick would do two decades ago. And that’s not all: Because most people carry tracking devices in their pockets in the form of smartphones, these companies, and many others, can also follow us wherever we go.
The Future of Personal Data Collection
Personal information is currently collected primarily through screens, when people use computers and smartphones. The coming years will bring the widespread adoption of new data-guzzling devices, like smart speakers, sensor-embedded clothing, and wearable health monitors. Even those who refrain from using these devices will likely have their data gathered, by things like facial recognition-enabled surveillance cameras installed on street corners. In many ways, this future has already begun: Taylor Swift fans have had their face data collected, and Amazon Echos are listening in on millions of homes.
We haven’t decided, though, how to navigate this new data-filled reality. Should colleges be permitted to digitally track their teenage applicants? Do we really want health insurance companies monitoring our Instagram posts? Governments, artists, academics, and citizens will think about these questions and plenty more.
And as scientists push the boundaries of what’s possible with artificial intelligence, we will also need to learn to make sense of personal data that isn’t even real, at least in that it didn’t come from humans. For example, algorithms are already generating “fake” data for other algorithms to train on. So-called deepfake technology allows propagandists and hoaxers to leverage social media photos to make videos depicting events that never happened. AI can now create millions of synthetic faces that don’t belong to anyone, altering the meaning of stolen identity. This fraudulent data could further distort social media and other parts of the internet. Imagine trying to discern whether a Tinder match or the person you followed on Instagram actually exists.
from WordPress https://ift.tt/3cSwztq via IFTTT
0 notes
kindlecomparedinfo · 5 years
Text
Will the quantum economy change your business?
Google and NASA have demonstrated that quantum computing isn’t just a fancy trick, but almost certainly something actually useful — and they’re already working on commercial applications. What does that mean for existing startups and businesses? Simply put: nothing. But that doesn’t mean you can ignore it forever.
There are three main points that anyone concerned with the possibility of quantum computing affecting their work should understand.
1. It’ll be a long time before anything really practical comes out of quantum computing.
Google showed that quantum computers are not only functional, but apparently scalable. But that doesn’t mean they’re scaling right now. And if they were, it doesn’t mean there’s anything useful you can do with them.
What makes quantum computing effective is that it’s completely different from classical computing — and that also makes creating the software and algorithms that run on it essentially a completely unexplored space.
Quantum computing’s ‘Hello World’ moment
There are theories, of course, and some elementary work on how to use these things to accomplish practical goals. But we are only just now arriving at the time when such theories can be tested at the most basic levels. The work that needs to happen isn’t so much “bringing to market” as “fundamental understanding.”
Although it’s tempting to equate the beginning of quantum computing to the beginning of digital computing, in reality they are very different. Classical computing, with its 1s and 0s and symbolic logic, actually maps readily on to human thought processes and ways of thinking about information — with a little abstraction, of course.
Quantum computing, on the other hand, is very different from how humans think about and interact with data. It doesn’t make intuitive sense, and not only because we haven’t developed the language for it. Our minds really just don’t work that way!
So although even I can now claim to have operated a quantum computer (technically true), there are remarkably few people in the world who can say they can do so deliberately in pursuit of a specific problem. That means progress will be slow (by tech industry standards) and very limited for years to come as the basics of this science are established and the ideas of code and data that we have held for decades are loosened.
2. Early applications will be incredibly domain-specific and not generalizable.
A common misunderstanding of quantum computing is that it amounts to extremely fast parallel processing. Now, if someone had invented a device that performed supercomputer-like operations faster than any actual supercomputer, that would be an entirely different development and, frankly, a much more useful one. But that isn’t the case.
As an engineer explained to me at Google’s lab, not only are quantum computers good at completely different things, they’re incredibly bad at the things classical computers do well. If you wanted to do arithmetical logic like addition and multiplication, it would be much better and faster to use an abacus.
Part of the excitement around quantum computing is learning which tasks a qubit-based system is actually good at. There are theories, but as mentioned before, they’re untested. It remains to be seen whether a given optimization problem or probability space navigation is really suitable for this type of computer at all.
What they are pretty sure about so far is that there are certain very specific tasks that quantum computers will trivialize — but it isn’t something general like “compression and decompression” or “sorting databases.” It’s things like evaluating a galaxy of molecules in all possible configurations and conformations to isolate high-probability interactions.
As you can imagine, that isn’t very useful for an enterprise security company. On the other hand, it could be utterly transformative for a pharmacology or materials company. Do you run one of those? Then in all likelihood, you are already investing in this kind of research and are well aware of the possibilities quantum brings to the table.
But the point is these applications will not only be very few in number, but difficult to conceptualize, prove, and execute. Unlike something like a machine learning agent, this isn’t a new approach that can easily be tested and iterated — it’s an entirely new discipline which people can only now truly begin to learn.
from RSSMix.com Mix ID 8176395 https://techcrunch.com/2019/10/28/will-the-quantum-economy-change-your-business/ via http://www.kindlecompared.com/kindle-comparison/
0 notes
sfpcschool · 7 years
Text
Real Talk: the Art of Being an Artist with Taeyoon Choi
Blog post by Melanie Hoff
On the final day of Code Narratives, SFPC's 2017 two-week summer session, SFPC co-founder Taeyoon Choi guided the class through his personal approach to poetic computation and opened a window into the often unnecessarily murky world of supporting oneself as an artist.
Poetic Computation
Taeyoon begins by inviting the class to consider deeply: at its most fundamental, what is a computer, really?
He suggests that the minimum viable computer performs three functions. That of a clock for automation, an abacus for calculations, and a notepad for storing records. In other words, 
“a computer tells time, it adds, and it remembers.”
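As a playful sketch of that definition (our code, not Taeyoon's), the three functions fit in a dozen lines of Python:

```python
# The "minimum viable computer": a clock, an abacus, and a notepad.
import time

class MinimumViableComputer:
    def __init__(self):
        self.memory = []              # the notepad: it remembers

    def tell_time(self) -> float:
        return time.time()            # the clock: it tells time

    def add(self, a: float, b: float) -> float:
        result = a + b                # the abacus: it adds
        self.memory.append(result)    # and it remembers what it computed
        return result

computer = MinimumViableComputer()
computer.add(1, 1)
print(computer.memory, computer.tell_time())
```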
“... but computers are not neutral objects for aesthetic contemplation. Their roots are in war machines, contested politics, and precarious lives.“
In his project, Handmade Computer, Taeyoon investigates whether a computer can become divorced from its history as a tool for war and as a product of capitalism. In becoming intimate with computing at its most basic level and in building a computer with his hands, Taeyoon has shifted his personal, even bodily, relationship to computers.
Can the heritage and narrative of computers be rewritten?
Taeyoon describes how everyday acts like checking Gmail and opening Facebook all become translated into 0s and 1s by our computers.
Real Talk: the Art of Being an Artist 
For the second part of Taeyoon’s class, students received comprehensive guidance, aided by clarifying examples on many key aspects of an artist’s life. The topics covered included how to apply to (and deal with rejection from) artist residencies and grants, getting paid as an artist, and developing your artist statement. The class materials are organized in this repository.
The economics of sustaining and growing an art practice is rarely talked about with such openness and generosity. 
For artists, writing is incredibly important. Not only for proposals, residencies, and grants, but for crafting a narrative around yourself as an artist. Almost no one can write about your work better than you. 
Done well, artist’s writing reaches far beyond describing work. It expands it. 
To practice this, Taeyoon facilitated an in-class writing exercise where each student was asked to pick one of their projects and write about it using the following framework:
Write about one of your projects in 200 words.
What do you see, read, hear from your piece? Describe the experience of the work.
How is it made? Describe the production. What is the material, tool, and the process? Is it relevant and important for the piece?
What did you try to achieve with it? What is the message or question, if any? If there's no message, what was the impulse that drove you to make this?
The very next day was the summer salon event where all the students presented their work to each other along with a packed house full of SFPC alumni and friends. It was amazing to see how the ideas and skills from each of the five Code Narratives classes were synthesized in distinct and poetic ways for each student. 
In two short weeks the students had performed arithmetic with colors, collaborated with a Markov chain, pondered the way legal codes intersect with algorithms, generated melodies from tweets, and learned what it takes to get paid for your art.
It's been a great summer with wonderful people here at SFPC. See you in the Fall!
7 notes
eliashiebert · 8 years
Text
Thony Christie on ancient number systems:
The Babylonian sexagesimal system is the reason why we have sixty minutes in an hour, sixty seconds in a minute, sixty minutes in a degree and so forth. It is not, however, contrary to a widespread belief, the reason for the three hundred and sixty degrees in a circle; this comes from the Egyptian solar year of twelve thirty-day months projected onto the ecliptic, a division that the Babylonians then took over from the Egyptians.
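As an aside (our addition, not Thony's), unpacking a number into base-60 digits shows how directly our clock arithmetic inherits the Babylonian system; a minimal sketch:

```python
# Our sketch: expressing a base-10 number in Babylonian base-60 "digits".
def to_sexagesimal(n: int) -> list[int]:
    """Return the base-60 digits of n, most significant first."""
    digits = []
    while n:
        digits.append(n % 60)
        n //= 60
    return digits[::-1] or [0]

# 4625 seconds -> [1, 17, 5], i.e. 1 hour, 17 minutes, 5 seconds.
print(to_sexagesimal(4625))
```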
[ . . . ]
Of interest: the well-known IV instead of IIII for four was first introduced in the Middle Ages.
[ . . . ]
When compared with the Hindu-Arabic number system the Greek and Roman systems seem to be cumbersome and the implied sneer in Professor Evans’ tweet seems justified. However there are two important points that have to be taken into consideration before forming a judgement about the relative merits of the systems. Firstly up till the Early Modern period almost all arithmetic was carried out using a counting-board or abacus, which with its columns for the counters is basically a physical representation of a place value number system. […] A skilful counting-board operator can not only add and subtract but can also multiply and divide and even extract square roots using his board, so he has no need to do written calculation. He just needs to record the final results. The Romans even had a small hand abacus or, as we would say, a pocket calculator. The words to calculate, calculus and calculator all come from the Latin calculi, which were the small pebbles used as counters on the counting board. In antiquity it was also common practice to create a counting-board in a sand tray by simply making parallel grooves in the sand with one’s fingers. […] Moving away from the counting-board to written calculations it would at first appear that Professor Evans is correct and that multiplication and division are both much simpler with our Hindu-Arabic number system than with the Roman one, but this is because we are guilty of presentism. In order to do long multiplication or long division we use algorithms that most of us spent a long time learning, often rather painfully, in primary school, and we assume that one would use the same algorithms to carry out the same tasks with Roman numerals; one wouldn’t.
[ . . . ]
How do we [multiply] CXXV times XXXVII? The algorithm we use comes from the Papyrus Rhind, an ancient Egyptian maths textbook dating from around 1650 BCE, and is now known as halving and doubling because that is literally all one does. The Egyptian number system is basically the same as the Roman one, strokes and bundles, with different symbols. We set up our numbers in two columns. The left hand number is continually halved down to one, simply ignoring remainders of one, and the right hand number is continually doubled.
1. XXXVII → CXXV
2. XVIII → CCXXXXVV = CCL
3. VIIII → CCCCLL = CCCCC = D
4. IIII → DD = M
5. II → MM
6. I → MMMM
You now add the results from the right hand column, leaving out those where the number on the left is even, i.e. rows 2, 4 and 5. So we have CXXV + D + MMMM = MMMMDCXXV. All we need to carry out the multiplication is the ability to multiply and divide by two! Somewhat simpler than the same operation in the Hindu-Arabic number system!
Division works by an analogous algorithm. So now to divide 4625 by 125 or MMMMDCXXV by CXXV
1. I → CXXV
2. II → CCXXXXVV = CCL
3. IIII → CCCCLL = CCCCC = D
4. IIIIIIII = VIII → DD = M
5. VVIIIIII = XVI → MM
6. XXVVII = XXXII → MMMM
We start with 1 on the left and 125 on the right and keep doubling both until we reach a number on the right that, when doubled, would be greater than MMMMDCXXV. We then add up the numbers on the left whose counterparts on the right sum to MMMMDCXXV, i.e. rows 1, 3 and 6, giving us I+IIII+XXXII = XXXIIIIIII = XXXVII or 37.
[ . . . ]
Interestingly the ancient Egyptian halving and doubling algorithms for multiplication and division are, in somewhat modified form, how modern computers carry out these arithmetical operations.
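For readers who prefer code to counting boards, here is a short sketch (our addition) of both halving-and-doubling procedures; it reproduces the worked examples above:

```python
# The Rhind Papyrus algorithms: multiply and divide by halving and doubling.

def multiply(a: int, b: int) -> int:
    """Halve a and double b; sum the b-values from rows where a was odd."""
    total = 0
    while a >= 1:
        if a % 2 == 1:     # odd left-hand rows are the ones we keep
            total += b
        a //= 2            # halve, ignoring remainders (as on the papyrus)
        b *= 2             # double
    return total

def divide(n: int, d: int) -> int:
    """Double d (with a row counter) up toward n, then take the rows that fit."""
    rows = [(1, d)]
    while rows[-1][1] * 2 <= n:
        count, value = rows[-1]
        rows.append((count * 2, value * 2))
    quotient = 0
    for count, value in reversed(rows):
        if value <= n:
            n -= value
            quotient += count
    return quotient        # n now holds the remainder

print(multiply(37, 125))   # 4625, matching XXXVII times CXXV
print(divide(4625, 125))   # 37, matching MMMMDCXXV divided by CXXV
```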
…And from the comments:
viktorblasjo:
Why, then, did Hindu-Arabic numerals overtake the older systems in Europe, in a culture very well versed in abacus methods?
Thony C.:
The answer is in the post, if somewhat hidden. It's when people started writing numbers extensively, i.e. with the introduction of double-entry bookkeeping. If we take a relatively simple number like 1888, in Roman numerals it becomes MDCCCLXXXVIII. The first advantage of Hindu-Arabic numerals becomes immediately obvious.
Also, as indirectly pointed out by Brian Clegg in his snarky comment, you can't do decimal fractions in Roman numerals, which is why my version of Richard Evans' challenge ends with 'remainder 16'. This is also the reason why astronomers retained the Babylonian sexagesimal number system up to the sixteenth/seventeenth century.
Even with the Hindu-Arabic number system the Europeans didn’t really work out how to do decimal fractions until the early seventeenth century.
1 note
craigbrownphd · 5 years
Text
Fresh from the Python Package Index
• shamu: Launches interactive dockers
• shapeguard: ShapeGuard is a tool to help with handling shapes in Tensorflow.
• silk-ml: Simple Intelligent Learning Kit (SILK) for Machine learning. In the area of machine learning and data science, the most relevant is data management and …
• sktime-dl: Deep learning extension package for sktime, a scikit-learn compatible toolbox for learning with time series data
• squyrrel: Python library for parsing and combining different data resources
• tf2onnx-xzj: Tensorflow to ONNX converter
• umap-learn-modified: Forked from umap-learn (https://…/umap ). Change API so that UMAP accepts precomputed KNN …
• vtorch: NLP research library, built on PyTorch. Based on AllenNLP.
• zocalo-dls: Standard components for automated data processing with Zocalo at Diamond Light Source
• PyTorch: The package named for PyTorch is ‘torch’.
• Torch: Tensors and Dynamic neural networks in Python with strong GPU acceleration
• adanet: adanet is a lightweight and scalable TensorFlow AutoML framework for training and deploying adaptive neural networks using the AdaNet algorithm [Cortes …
• abacus-tpot: TPOT’s analysis library
• AAdeepLearning: AAdeepLearning is a deep learning frame
• awsaccountmgr: A command line tool for managing accounts within an AWS organization. Easy to integrate into AWS Deployment Framework
http://bit.ly/2M6NvBW
0 notes
un-enfant-immature · 4 years
Text
RealityEngines.AI becomes Abacus.AI and raises $13M Series A
RealityEngines.AI, the machine learning startup co-founded by former AWS and Google exec Bindu Reddy, today announced that it is rebranding as Abacus.AI and launching its autonomous AI service into general availability.
In addition, the company also today disclosed that it has raised a $13 million Series A round led by Index Ventures’ Mike Volpi, who will also join the company’s board. Seed investors Eric Schmidt, Jerry Yang and Ram Shriram also participated in this oversubscribed round, with Shriram also joining the company’s board. New investors include Mariam Naficy, Erica Shultz, Neha Narkhede, Xuezhao Lan and Jeannette Furstenberg. 
This new round brings the company’s total funding to $18.25 million.
Abacus.AI’s co-founders Bindu Reddy, Arvind Sundararajan and Siddartha Naidu (Image Credits: Abacus.AI)
At its core, Abacus.AI's mission is to help businesses implement modern deep learning systems into their customer experience and business processes without having to do the heavy lifting of learning how to train models themselves. Instead, Abacus takes care of the data pipelines and model training for them.
The company worked with 1,200 beta testers and in recent months, the team mostly focused on not just helping businesses build their models but also put them into production. Current Abacus.AI customers include 1-800-Flowers, Flex, DailyLook and Prodege.
“My guess would be that out of the hundred projects which are started in ML, one percent succeeds because of so many moving parts,” Reddy told me. “You have to build the model, then you have to test it in production — and then you have to build data pipelines and have to put in training pipelines. So over the last few weeks even, we’ve added a whole bunch of features to enable putting these things to go into production more smoothly — and we continue to add to it.”
In recent months, the team also added new unsupervised learning tools to its lineup of pre-built solutions to help users build systems for anomaly detection around transaction fraud and account takeovers, for example.
The company also today released new tools for debiasing data sets that can be used on already trained algorithms. Automatically building training sets — even with relatively small data sets — is one of the areas on which the Abacus team has long focused, and it is now using some of these same techniques to tackle this problem. In its experiments, the company’s facial recognition algorithm was able to greatly improve its ability to detect whether a Black celebrity was smiling or not, for example, even though the training data set featured 22 times more white people.
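The article does not describe Abacus.AI's actual method, so as a generic illustration only, one classic debiasing approach is to reweight samples so that an overrepresented group stops dominating the training loss; a minimal sketch:

```python
# Generic sketch of dataset debiasing by reweighting (illustrative only;
# not Abacus.AI's technique). Each sample gets a weight inversely
# proportional to its group's frequency.
from collections import Counter

def balanced_weights(groups: list[str]) -> list[float]:
    """Weight samples so every group contributes equally overall."""
    counts = Counter(groups)
    n, k = len(groups), len(counts)
    return [n / (k * counts[g]) for g in groups]

# 22 white samples per Black sample, echoing the article's example ratio:
labels = ["white"] * 22 + ["black"]
weights = balanced_weights(labels)
print(round(weights[0], 2), round(weights[-1], 2))  # ~0.52 vs 11.5
```

With these weights, both groups carry equal total weight in training, which is one simple way to keep a 22-to-1 imbalance from being baked into the model.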
With today’s launch, Abacus is also launching a new section on its website to showcase models from its community. “You can go build a model, tweak your model if you want, use your own data sets — and then you can actually share the model with the community,” Reddy explained, and noted that this is now possible because of Abacus’ new pricing model. The company has decided to only charge customers when they put models into production.
The next major item on the Abacus roadmap is to build more connectors to third-party systems so that users can easily import data from Salesforce and Segment, for example. In addition, Reddy notes that the team will build out more of its pre-built solutions, including more work on language understanding and vision use cases.
To do this, Abacus has already hired more research scientists to work on some of its fundamental research projects, something Reddy says its funders are quite comfortable with, and more engineers to put that work into practice. She expects the team will grow from 22 employees today to about 35 by the end of the year.
0 notes
Text
Between lectures, presentations, coursework assignments and photography, it's been impossible to keep up with blogging. This post has been due since November, but I just couldn't find the time to put it all together for you. I'm here now.
Computers and information technology have become an integral part of our lives in the digital age. Just about everything we do, have or enjoy has been defined and made possible by computers, but the history of these powerful resources and their influence on politics and the rest of the world hasn't gotten the attention it deserves. Even more understated is the importance and power of intellectual research endeavours, and the fundamental difference that glasses-wearing, library-loving, coffee-guzzling academics have made to historical events.
The National Museum of Computing tells this vivid story by collecting, restoring, and displaying historic computer systems. Located in Bletchley Park in Milton Keynes, Buckinghamshire (two hours from the University of Reading Whiteknights Campus) the National Museum of Computing opened in 2007 and hosts a variety of old technologies, from arm-size satellite phones to memory discs the size of a coffee table.
The Colossus was the first programmable electronic computer. It was intriguing primarily because it looks nothing like the computers we have today, lacking a screen and keyboard. It was used to help decipher encrypted radio messages from the Germans during World War II. It was interesting to realise that the war wasn't won by brute force and violence alone; hardworking researchers contributed by developing information systems that saved lives and eventually helped bring an end to Hitler's war.
For insight on this, you should see The Imitation Game, a brilliant movie detailing how the British intelligence agency recruited Cambridge University mathematician and logician Alan Turing to crack Nazi codes which were thought to be unbreakable. If you haven't, you can watch the trailer here.
(YouTube embed: The Imitation Game trailer)
I love this movie, so you can imagine how awestruck I was just to be in the same room as the Turing-Welchman Bombe, the one that changed the history of the world. The Bombe is a massive and incredibly complex device covered with what seems like hundreds of knobs, used by cryptologists to decipher secret German messages. I was so glad cameras were allowed, and I couldn't stop shooting. The lighting in the museum was limited, so I didn't come out with the best shots or angles.
The Lorenz machines which the German Army used to encrypt messages during World War II were also on display. In my head, I could see German soldiers stabbing away at these keys and British academics rubbing their chins and waiting for the Bombe to come through. The feeling of somehow being part of that was immense.
Lorenz Machine
World War II.
I also got to see the WITCH! The Wolverhampton Instrument for Teaching Computing from Harwell (WITCH) is the world's oldest working digital computer! I could tell you all about it, but I think it will be more engaging to just watch this video.
(YouTube embed: the WITCH in operation)
Wolverhampton Instrument for Teaching Computing from Harwell (WITCH)
Harwell Dekatron Computer
The museum tour was a lot like walking through a time capsule and watching technology evolve right before our eyes. We got to compare a vintage telephone with our touchscreen smartphones, and it was amazing how such a bulky rudimentary device has evolved to become small, sleek, versatile, and infinitely faster.
Old telephone
Abacus
Film tape
Old Hard Disk
We got to experience first-hand the challenges associated with programming in the 1980s using an early home computer, the 8-bit BBC Micro. These computers, although more familiar than the Bombe or Colossus, were unbelievably slow and did not have a backspace button; consequently, if you made a mistake you had to start over. Afterwards, we used current laptops to toy with the Turing test, developed by Alan Turing in 1950. The Turing test evaluates a machine's ability to show intelligence comparable to that of a human being and is the basis of Artificial Intelligence (AI) today. It was heartbreaking to learn that Turing was prosecuted for being gay, despite his contributions to his country and to the world through his work in AI. I don't see how being gay could be a punishable offence, but we will have this conversation in another blog post.
My snake game
Old IBM computers I believe
Our last stop was the 'Women in Computing' gallery, which became an easy favourite. It recognized women's contributions to the male-dominated field of information technology. Women were among the first programmers in history and made noteworthy contributions to the field, with Ada Lovelace designing the first algorithm intended to be run by a machine. As a woman in tech, I found this gallery deeply motivating and reassuring of the fact that I have a place here; perhaps not in a museum, but I too can make a difference through research and academia.
Women in Computing
Want more? Watch a tour below:
(YouTube embed: a tour of the museum)
In summary, it was a fascinating experience. If you are into computers and digital technology, I absolutely recommend you pay this museum a visit. Seeing what over 50 years of innovation in information technology looks like made me realise that current computing technology will be considered obsolete in a few decades, and the coming generations will wonder how we put up with it. I was so inspired; it's the first time I've had the desire to live forever just so I can see what wonders the great minds of this world will come up with, and how these advancements will shape our lives. It was a strong reminder that there are no superheroes, just ordinary, hardworking people trying to make a difference in the world.
Have you been to the National Museum of Computing? What was your favourite find? Tell me about it in the comments!
Touring the National Museum of Computing, Bletchley, England
Between lectures, presentations, coursework assignments and photography, it's been impossible to keep up with blogging. This post has been due since November but I just couldn't find the time to put it all together for you. I'm here now, though.
0 notes
zipgrowth · 6 years
Text
Judge, Jury and Education Startups: Reflections From the SXSW EDU Launch Competition
All the world’s a stage when you’re a startup, and life becomes a pitch in front of investors, advisers, reporters, partners and potential acquirers.
Those hats were represented in the panel of judges at this week’s SXSW EDU Launch Competition, which included Bridget Burns, executive director of the University Innovation Alliance; Vince Chan, co-founder at Creta Ventures; Jonathan Rochelle, product management director at Google for Education; and your correspondent.
The annual tradition dates back to 2012, and features early-stage companies showing off the latest efforts to solve intractable problems across the education landscape. Eight startups took the stage this year.
Over the years I have attended dozens of demo days and exhibitions, and slogged through what already feels like a lifetime of pitches in my inbox. Yet this was my first time participating as a judge, a role that forced me to pay extra close attention.
Some things were once simpler: Edtech pitches used to concentrate on either the K-12 or higher education market. Others, like the repackaging of buzzwords and branding tactics, stay the same. And ageless problems, like helping kids read and pay for college, remain in need of solutions.
Here are my reflections from playing edtech judge for a morning.
Education Is Encompassing Everything
“So where’s the education in this?”
That’s the question that Google’s Rochelle posed to Chelsea Sprayregen, CEO of Pie for Providers—and not because the startup’s name sounds like food. The company offers a suite of administrative software to help daycare providers execute a variety of tasks, from managing government subsidies to financial bookkeeping.
It was an honest, earnest question—but one that raised broader questions about whether there’s a meaningful distinction between childcare and education. Certainly they overlap; few would deny that childcare services make an impact on a child’s development.
It’s a gray area that education funders, including government agencies and venture capitalists, are increasingly dabbling in. Investors at another SXSW EDU session noted that, since 2016, government spending on these programs has increased 17 percent, and private funding has risen 12 percent.
Questions similar to Rochelle’s surface in my mind when it comes to hiring and recruiting tools. That’s what UpKey pitched—a service that helps students build stronger resumes and “connects them to employers looking for students with grit,” according to its flyer. The emergence of these tools in “edtech” reflects the belief that perhaps the most tangible and important benefit of education may be to get a good job.
AI Is the New ‘Adaptive’
“Adaptive” and “personalized” were once ubiquitous in pitches and press releases. The new magic these days is artificial intelligence, or AI. Call it what you want. But rare is the company that takes care to explain how those technologies actually work in their products.
The first question I asked went to Mark Angel, CEO of Amira Learning, about how the AI works in his company’s reading-assistant tool. While it’s unrealistic to dive into details about algorithms and machine learning within the 5 minutes allowed for Q&A, he did at least articulate a mechanism for how the tool collects and labels data, and uses that data to continually train and fine-tune its system. (I often refer to this piece as a primer of questions to ask about AI.)
Another “AI-powered” startup present was ROYBI, which is developing a machine learning-powered robot companion for children. The company claimed it is capable of a wide range of things, from conversing with toddlers and teaching languages and “STEAM” concepts, to being able to react to their emotions and send progress reports to parents. It was a long list of checkboxes for a product that’s still in development.
Not All Ideas Translate
Ideas that sound like viable businesses in one part of the world may not translate as well to others. That was one of the challenges that emerged in the pitch for SoroTouch, which offers an app and also runs tutoring centers that teach the abacus method to do calculations. (The method is apparently very effective.) According to the company’s founder, the tool is popular among the many cram schools that dot Japan, where the company is based.
But in front of a panel of U.S. judges, the value of calculations and cram schools seemed to miss the mark. As a Kumon alumnus, I certainly appreciate all the practical use cases for quick mental calculations (especially when it comes time to split the dinner bill). When it comes to math education, however, the trend is focusing on deeper conceptual understanding and application to real-world problems. As math reformer Conrad Wolfram suggests, why not let computers handle the computations?
The ‘X’ for ‘Y’
“Netflix for education.” “Uber for tutors.” It’s a common and catchy marketing tactic to align one’s service with a popular brand. It can be risky as well. “Facebook for education” just doesn’t evoke the same fuzzy feelings as it used to.
That pitching tactic remains alive and well. At the competition was Caribu, which billed itself as a “FaceTime meets podcast” service that aims to help connect parents and children via interactive video calls for reading and drawing activities. Think live screen-sharing with digital books that one can also doodle on. There was also Giide, which its founders described as podcasts for professional learning. “Learning must be reshaped to fit our lifestyle,” so goes the company’s flyer, which presumably means listening to a lot of bite-sized audio lessons on the go.
Still Trying to Afford College
Money talks, and when it comes to the cost of higher education, the issue still screams for attention and solutions. Edmit, which provides tools to help students and families research higher-ed costs and find financial aid opportunities, won the Launch competition. The startup claims it can provide more accurate cost estimates based on personal, geographical and publicly available data sets. It all sounded enticing enough to get Bridget Burns to ask: “Why hasn’t the College Board acquired you?”
0 notes
linabrigette · 6 years
Text
Verge’s Blockchain Attacks Are Worth a Sober Second Look
The notorious 51-percent attack: it’s the fundamental weakness of proof-of-work cryptocurrency protocols, yet it’s rarely seen in practice, especially among the most popular cryptocurrencies.
Yet, in the past couple of months, the exploit – whereby a single miner (or group of miners) takes control of over half of the network’s total computing power and can then bend the protocol’s rules in their favor – has been seen twice. And on the same blockchain.
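To see why the 50-percent threshold matters, here is a toy simulation. It is my own illustration with made-up parameters, not any coin’s actual code: whoever finds each block is chosen in proportion to hashrate, and once an attacker holds a majority, their private chain almost always outgrows the honest one.

```python
import random

# Toy model: each "tick", the next block goes to the attacker with
# probability equal to their share of the total hashrate.
def attacker_leads(share, blocks=1000, trials=200):
    """Fraction of trials in which the attacker's chain ends up longer."""
    wins = 0
    for _ in range(trials):
        attacker = sum(random.random() < share for _ in range(blocks))
        honest = blocks - attacker
        wins += attacker > honest
    return wins / trials

for share in (0.30, 0.45, 0.51, 0.60):
    print(f"{share:.0%} of hashrate -> longer chain in {attacker_leads(share):.0%} of trials")
```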
Indeed, verge, a privacy-oriented cryptocurrency recently propelled into the limelight by a partnership with popular adult entertainment site Pornhub, suffered two hacks perpetrated through 51-percent attacks, which saw the attackers absconding with millions of dollars’ worth of its native cryptocurrency, XVG.
During the first attack in April (only a couple of weeks before the Pornhub partnership), the hacker got away with 250,000 XVG. And during the latest, in mid-May, an attacker extracted $1.7 million worth of the cryptocurrency from the protocol.
According to researchers, the exploits are a product of seemingly simple changes to the underlying code on which cryptocurrency protocols are typically built, and of how hard it is to predict the unintended consequences of those changes.
Sure, verge developers were only trying to design a better cryptocurrency for payments, but by tweaking small parameters, such as the length of time a block can be valid, the group has opened its blockchain up to attacks.
“Getting incentives right and keeping them right is hard,” Imperial College London assistant professor and Liquidity Network founder Arthur Gervais said.
That is, blockchains are built on precariously stacked incentives whereby all stakeholders work together toward a common goal, so as to remove the chance that any one entity takes full control.
“Things obviously don’t look good,” said Daniel Goldman, the CTO of cryptocurrency analysis site The Abacus, who has been tracking the attacks. “The issues that initially slipped into the codebase were a result of pure carelessness — incorporating code from other open-source software without understanding its implications.”
Goldman added:
“I hate to say it, but if I had to summarize: the attacker is doing better due diligence than the developers. I’d try to poach him if I were them.”
And since veteran blockchain developers, including litecoin creator Charlie Lee and monero lead developer Riccardo Spagni, have long argued that the kinds of adjustments the platform made have obvious downsides, such naysayers – who have been readily attacked by a group of enthusiasts calling themselves the “Verge Army” – are feeling vindicated.
“So many important lessons to be learned from this,” Fidelity investment research analyst Nic Carter tweeted, summing up the general state of verge’s development.
Representatives from the verge developer team did not respond to a request for comment from BTC News Today.
The problem
One of those lessons is that there are reasons why the window of time that a transaction can be valid is limited quite strictly.
For instance, whereas bitcoin transactions are only valid for about 10 minutes before they’re verified in a block, verge developers extended that window to two hours. And because there is some information asymmetry in blockchain systems, since nodes are spread out across the globe, the attacker was able to “spoof” timestamps tied to blocks without some nodes noticing, according to the widely-circulated post by Goldman.
But it wasn’t just that; another piece of the attacks was verge’s difficulty algorithm.
Verge uses an algorithm called “Dark Gravity Wave” to automatically adjust how fast miners find blocks. In verge, this happens every two hours; compared to bitcoin, which adjusts every two weeks, verge’s algorithm is quite fast.
The spoofed timestamps paired with this fast-adjusting algorithm led to the problem of “tragically confusing the protocol’s mining adjustment algorithm,” as Goldman put it.
Said another way, the attacker cleverly mined blocks with fake timestamps, forcing the cryptocurrency’s difficulty to adjust downward more quickly and making it easier to mine even more XVG.
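A minimal sketch makes the mechanics concrete. The retargeting rule and all numbers below are illustrative assumptions, not Verge’s actual Dark Gravity Wave implementation, but they show how timestamps that make blocks appear slower than they really were can drag difficulty down.

```python
# Illustrative retargeting rule; target spacing and window size are assumed.
TARGET_SPACING = 30  # desired seconds between blocks (assumed)
WINDOW = 12          # number of recent blocks averaged over (assumed)

def retarget(difficulty, timestamps):
    """Scale difficulty by target spacing / observed average spacing."""
    spacings = [b - a for a, b in zip(timestamps, timestamps[1:])]
    avg_spacing = sum(spacings) / len(spacings)
    # Blocks that *look* slow pull difficulty down; fast ones push it up.
    return max(difficulty * TARGET_SPACING / avg_spacing, 1.0)

# Honest window: blocks arrive roughly on schedule.
honest = [i * TARGET_SPACING for i in range(WINDOW + 1)]

# Attacker window: same number of blocks, but the reported timestamps are
# spread out so each block *claims* to have taken ten times longer.
spoofed = [i * TARGET_SPACING * 10 for i in range(WINDOW + 1)]

print(retarget(1000.0, honest))   # ~1000.0: difficulty holds steady
print(retarget(1000.0, spoofed))  # ~100.0: difficulty collapses tenfold
```

The honest window leaves difficulty unchanged, while the spoofed one cuts it by a factor of ten, and an attacker can repeat the trick every adjustment period.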
When the first attack happened, verge developers quickly released a patch, stopping the attacker from printing more money. Yet with last month’s attack, it seems the patch only went so far: the attacker found another way to execute the same hack, demonstrating how difficult it can be to architect a distributed system that isn’t vulnerable to attack.
Continuing attacks
And according to Goldman, the issues for verge are likely not over.
“An attack clearly was – and maybe still is – being attempted. So far, however, the would-be attacker hasn’t managed to overtake the network,” Goldman told BTC News Today.
But he continued:
“As it stands now, two of the three (in my opinion) fundamental sources of vulnerabilities have been mitigated at best, and one remains completely unfixed.”
While no XVG were stolen directly from users, miners on the network aren’t supposed to be able to bend the rules like this, effectively printing money for one individual in a short period of time.
As such, verge developers are actively working on improving the code. After a period of little communication from verge’s developers, CryptoRekt, the pseudonymous author of the verge “blackpaper,” took to Reddit on May 31, saying that the verge team would “never intentionally do anything to besmirch or hurt this project.”
He added that the project’s developers have been working on new code for “several weeks” to “solidify our currency against any future attacks.”
Yet, Goldman believes there’s another problem. Unlike many of the cryptocurrency projects out there today, which rely on open-source code, verge’s codebase is being constructed in private and so will not get peer-reviewed by the community of blockchain experts that could help the team find vulnerabilities.
“Since incorporating code without responsibly vetting it was the thing that led to all this, this should make the vergefam nervous,” he tweeted.
Verge’s future?
But so far, much of the verge community remains supportive of the developer team and the cryptocurrency’s mission.
Pseudonymous verge user Crypto Dog went as far as to claim that “there is no need to panic,” contending that verge’s success will continue no matter what. And CryptoRekt chose to see it as a learning experience, one that would help verge “build a bigger and better project.”
Still, this attack reflects poorly not only on verge itself, but also on the organizations that have partnered with the verge team, Pornhub included. Especially since Pornhub’s vice president Corey Price stated that verge was chosen as a payment method for the site in a “very deliberate selection process” to preserve the financial privacy of its customers.
As such, some developers believe this episode will bring about a heightened sense of responsibility for many organizations to more effectively analyze a blockchain before adopting it.
“I wouldn’t be surprised by more scrutiny in the near future, both leading to more attacks and to investors more accurately rating the value proposition of smaller altcoin projects,” BitGo engineer Mark Erhardt said, adding:
“The absence of an attack is not proof that a system is safe. Quite a few altcoin projects appear to be taking unsafe shortcuts. It’s just that nobody has bothered to exploit these systemic flaws or weaknesses, yet.”
As such, verge might be the first in a long line of future exploits.
While 51-percent attacks have typically been viewed as hard to execute, Liquidity Network’s Gervais argued that new data appears to show that it’s easier than many previously thought. He pointed to a new web app, 51crypto, which tracks how profitable it is to execute a 51-percent attack on various blockchains.
The gist of the statistics is that the smaller the blockchain, the easier it is to overtake it and bend the rules, which is why developers need to be particularly careful in how they architect their systems.
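As a back-of-the-envelope illustration of that relationship (all figures below are hypothetical, not 51crypto’s data or methodology): the cost of a one-hour majority attack is roughly the price of renting just over half the network’s hashrate for that hour, so it scales directly with network size.

```python
# Hypothetical figures only; this mirrors the kind of arithmetic such
# attack-cost trackers present, not their actual data.
def hourly_attack_cost(network_hashrate_ghs, rent_usd_per_ghs_hour):
    """Rough rental cost of controlling 51% of the network for one hour."""
    hashrate_needed = network_hashrate_ghs * 0.51
    return hashrate_needed * rent_usd_per_ghs_hour

# A made-up small chain vs. a made-up large one, at the same rental price:
print(hourly_attack_cost(5_000, 0.02))        # small chain: $51 per hour
print(hourly_attack_cost(50_000_000, 0.02))   # large chain: $510,000 per hour
```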
Because “if an attack makes more economic sense over honest behavior, the attackers will be there,” Gervais concluded.
Verge image via Shutterstock
The leader in blockchain news, BTC News Today is a media outlet that strives for the highest journalistic standards and abides by a strict set of editorial policies. BTC News Today is an independent operating subsidiary of Digital Currency Group, which invests in cryptocurrencies and blockchain startups.
0 notes