Text
Me: I would really like to begin writing these specific stories from my idea list.
Me, after like one day of serious brainstorming: What if instead you worked on every single other idea you've ever had?
#the trouble with trying to work on longer stories is the commitment factor#if i start on a short story i'm committing to one day to about a week#a longer work is several weeks#so the minute i try to commit my brain tells me a million other ideas will be more interesting or easier#at the start of the month my main contenders were one arateph idea and the lost library cinderella#but now i keep shying away from those#a goose girl idea was coming forward as a possibility#but i keep wavering between making it a short or a novella#and now two separate retellings of that tale are fighting for focus#as i can't decide which details to use from each one#and this weekend every single *other* arateph retelling idea i've had came forward#with new ideas to make them more easily writable#and i'm sure that every one will fade before i get a chance to write a word#i guess all i can do is go along for the ride#adventures in writing
Text
ColorMax Printing Produces Plastic Gift Cards and Custom Gift Cards With Advanced Technology
Plastic gift cards are an expanding business that's getting bigger every year. More than two-thirds of consumers use the cards, fueling a market that grew to $250 billion in 2015. The segment is growing well beyond holiday use, and with so many customers using the cards these days, more stores have turned to gift card printing as a popular way to entice customers.
Custom gift cards play an indispensable role in the shopping experience of today’s consumers and are one of the fastest growing segments of the gift industry. Offering a plastic gift card gives your customers an easy choice when they need a gift and guarantees that those who receive your gift card will patronize your business and become potential repeat customers.
Increase customer loyalty with renewable balance plastic gift cards.
Custom gift cards are printed with a magnetic stripe or bar code on which you apply a dollar value. You control the value of the card electronically, generating a renewable balance and creating opportunities for repeat shopping, increased customer loyalty, in-store promotions and cross-selling. You also benefit from potential impulse buying and up-selling, since your customers will be inclined to apply the gift card value toward higher-priced merchandise. Best of all, your electronically controlled gift card means no cash refunds on the balance.
Don’t miss out on the expanding gift card trend.
Plastic gift cards are not just for the big retailers anymore. Since most shoppers are now accustomed to the convenience of gift cards, they expect all merchants and business owners to offer them. ColorMax Printing provides custom plastic gift cards to a wide range of businesses, large and small, including: grocery, pet and gift stores; boutiques and specialty shops; florists and garden shops; automotive, oil changers and car washes; dentists and chiropractors; health clubs and golf courses; clothing and jewelry stores and beauty shops; restaurants and night clubs; and equipment rental stores.
Plastic gift cards are among the most popular holiday and special-occasion purchases each year. The reason: convenience. They're easy to buy and easy to redeem; you may well have several gift cards in your pocket right now. With so many customers using gift cards these days, more retailers have turned to plastic gift cards as a popular way to attract clients.
Gift card printing
ColorMaxPrinting is a card manufacturer that produces plastic gift cards in small or large quantities. By printing gift cards directly with us, you can save up to 50% compared to a local print shop. Affordable custom gift cards with excellent quality and service: that's what we offer. Simply send us the details of your colors and any graphics and we'll do the rest.
The advantages of implementing gift cards:
For undecided customers, it is easier to give a gift card and let the recipient choose the best gift for themselves.
When a product is returned, the refund can be issued to the gift card, so the customer comes back to shop again.
If the gift card's validity period passes and the customer has not used all (or part) of the value for which the card was purchased, that amount is extra profit for the store.
Gift cards are useful for contests in which the prize will prompt the customer to visit the store.
A gift card can contain different elements, most often a magnetic stripe, a barcode and/or an embossed number. Some of these options may require additional devices: computers, readers, printers.
A gift card does not have to be boring! Our offer includes many kinds of embellishments, like spot UV varnish or a glittering background.
Die Cut Gift Cards
Choose from hundreds of custom-shaped die cut gift cards in our current portfolio. If you don't see a shape you like, we'll create a custom shape just for you.
Restaurant Gift Cards
Create personalized plastic gift cards that your customers will adore for your restaurant, brewery or hospitality business.
Spa Gift Cards
Allow your customers to give the gift of relaxation with our unique and custom shaped salon and spa gift cards.
Retail Gift Cards
Create a gift card perfect for any retail business with our wide array of sizes, shapes and color choices. You can also add a magnetic stripe or barcode to make your gift cards easily scannable, or enhance them with a writable finish.
Movie Theater Gift Cards
Turn your movie theater gift card into a marketing tool by designing it in the shape of your logo, a movie reel, a box of popcorn, 3D glasses or other shapes that are fun and meaningful to your business.
Custom Gift Card Design and Gift Card Printing
Our graphics team can include your existing logo or create personalized gift cards and a fresh new look branded just for your business!
Variable Data Printing
Different locations with different printing needs? No problem: our team and equipment can tailor to your needs. Talk to us today about how you can leverage customer data to save time and money!
Gift Card Accessories
Gift Card Holders
Gift Card Envelopes
Custom or Generic Backers
Card Display Sleeves
Card Display Stands
View Accessories
Specialty Upgrades
We are leaders in innovative shapes, designs and customization making differentiating yourself from your competition simple, easy and fun.
Die Cut Shapes
Variable Printing
QR Code
Barcode
Magnetic Strips
Encoding
Writable Finish
Premium Upgrades
Our premium features can be added to standard gift card sizes only, and they can be combined with variable printing, encoded barcodes, magnetic stripes, signature panels, added images or QR codes.
Transparent Satin “Frosted”
Embossed
Foil Stamping
Metal Business Cards
PVC Card Stock
Gift card holders available
In addition, the gift card can be attached to a paper welcome letter with guidelines and the terms and conditions of use. A nicely designed gift card holder will certainly improve your company's image in the market. When designing one, it is a good idea to leave space on the inside for the giver to write in the names of the giver and the recipient, and perhaps an area for a message. Contact us to have our designer create a custom-designed plastic gift card holder for you.
Lite Card
Thickness: 15mil/0.38mm
Material: solid white PVC. Surface: glossy, matte or metallic. 7-day free shipping: US/CA/EURO.
200pcs: $79.00
500pcs: $109.00
1000pcs: $129.00
2000pcs: $189.00
Standard Card
Thickness: 30mil/0.76mm
Material: solid white PVC. 7-day free shipping: US/CA/EURO.
500pcs: $129
1000pcs: $149
2000pcs: $279
5000pcs: $599
Source: ColorMax Printing Produces Plastic Gift Cards and Custom Gift Cards With Advanced Technology
Text
About Async Iterators in Node.js
Async iterators have been around in Node since version 10.0.0, and they seem to be gaining more and more traction in the community lately. In this article, we'll discuss what async iterators do, and we'll also tackle the question of what they could be used for.
What are async iterators?
So what are async iterators? They are practically the async versions of the previously available iterators. Async iterators can be used when we don't know the values, or the end state, of what we iterate over ahead of time. Instead, we get promises that eventually resolve to the usual { value: any, done: boolean } object. We also get the for-await-of loop to help us loop over async iterators, just as the for-of loop does for synchronous iterators.
const asyncIterable = [1, 2, 3];

asyncIterable[Symbol.asyncIterator] = async function*() {
  for (let i = 0; i < asyncIterable.length; i++) {
    // An async generator wraps every yielded value in the
    // { value, done } envelope itself, so we yield the bare value
    yield asyncIterable[i];
  }
};

(async function() {
  for await (const part of asyncIterable) {
    console.log(part); // logs 1, then 2, then 3
  }
})();
The for-await-of loop will wait for every promise it receives to resolve before moving on to the next one, as opposed to a regular for-of loop.
Outside of streams, there are not a lot of constructs that support async iteration currently, but the symbol can be added to any iterable manually, as seen here.
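As a quick illustration of that ordering guarantee (a minimal sketch of our own, not from the original examples): a plain array of promises is a synchronous iterable whose values are promises, and for-await-of will happily consume it too, awaiting each element before running the loop body.

// An array of promises is a sync iterable of promises;
// for-await-of awaits each element in iteration order.
const delays = [
  new Promise(resolve => setTimeout(() => resolve('first'), 200)),
  new Promise(resolve => setTimeout(() => resolve('second'), 100)),
];

(async function() {
  for await (const label of delays) {
    console.log(label); // logs 'first', then 'second', in iteration order
  }
})();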
Streams as async iterators
Async iterators are very useful when dealing with streams. Readable streams have the asyncIterator symbol out of the box, and duplex and transform streams inherit it through their readable side.
const fs = require('fs');

async function printFileToConsole(path) {
  try {
    const readStream = fs.createReadStream(path, { encoding: 'utf-8' });

    for await (const chunk of readStream) {
      console.log(chunk);
    }

    console.log('EOF');
  } catch (error) {
    console.log(error);
  }
}
If you write your code this way, you don't have to listen to the 'data' and 'end' events as you get every chunk by iterating, and the for-await-of loop ends with the stream itself.
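For comparison, here is a rough sketch (our own, not from the original article) of what the equivalent event-listener version looks like; the for-await-of loop above replaces all of this wiring:

const fs = require('fs');

function printFileToConsoleWithEvents(path) {
  const readStream = fs.createReadStream(path, { encoding: 'utf-8' });

  // Each concern needs its own listener instead of one linear loop
  readStream.on('data', chunk => console.log(chunk));
  readStream.on('end', () => console.log('EOF'));
  readStream.on('error', error => console.log(error));
}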
Consuming paginated APIs
You can also fetch data from sources that use pagination quite easily using async iteration. To do this, we will also need a way to reconstruct the body of the response from the stream the Node https request method is giving us. We can use an async iterator here as well, as https requests and responses are streams in Node:
const https = require('https');

function homebrewFetch(url) {
  return new Promise((resolve, reject) => {
    https.get(url, async function(res) {
      if (res.statusCode >= 400) {
        return reject(new Error(`HTTP Status: ${res.statusCode}`));
      }

      try {
        let body = '';
        // Instead of res.on to listen for data on the stream,
        // we can use for-await-of and append each data chunk
        // to the rest of the response body
        for await (const chunk of res) {
          body += chunk;
        }

        // Handle the case where the response doesn't have a body
        if (!body) return resolve({});

        // We need to parse the body to get the JSON, as it arrives as a string
        const result = JSON.parse(body);
        resolve(result);
      } catch (error) {
        reject(error);
      }
    });
  });
}
We are going to make our requests to the Cat API to fetch some cat pictures in batches of 10. We will also include a 7-second delay between the requests and a maximum page number of 5 to avoid overloading the cat API as that would be CATtastrophic.
function fetchCatPics({ limit, page, done }) {
  return homebrewFetch(`https://api.thecatapi.com/v1/images/search?limit=${limit}&page=${page}&order=DESC`)
    .then(body => ({ value: body, done }));
}

function catPics({ limit }) {
  return {
    [Symbol.asyncIterator]: async function*() {
      let currentPage = 0;
      // Stop after 5 pages
      while (currentPage < 5) {
        try {
          // fetchCatPics destructures `page`, so the current page
          // has to be passed under that name
          const cats = await fetchCatPics({ page: currentPage, limit, done: false });
          console.log(`Fetched ${limit} cats`);
          yield cats;
          currentPage++;
        } catch (error) {
          console.log('There has been an error fetching all the cats!');
          console.log(error);
        }
      }
    }
  };
}

(async function() {
  try {
    for await (let catPicPage of catPics({ limit: 10 })) {
      console.log(catPicPage);
      // Wait for 7 seconds between requests
      await new Promise(resolve => setTimeout(resolve, 7000));
    }
  } catch (error) {
    console.log(error);
  }
})();
This way, we automatically get back a pageful of cats every 7 seconds to enjoy.
A more common approach to navigation between pages might be to implement a next and a previous method and expose these as controls:
function actualCatPics({ limit }) {
  return {
    [Symbol.asyncIterator]: () => {
      let page = 0;
      return {
        next: function() {
          page++;
          return fetchCatPics({ page, limit, done: false });
        },
        previous: function() {
          if (page > 0) {
            page--;
            return fetchCatPics({ page, limit, done: false });
          }
          return fetchCatPics({ page: 0, limit, done: true });
        }
      };
    }
  };
}

try {
  const someCatPics = actualCatPics({ limit: 5 });
  const { next, previous } = someCatPics[Symbol.asyncIterator]();

  next().then(console.log);
  next().then(console.log);
  previous().then(console.log);
} catch (error) {
  console.log(error);
}
As you can see, async iterators can be quite useful when you have pages of data to fetch or something like infinite scrolling on the UI of your application.
In case you're looking for a battle-tested Node.js team to build your product, or extend your engineering team, be kind and consider RisingStack's services: https://risingstack.com/nodejs-development-consulting-services
These features have been available in browsers for some time as well, in Chrome since version 63, in Firefox since version 57 and in Safari since version 11.1. They are, however, currently unavailable in IE and Edge.
Did you get any new ideas on what you could use async iterators for? Do you already use them in your application?
Let us know in the comments below!
About Async Iterators in Node.js published first on https://koresolpage.tumblr.com/
Text
Want To Learn About Desktop Computers? This Short Article Will Educate You
Is a new computer in your future? Have you browsed some sites and stores but still aren't sure what to look for? If so, you're not alone; many people feel the same way about desktop computers. Read on for some great ideas to help simplify the process.

Look for people who want to get rid of their desktops. Plenty of folks have moved toward laptops and tablets, and want to sell their desktops at a bargain price. Most of these desktops are in great shape; however, check the computer out before making an offer.

Look at what accessories come packaged with your desktop and which ones you would need to buy. Some computers have extra accessories available to purchase. Only get the accessories you will use, and shop around, since accessories can be cheaper elsewhere; bought direct from manufacturers or at tech stores, they are normally sold at a premium.

Make sure the fan is working and the interior is kept dust-free. You can simply remove the case and use compressed air to blow off any dust that has settled. This will prevent dust from entering the computer and will lower its temperature.

Search reputable sites for reviews of each computer you are considering. It can be intimidating to shop for a computer, so take advantage of what the experts have to say.

Make sure you get a warranty on any computer you buy. This will protect your investment if any of the software or any other component fails. A repair or exchange will be in order if you run into any problems.

To transfer large video files, you will need a desktop computer with a writable DVD optical drive. A CD drive is not enough for bigger multimedia files; you will need the space you can get from a DVD. It costs a bit more, but will be far more convenient down the road.

When shopping for a computer, take this article with you. You won't have to look far for good tips during the buying experience, which will ensure no salesperson tries to get you into a bad deal, and you'll remember what you have learned here.
Text
VLK | EDU: Spaces for Learning Today and into the Future
Walk into a school building, and you are hit with the sun illuminating the space. You already know this place is special; unique. It feels different. It is bright and open. The sights and sounds are more reminiscent of a coffee shop than a school.
There is movement - a lot of movement. Learners ebb and flow around groups and spaces as they engage in today’s lessons. There is a hum of conversation. It’s not just the teachers talking; around every corner, you can see and hear even the youngest learners talking through their learning; thinking critically; problem-solving; asking questions; being curious.
Gone are the days of students sitting silently in rows while a teacher drones on about a topic in a monotone voice. Today, learners have varied needs, and the spaces they work in are flexible, allowing them to engage in different activities; they love to have a voice and a choice in what they do. They are able and willing to take ownership of their learning as they move through their educational experiences.
Benjamin Franklin said, “Tell me, and I forget. Teach me, and I remember. Involve me, and I learn.” Schools today involve their learners. They inspire learners to explore worlds of information, allowing them to engage with the knowledge they digest and make it their own. Learners ask questions, create new ideas, and discuss concepts. They reflect on their learning, explore information, and adapt to challenges. They take learning places you never thought they would go. Every day is an adventure.
Using our Typology of Spaces approach, we design spaces that nurture learners' inquisitive nature by:
including writable surfaces that allow the learning to happen anywhere
introducing flexible spaces that allow for the space to be adapted based on learner needs
utilizing natural light to keep the space bright and inviting
adding pops of color that are fun and add to the brand of the campus or building
intentionally designing a space that allows collaboration and critical thinking to happen naturally
Our Typology of Spaces includes six different types of learning environments that fit the various needs of learners today. Each space helps create an inviting learning atmosphere that is engaging while remaining relaxed and comfortable, allowing the highest levels of learning to take place.

THINK spaces are small collaborative areas that allow two to three learners to engage with each other on the task at hand. They are purposefully placed throughout the buildings, using flexible furniture that allows for efficient use of instructional space.

Classrooms are CREATE spaces, enabling a whole group of learners to receive instruction, break into smaller groups to fit the current lesson, or work individually as needed. The furniture in these spaces easily accommodates all groupings, allowing for quick transitions and making the most of class time.

DISCOVER spaces are areas specialized for learning within a particular discipline; these could include science labs, culinary kitchens, or libraries. The furniture in these spaces includes large surfaces, allowing learners to do extensive inquiry-based work.

EXCHANGE spaces are indoor or outdoor learning areas that can accommodate large groups of students, such as a grade level at an elementary school or a student organization. These spaces are great for hosting club meetings and for sharing presentations as a culmination of learning.

DESIGN LAB spaces are professional collaborative work areas where teachers design engaging activities for their learners and take on the role of the learner as they continuously reflect on and hone their craft.
The use of transparency in our building designs allows natural light to filter into the spaces. There are windows not only to the outside, but also into the classrooms from the hallways. These windows allow teachers to see what learners are doing even when they aren't in the classroom; they help maintain consistent classroom management and let students monitor lesson pacing by keeping an eye on the teacher in the classroom. “Architecture can influence a positive culture,” Dr. Dalane Bouillion, VLK Principal of Educational Planning, explains in a previous blog post. “It can serve as an inspirational place for learning.” Our learning spaces must inspire our learners, inviting them into a world full of mystery and excitement.
“Somewhere, something incredible is waiting to be known.” - Carl Sagan
from VLK Architects https://vlkarchitects.com/insights/vlk-edu-spaces-for-learning-today-and-into-the-future
Text
New Kitchen Ideas and Trends for 2018
For years, we have called the kitchen the “heart” of the home. With 2018’s influx of new kitchen ideas, products, and smart technology, we might as well call it the “brain” of the home as well. Our kitchen designers are seeing some interesting patterns develop from our Long Island clients’ dream kitchen wish lists. These range from automated smart-kitchen innovations to bold, eccentric design materials. This year’s new products are right in line with what our clients are asking for. Here are nine new kitchen ideas to consider for your home renovation from some of the best kitchen manufacturers and designers in the space.
9 New Kitchen Ideas and Trends for 2018
1) Bold Color
2) Earthy Inspiration
3) Automated Kitchen Home Systems
4) Voice Controlled Faucets
5) Writable Surfaces
6) Sous Vide Cooking
7) LED Lighting
8) High Tech Dishwashing
9) Specialty Sinks and Faucets
Idea #1: Dark and Bold Colors With Contrast
Whether in accessories, sinks, faucets, appliances, or cabinetry you’ll see dark and bold color in the design. Contrasting colors offset the hard surfaces that can make a kitchen feel cold. We are using hues in green, blue, dark grey, red, and even black. All are stunning when paired with finishes in white, metal, and wood.
Appliance manufacturers such as Smeg and Bluestar have a tremendous array of colors to choose from. Mixing cabinet finishes with contrasting design elements and bold colors will continue to trend in 2018.
New Kitchen Ideas #2: Earthy Inspiration
Some clients are asking for inspiring spaces that are grounded with natural elements. Additionally, they seek balance and simplicity, which can be achieved by mixing natural metal patinas, wood, and textures to create a fresh space that brings the outdoors in. Natural neutral colors (greys, whites, creams, browns) will have a home in new kitchen design. For a truly organic space, try adding a living wall, an Urban Cultivator herb refrigerator, or an in-kitchen composting system like the Blanco Solon.
Urban Cultivator herb refrigerator…
Plant pots of herbs on shelves or in a living wall…
This is the Blanco Solon in counter compost system …
We are all striving to live a little greener, and composting kitchen scraps like egg shells, vegetable peels, juice pulp, or coffee grounds can supply nutrient-rich fertilizer for our gardens and plants while reducing our carbon footprint.
It is fairly easy to incorporate a composting system in the kitchen. You can put a beautiful compost crock directly on your counter, or you can cut a hole into your countertop and install an integrated system that collects all your organic waste below the counter and out of sight. Regardless of the style of bin you choose, the key is placing it in a convenient location in your kitchen so you are more likely to use it. Buy Blanco Solon here. (Disclosure: Kitchen Designs earns commissions from the products featured in this post.)
New Kitchen Ideas #3: Automated Kitchen & Home Systems
Innovations that allow you to wirelessly communicate with your lights, thermostats, music, fans, shades, smart door locks, and security cameras have allowed us to connect with our homes whether we are in them or not. Now you can connect with your oven, coffee maker, small appliances, and refrigerator.
Imagine looking to see what’s missing from your refrigerator while you are at the supermarket. You won’t need to ask “Am I out of eggs” when you can remotely have a peek inside the refrigerator. Crestron, AMX, and Control 4 have some impressive innovations for your kitchen and home that allow you to connect with ease. I’m still waiting for appliances to prepare a meal for me like the Jetsons, however!
New Kitchen Ideas #4: Voice Controlled Kitchen Faucets
Kohler has updated their Sensate Kitchen Faucet. This smart faucet has voice-activated technology that will turn water on and off on command. It will also dispense the exact amount of water you need for a recipe without waste. Use the KOHLER Konnect app to monitor and track your water usage. You can even receive alerts that will detect unusual usage. If your hands are dirty, you can either tell the faucet to turn on with your voice or motion activate it without ever touching faucet handles.
Works with Amazon Alexa, Google Assistant, and Apple Home Kit. See video here…
New Kitchen Ideas #5: Formica® Writable Surfaces
When it comes to new kitchen ideas, Formica added a very fun element. Formica® Writable Surfaces is a collection of extremely smooth writable surfaces for desks, playroom game tables, bedrooms, laundry rooms, and kitchens. The collection includes four markerboard and two chalkboard surfaces that can be used for countertops as well as cabinetry. They also have two stand-alone products that are now being sold on Amazon – SketchTable™ and CreateSlate™
SketchTable™ is a game and homework table that comes in two sizes (16” and 24”). The tabletop flips, so you have the option to draw on one side with markers and on the other side with chalk. Buy SketchTable™ here.
CreateSlate™ is a framed writable surface board that comes in two styles: the LoveWords markerboard or the Black ChalkAble™ chalkboard. Buy CreateSlate™ here.
New Kitchen Ideas #6: Sous Vide Cooking
If you're into sous vide cooking, have a close look at Gaggenau's Sous Vide and Vacuum Sealer Oven (Model DV461). Their 400 series combi-steam oven has been upgraded with an option to seamlessly install a matching hands-free vacuuming drawer to seal meat, fish, veggies, fruit, and leftovers. It can also be used for sous vide cooking, marinating, and extended storage. The sous vide method involves vacuum sealing food to lock in moisture and cooking it in a water bath at a specific temperature. There are a variety of countertop sous vide appliances and immersion circulators on the market, but this new design keeps the counters clear and integrates beautifully into the kitchen wall. See the videos at Gaggenau.
New Kitchen Ideas #7: LED Lighting
Requests for LED lighting in kitchen design have been a strongly climbing trend. Our Wood-Mode line has a high-tech integrated lighting program featuring Häfele LED solutions that incorporates the best 12-volt, 3rd-generation Loox LEDs for customizable, energy-efficient, subtle illumination. Our brand's lighting program was named one of the 30 Most Innovative Products for 2018 by Meredith Corporation's Beautiful Kitchens and Baths magazine.
Your kitchen designer can help you select various configurations of LED lighting for your kitchen cabinets. They will help you manage wires, switches, and drivers as part of your installation during the construction phase of your renovation.
At our NY kitchen showroom, the most popular choices among our clients have included: interior cabinet lighting (especially magnificent behind glass doors), drawer and shelf lighting with light bars, open and floating shelves display lighting, toe kick lighting, bath vanity mirror lighting, wardrobe pole lighting, and integrated lighting that switches on and off with sensors when doors open and close. You can also use a programmable remote for switching lighting to your desired ambiance.
New Kitchen Ideas #8: High-Tech Dishwashing
SubZero and Wolf's new Cove dishwasher has interior illumination and remote mobile operation. The Cove dishwasher's high-end details include numerous cycle options and easily adjustable racks with an interior basket that can accommodate any utensil, glass, or dish, even a lasagna pan.
New Kitchen Ideas #9: Specialty Sinks in Unique Finishes
Sinks with personality are emerging in stainless, porcelain, and natural materials such as copper, quartz, and granite. In addition to farmhouse and under-mount sinks, you'll see freestanding sink designs and integrated sinks made from the same material as the countertop. Specialty sinks come with a variety of custom inserts to help you prep, clean, and organize.
Here are a few new kitchen ideas for 2018 sink styles.
Once you compile all your new kitchen ideas, stop by our Long Island kitchen and bath showroom and make an appointment for a free consultation with a designer at Kitchen Designs by Ken Kelly, 26 Hillside Avenue, Williston Park, NY 11596 – 516-746-3435.
Source: https://www.kitchendesigns.com/new-kitchen-ideas-trends/
Text
Tips and Tricks From the Experts on Desktop Computers
Like many people, you probably use a desktop computer every day. They may not last as long as you think, though, so try to get a good deal when you buy one. Read on for several great desktop-buying tips.

Find people who would like to give their desktop away. Many people are moving to tablets and laptops, meaning they want to get rid of their desktop at a good price. These computers are usually in good shape, but make sure the machine is working well before making an offer.

Perform a boot check if you find that your computer is running slower than it should. Run MS Config from the Start menu and look at which programs launch when the machine boots. Do not start up programs that are unnecessary; this helps your computer speed up.

Try to get a desktop computer that you can afford with only the features you need. Lots of people look for machines with too many expensive add-ons that they never use. Know exactly what you need, and save money by purchasing a computer that only delivers that.

Take measurements of the spot where you will place your desktop. Desktops vary widely in size across brands and models; some are tiny, while others are huge. Don't buy a desktop that is too large to fit in its designated space, and know what you can fit in the area you are considering.

If you want to buy a Mac but you also want to run PC programs, think about getting Parallels for Mac. This software application essentially lets you run a PC's operating system right on the Mac, which lets you run any software written for a PC. You will also need to purchase the PC operating system separately, though.

Make a list of all the things you will do on your new computer. The kind of computer you need depends on how you use it: if you just do an everyday email check, you need a different computer than somebody who does hardcore gaming.

If you need a computer to transfer big video files, you will want a writable DVD optical drive included with your desktop. A CD drive is not enough for bigger multimedia files; you will likely need the space of a DVD drive. It's one more cost, but it'll save you trouble later.

Keep this article handy when you go out to purchase a desktop computer. You can reread each tip as you weigh your options, and you'll get a great deal on a great computer with this information.
Text
IC ch2 - The Origins of National Consciousness
Tuesday, December 12, 2017
In "The Origins of National Consciousness", Anderson discusses the many converging factors leading to the decline of religio-sociologically-centralized Europe, and the rise of mentalities vulnerable to nationalism. Latin became more esoteric as its use declined during the periods of Reformation and Enlightenment. This allowed local vernaculars to flourish in common vocal usage, and then to be neutered as they were appropriated in print by administrative powers, stripping them of their essential regional vestigialities (40, 41). Anderson calls this a fatalism of diverse language, though this terminology may put a more negative tone on the era than is useful. For, as print technology became widespread, print itself became a connective tool, shaving off the impediments to communication bred by colloquial language differences, unifying, in their imaginations, hundreds of thousands of people, while at the same time cementing the notion that there were "others" outside of this community (44). These administrative languages also competed with Latin, "contributing to the decline of the imagined community of Christendom" (42). The mentality of the common populace began to change in deep and unselfconscious ways, as Anderson hints when he opens the chapter by mentioning Benjamin's theory of a revolution into an Age of Mechanical Reproduction, in which the aura, and thus essential cult power, of art is lost, and replaced with symbols and ideas dispersed en mass, perhaps aiding in the Enlightenment (how are these linked?) (37).
Alongside this loss of faith in the truth of the Word and the development of administrative languages, "print-capitalism created languages-of-power," with which rulers could impose authority via (mis)information or impediments to daily life without the use of the language-of-power, a power-accumulation tactic easily packaged and replicated by rulers elsewhere (45). Beyond the intentional use of language to accumulate power, language was a major factor in the profitability of print. As Latin became more arcane, its market became more quickly saturated, and thus printers focused more on local vernaculars. Typically, this took the form of small volumes, cheaply bought, leading to the self-perpetuating mass-consumption of books, cementing their commodification, making the slight suggestions contained within their pages all the more powerful as they were consumed universally and continually. Luther may have been the leading author behind print's mass-consumption, as his works were the first "best sellers." However, this led to the "battle for men's minds" through religious propaganda (40). Print was tied to capitalism in Western Europe, and was controlled by "wealthy capitalists," which created shifts in the class framework, and which helped begin the monetization of power and influence (38).
Another result of print's widespread consumption was that it "gave a new fixity to language," allowing for the reading of history in a way impossible in the preceding centuries, when language evolved rapidly enough to make documents (and thus history) inaccessible (44,45). Anderson began this conversation when he touched on the conception of simultaneity in the last chapter, and this point will doubtless be useful in his discussion of the paradox between nationalists and historians in the appraisal of the age of their nation. If the general population has a source from which to draw an idea of historical incidence, then the expansion of a concept beyond given documentation wouldn't be difficult to a mind with any inkling of imagination. If history is writable, it is rewritable.
Text
RWDevCon 2017 Inspiration Talk: Silver Bullets and Hype by Roy Marmelstein
Note from Ray: At our recent RWDevCon tutorial conference, in addition to hands-on tutorials, we also had a number of “inspiration talks” – non-technical talks with the goal of giving you a new idea or some battle-won advice, and leaving you excited and energized.
We recorded these talks so that you can enjoy them even if you didn’t get to attend the conference. Here’s one of the inspiration talks from RWDevCon 2017: “Silver Bullets and Hype” by Roy Marmelstein. I hope you enjoy it!
Transcript
Today we’ll be talking about silver bullets … like this one:
The idea of the silver bullet has its origins in horror movies and horror literature; it’s the one tool that can kill all of the monsters. It can kill the werewolves and the vampires.
When we talk about silver bullets in programming, we’re talking about the same kind of idea. Basically, it’s the quest to find a tool that can make our code better and bug-free and easy to maintain. I think this is something that we all do at some point in our career.
The story should be familiar to most of you. You go online, you read a blog post that really inspires you, or you attend a conference that’s really cool and you go to all of these tutorials. Maybe one idea really grabs you and you get super excited.
And on Monday you go back to your office, you talk to your team, you open up Xcode and the first thing you want to do is implement this exciting new idea.
And you do that, and over time you start realizing that maybe the idea was not the perfect solution for your app, and that your app has some special needs and special constraints.
As time passes, you basically just create technical debt for yourself. This ends in tears, and you need to reflect and refactor. This keeps happening, and I think it happens to senior developers and junior developers alike, so it's not really a question of experience, and I kept wondering why.
Why Do We Seek Out Silver Bullets?
The answer I came up with is that it’s about hype.
We got into programming because we love technology, we want to make things and we want to live in the future, so it’s natural that we get excited by new solutions.
When I’m talking about hype, I’m also talking about a trademarked idea of hype.
Gartner, which is a research firm, has this Hype Cycle, which looks like this:
It's used by investors to assess the life cycle of technology adoption. All technologies start with a trigger, reach a peak of inflated expectations, then a trough of disillusionment, and eventually a plateau of productivity. These are all great titles for them.
If you think about desktop computing, that’s somewhere deep in the plateau of productivity; with self-driving cars, we’re in the peak of inflated expectations.
I think this really applies to iOS technologies and frameworks and ideas as well. Basically, the point of this talk is trying to understand through experience and through shared stories how we can accelerate getting to that plateau of productivity because that journey is quite painful and expensive.
I’m going to talk about three trendy ideas in Swift. I will explain a bit about what they are, though I know you’ve had a busy two days, so I’ll keep it very brief. Then I’ll try and see what we can learn from them.
Protocol-Oriented Programming
First up, we’re going to talk about protocol-oriented programming. In the past year I’ve attended a lot of conferences around Europe. Pretty much every one of them had a talk about protocol-oriented programming.
The buzz words when we talk about these are composition over inheritance.
As someone who grew up playing Legos, composition is something that we like, and the idea that we can build our classes by just putting things together is cool as a developer.
It got popular after this talk at WWDC in 2015 where we met Crusty.
Apple really sold it as something that will really change the way we write code forever; it’s a great new paradigm for writing Swift code.
It’s often explained with cars, but I wanted to do a quick RW Devcon explanation, so I’ll explain it with tutorials.
You don't need to look very deeply into this code, but if we wanted to do the object-oriented representation of a tutorial, we could have a tutorial class with a couple of tutorial functions like writing and presenting and publishing. We have all the talks that we've had at this conference, like Mastering Git, all sub-classing the big tutorial superclass.
The problem arises when you have books and you just want to pick the right function from the tutorial. Basically, object-oriented programming gets very monolithic, and it's very hard to pick and mix functions and use them as we want.
The big idea of protocol-oriented programming is that we just can break this down into protocols.
With Swift, we can have default implementations of things and then we just describe the tutorials as something that’s writable or publishable and presentable.
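For readers following along without the slides, here is a minimal sketch of that idea (our own reconstruction in Swift, not the speaker's actual slide code): behaviors live in protocols with default implementations, and types opt in to exactly the ones they need.

protocol Writable {
    func write()
}

protocol Presentable {
    func present()
}

protocol Publishable {
    func publish()
}

// Default implementations mean conforming types get the behavior for free
extension Writable {
    func write() { print("Writing a \(type(of: self))...") }
}

extension Presentable {
    func present() { print("Presenting a \(type(of: self))...") }
}

extension Publishable {
    func publish() { print("Publishing a \(type(of: self))...") }
}

// Pick and mix: a tutorial does all three, a book is written and
// published but never presented, an inspiration talk is only presented.
struct Tutorial: Writable, Presentable, Publishable {}
struct Book: Writable, Publishable {}
struct InspirationTalk: Presentable {}

Tutorial().present()
Book().publish()
InspirationTalk().present()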
Then we can represent books very easily as something that's writable, and an inspiration talk as something that's presentable; we can even mock things up in a really nice way. This is super cool and it's very useful but … it's not a silver bullet.
Why? I think basically it comes down to this idea that if you have a hammer, everything looks like a nail.
Especially if it’s a hammer that you really enjoy using, you just want to use it again and again. I think we just went a bit overboard with this as a community.
You'll see a lot of popular blog posts saying that if you're sub-classing, you're doing it wrong. On GitHub, if you search for "protocol-oriented", there are almost 180 repositories listing the fact that they're protocol-oriented as one of their Swift selling points.
Implementing the Trend
Okay, it’s story time:
After attending my first conference, there were so many exciting ideas. There was this one live-coding talk, which is a bit like a tutorial but less interactive, so you just watch someone do the work.
What they did is just abstract away all the boilerplate that goes around writing a table view, with adapters and with protocols, and ended up with a very small bit of code to change the contents of the table view. I thought this was brilliant; I was so excited about it.
I came back to work on Monday and I really wanted to implement this. At that time, we were working on a new view controller with a complex collection view, complex layout with updating logic. I thought, “Yeah, let’s do this”.
On the surface, this was quite a successful experiment. We were able to copy that approach very quickly and ended up with very short code, and it was really good … but flaws revealed themselves over time.
When new people joined the company, no one really wanted to touch this class because they had no idea what was going on there. I couldn't give them a good reason for why it was written this way apart from the fact that it's cool. That became a bit of a problem. And I noticed that the collection view was not reusable. I wasn't trying to get composition and reusability out of it; I was just using it because I wanted to.
This is something I noticed a lot of people do with protocols: using them to represent data that, again, you don't want to compose or reuse, and this doesn't give you that much over just using normal value types.
Lessons
The blue-background images will be technical learnings about the technologies, and then there’s a green screen afterwards about learnings in general.
The technical lesson here is that protocol-oriented programming is fantastic for composable, reusable behavior and not so fantastic for describing data and doing things that you don’t want to compose and reuse.
The big learning in general here is that you should really start simple.
Don’t approach a problem because you have this new technology that you’re excited about, but actually try and solve it in the most simple way and then see if there’s actually a need for this. Really consider what you get out of using a technology; just the fact that it’s cool is not really a valid justification for using something.
Functional Reactive Programming
The second thing I want to talk about is functional reactive programming.
We have two very popular frameworks: RX Swift and Reactive Cocoa. The buzz word here is that it’s declarative and everything is a stream.
The dream that is sold to developers is that your app will just become a function of state. Again, this is something that’s very seductive, we like to look at our apps and it’s this complex set of cogs and machines moving each other and the idea of an app is just a river of operations and chains. It’s something that’s quite nice and you think you’ll have a less buggy app if you do this.
When it works, it works quite well. Again, you don't really need to look at this code, but doing e-mail validation in about two lines of code is something that would take a lot more to do in plain Swift, so RX Swift can be quite nice.
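To give a flavor of what that looks like, here is a hedged sketch of our own (assuming RxSwift/RxCocoa and a UIKit view controller; this is not the code from the speaker's slide): each text change is mapped to a naive validity flag and bound straight to the button's enabled state.

import UIKit
import RxSwift
import RxCocoa

final class SignupViewController: UIViewController {
    let emailField = UITextField()
    let submitButton = UIButton(type: .system)
    private let disposeBag = DisposeBag()

    override func viewDidLoad() {
        super.viewDidLoad()

        // The essence is the two chained lines in the middle: map every
        // edit to a validity check, then drive the button's enabled state.
        emailField.rx.text.orEmpty
            .map { $0.contains("@") && $0.contains(".") }
            .bind(to: submitButton.rx.isEnabled)
            .disposed(by: disposeBag)
    }
}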
Another one of the selling points is that RX Swift is part of a family of RX frameworks; you have RX Java as well, and in theory, you could do the same kind of manipulations of events on Android and get some kind of a cross-platform benefit.
Again, this is not really a silver bullet.
More Story Time
Another story: at a previous company, the CTO got super excited about RX, and we had the problem where we had an Android app and it was hard to keep both apps on par with the same behavior. And we had a few bugs that were very hard to replicate.
The idea was that we would rewrite both apps and try to use RX pretty much from the data layer all the way to the UI and get something good out of that. Through the experience, you realize that there’s quite a steep learning curve to using RX.
The basic concepts are easy to understand, but it takes time to really get good at it. Generally, talking to other developers, I think the consensus is that it takes months until you start to think of problems as RX problems, problems of streams and events. There are about 25,000 results for functional programming on Amazon.
A lot of people find this quite difficult, I guess. Then, once you do master it, you end up with code like this.
Again, what you need to see here is that it’s quite complicated to read.
It’s not only that you need to learn it and your team needs to learn it but everyone who will join your team will need to take this time to understand RX and will take time before they become useful members of your team.
Another issue we had was with UIKit.
UIKit is not really on board with this vision of a declarative interface. RX has some tools, like RxCocoa and RxDataSources, that try to massage it into being declarative. But every once in a while, UIKit will have its little revenge.
Generally, if you’re doing complex things with UI, it can become quite difficult and UIKit is a big source of pain.
If you’ve ever used KVO, you’ll know this problem where you have these streams in your app and if you get a crash report, it’s really hard to see what actually happened and it’s really hard to debug.
With RX it’s like that on steroids. This image is rendered by a tool for RX Java but I think it’s like a calculator app and it’s just showing you all of the streams that are happening.
Even though there are tools to help with that kind of debugging during development, if you get stack traces from people out in the wild, it's really hard to see what's going on. For that experience, the result was that it was really hard. We didn't get the parity with Android; that didn't happen. We ended up with more bugs that were quite hard to debug, and we had to do a bit of refactoring, scale back the level of RX usage in the app, and stop it slightly further down the stack.
It was a great learning experience but it wasn’t a great experience from the product perspective and from the development perspective.
Lessons
My learnings here were that RX is really good for simplifying data flows, but if you have a lot of complexity in your app, especially in UI and you expect to onboard a lot of new people, it could become an issue.
The general lesson here is that, even if you’re in a position to do it, you should not force new technologies on your teams, especially not with a broad mission like rewriting the entire app.
Again, most things are not silver bullets, they have limitations; in order to discover the limitations, it’s better to start with smaller tasks.
I think it’s really important to consider the future debt. I have a friend who worked on an app that went all-in on Reactive Cocoa and then changed their minds; it took them the best part of a year to remove Reactive Cocoa. So really try to consider if that’s something you’re willing to do before choosing to adopt a new technology.
Swift
The last thing that I want to talk about, which I guess is controversial for this kind of talk, is Swift itself.
Everyone loves Swift, I love Swift. It was sold to us as Objective-C without the C, which is pretty cool.
These are just a few of the things that I love about Swift; we have protocols and syntax and type safety, value types, it’s functional-ish. It’s a great language, it’s really fun to write, but I don’t think Swift is a silver bullet. I think it depends on your needs and the app that you’re working on.
Even in 2017, I don’t think every app needs to be written in Swift, and Objective-C still has a place.
Story Time: My Swift Experience
I have quite a few Swift frameworks, and I got to work on two apps that use a lot of Swift, both full Swift and transitioning from Objective-C to Swift. I encountered a lot of issues in that process.
The main one was compile times:
This dictionary could take eight hours to compile. Type inference is a big problem in Swift. You won’t really get this in your real app, but the worst it got for us was it took about eight minutes to compile the app. It really kills motivation when every time you run it you have to go and make some coffee. That was a big problem for us.
Another issue is having to go through this ceremony each year where you convert your app every time there's a new version of Swift. It's nice that we have a converter, but it's really not perfect. For me, the big issue is that the unit tests broke as well. You can't just trust that it works.
Talking to other developers, on average it seems to take about two to three weeks to update. If you can’t really afford to do that, maybe that’s something that you should consider.
The last issue is modularity. If you're working on an app with a lot of other people, it's nice to keep things modular and break them into frameworks; I think there was a workshop about that as well. In Swift, that's a bit difficult because we don't have static libraries, so everything needs to be dynamically linked.
Apple says we should have a maximum of six, which is not very realistic. What you end up with is quite significant launch times and that could be a problem as well.
Lessons
My experience was that for new projects, especially for open source Swift, it’s amazing. It’s great to learn Swift and it’s really fun.
I think it’s important to learn Swift now because you can influence the development and influence the future of iOS and the server, but consider the constraints of your project.
If you have a project with a lot of legacy code and a lot of contributors and dependencies, I would personally hold back with Swift right now.
The wider learning is to fully consider the constraints of your projects, even if it goes against the common beliefs in the industry right now.
Wrapping Up
The message of this talk is not that you should not try new things, and I think that should be quite explicit. We’re actually very fortunate to work in an industry where we get all these new ideas and we have all these new technologies coming in to solve our problems. We should stay excited and we should try them, but the message is more that:
We shouldn’t follow trends just because they’re trends.
We should not approach a new technology as a potential silver bullet.
We should know that all technologies have limitations.
We should try to figure out these limitations as quickly as possible and to focus on solving the specific problem in the app.
I know this is quite easy to say but quite hard to do, so I have some suggestions about how to do that.
I think, generally, hackdays and hackathons and side projects are great ways to experiment without the depth of doing it in your main app.
For me, though, the real solution is more of these story times. As a community, we’re very good at sharing excitement about new things and sharing success stories, but we’re not so good about sharing lessons from mistakes and sharing limitations of things. We need to get better at that, because we get excited about the same things and the more we learn from each other, the shorter that time to the plateau of productivity is.
I want to end on a really positive note and keep the excitement going: we should still stay excited about things and stay hungry and stay foolish.
Just try to have a slightly more critical appreciation of things as you’re excited about them. Thank you!
Note from Ray: If you enjoyed this talk, you should join us at the next RWDevCon! We’ve sold out in previous years, so don’t miss your chance.
The post RWDevCon 2017 Inspiration Talk: Silver Bullets and Hype by Roy Marmelstein appeared first on Ray Wenderlich.
RWDevCon 2017 Inspiration Talk: Silver Bullets and Hype by Roy Marmelstein published first on http://ift.tt/2fA8nUr
Text
From Discs to Digital: The Odd History of Music Formats
via LANDR Blog
Physical formats have only been around since the 1870s. But in that relatively short amount of time, we've managed to come up with some pretty bizarre ways to release music.
Each format on this list had its moment of usefulness. But looking back might make you ask “what were we thinking?”
Regardless of how obsolete certain formats might be, they all led us to where we are today: streaming.
Most music fans choose to purchase their music digitally—either via download or streaming.
Smart artists are following suit as well. Many musicians are skipping out entirely on the cost of releasing physical formats—opting instead for digital music distribution that fits today’s music landscape.
Well, how did we get here?
Regardless of their popularity today, every format on this list played its part in the march towards digital domination.
We strolled through the odd history of music formats to explore where that journey has taken us—and where it might lead…
Here’s the music format timeline—from vinyl to digital and everything in between:
1948: The Record
Records, or discs, of varying speeds and materials have actually been around since the early 1900s—early versions rotated at 78 RPM (vroom, vroom!) and were made of shellac, which made them noisy (the bad kind of noisy, not the good kind) and fragile.
In 1948, Columbia Records produced a 33 RPM 12-inch ‘long play’ format, which we know, love, and donate to thrift stores today as the LP.
The first LP ever pressed, Columbia ML4001, was Mendelssohn's Violin Concerto in E minor, performed by violinist Nathan Milstein with the New York Philharmonic Symphony Orchestra, conducted by Bruno Walter.
Shortly after, RCA Records developed a 45 RPM 7-inch ‘extended-play single’ format, or the EP for short.
Because of the fragility of shellac, which was frequently broken during transport, both Columbia and RCA Records eventually began producing their LP and EPs on vinyl.
Size and portability were the biggest strikes against vinyl. Eventually the music industry sought to find a solution and developed new formats that people could easily bring with them to work, parties, etc.
Despite the numerous physical formats that have been created since vinyl records, the market for them is still strong: according to the mid-year 2017 Discogs report the most popular physical music format sold so far this year is vinyl, with a year-to-year increase in sales of 13.92%.
But despite vinyl’s sustained popularity over time, it was set aside as the go-to format as listeners looked for the next best thing.
1963: Compact Cassette
Compact Cassettes, or tapes, were invented by the Philips company and introduced to Europe at the Berlin Radio Show—Europe’s oldest tech convention with a rich history of its own.
Early cassettes featured reverse housing with a max play time of 45 minutes of stereo audio per side—significantly longer than a vinyl LP’s playtime.
Tapes also fit in a more affordable, compact package. The small size of tapes gave rise to portable players, making them a convenient development in the history of how and where we listen.
The cassette also fit perfectly into the post-war era. A boom in population and suburban expansion meant cars… lots of cars. So the need for mobile playback systems and formats was a hot concept.
The invention of tapes also introduced a volatile new concept into recorded music: Piracy.
The advent of cassettes and cassette recorders led record companies to predict devastating effects on the music industry. After unsuccessful attempts to tax blank tapes, the DAT (digital audio tape) Bill was introduced in 1989, which restricted the number of tapes consumers could buy and, via the SCMS (Serial Copy Management System), prevented them from making copies of copies.
However, it didn’t help record labels, who believed that a tax should be paid to them. In 1991 the Audio Home Recording Act was introduced, which collected tax from media and record makers and distributed it back to labels.
But it wasn’t all suitcases, court cases and taxes on tapes… Cassettes also birthed mixtape culture, giving amateur compilation creators a way to record audio from multiple records and compile a single playlist—a concept that runs the music industry as we know it today.
These days tapes certainly aren’t our main mode of listening, but the industry is still active—In 2016, cassette sales grew by 74% from the previous year.
1964: 8-Track Tape
The 8-track tape was a collaborative invention between the unlikely trio of RCA Records, the Lear Jet Corporation, and the Ampex Magnetic Tape Company. This may seem like an odd group, but Bill Lear of the Lear Jet Corporation, along with his employee Richard Kraus, was responsible for designing the cartridge for 8-track tapes.
Lear, who manufactured private luxury aircraft, had an interest in audio and had previously tried to create an endless-loop wire recorder in the 1940s.
The benefit of 8-track tapes over the compact cassette was their ability to house eight parallel soundtracks with four corresponding stereo programs—they could play a lot of music in a relatively small package.
Much of the 8-track’s success is thanks to the booming automobile industry of the time. By 1966, Ford Motors offered 8-track players as an option across its complete line of automobiles produced that year.
At-home players were introduced the following year, and many saw the 8-track as a solution to the portability issue of records and record players.
Despite the 8-track’s popularity in the 60s and 70s, the compact cassette took over as the more popular choice for artists and consumers due to its favourable size and price tag. As a result, the 8-track became largely obsolete, then and now.
It’s argued that the last 8-track tape ever released by a major label was Fleetwood Mac’s Greatest Hits, released in November of 1988 by Warner Records—perhaps a sign that we’d never be going back to 8-tracks again?
1972: Floppy Disk
Floppy disks are normally associated with data storage for desktop computers, but during the 80s and 90s a select few artists began releasing albums on this somewhat unconventional format.
IBM introduced the 8-inch floppy disk to the tech world in 1972, which was followed by a 5¼-inch model in 1976, and finally replaced by a conveniently sized 3½-inch format in 1982.
The floppy release remained fairly niche and never truly hit the mainstream. The diskette’s most notable release was Brian Eno’s 1996 album Generative Music I, released through Opal Music.
There were also a handful of major releases on diskette that tried to bring a “multimedia” angle to albums, but the format simply never caught on.
Regardless of diskette’s ill-fated moment in music, the floppy represents an important foreshadowing of music’s digital future—a trend that would soon be taken up by the CD explosion…
1982: Compact Disc
In 1974, Philips (yes, the same Philips of tape fame) had the initial idea for CDs as a replacement for records and cassettes. Around the same time, Sony was also working on its own prototype (CD wars!). Sony’s offering was first demoed in 1976.
Eventually the two companies came together, and CDs were officially launched as a viable format in 1982. Sony also introduced the first-ever CD player that year, the CDP-101 Compact Disc Player, which cost $1000!
With CDs also came portable CD players, CD-ROM drives, writable CDs and the 16-bit/44.1kHz benchmark for audio formats, which all had their own effect on how we listen to music.
CDs also brought together the best of every format that came before: high-quality audio in a compact, portable, writable and inexpensive package.
Overall the CD was an extremely important development for the music industry, becoming the de facto release format for decades.
But in many ways the CD was the beginning of the end for physical formats. Computers and the MP3 (more on this in a minute) quickly took over our listening habits. As the internet spread and computers became more sophisticated, so did the demand for convenience. It was a need the CD and Discman could only fill for so long.
As soon as it became possible to access music through your computer or MP3 player, most people no longer wanted to have physical copies of music when they could store everything in a folder on their desktop.
Of course CDs didn’t just evaporate overnight. There are still some Discpeople out there. Even though Discogs’ 2017 mid-year report cites vinyl as the physical format showing the biggest growth, CDs have seen their own growth in sales, with an increase of 23.23% on the used market.
1992: MP3
The MP3 was originally developed in the early 80s by researcher Karlheinz Brandenburg. His post-doctoral work at AT&T Bell Labs expanded on pre-existing codecs for compressing audio. In a strange twist, Brandenburg chose Suzanne Vega’s 1987 hit “Tom’s Diner” as a test song to perfect the MP3.
But it wasn’t until 1992 that the MP3 went mainstream, and not until 1999—with the creation of Napster—that the format really caught fire.
Napster allowed for free peer-to-peer file sharing of the MP3 audio file that resulted in widespread copyright infringement and understandable outrage from the music industry.
Despite its brief three-year run in its initial form, Napster eventually paved the way for platforms like the iTunes Store, allowing users to search, purchase, and instantly play music, all with a few clicks.
The effects from the shockwave that the MP3, piracy and purely digital formats created are still being felt today. In many ways, the music industry is just now starting to recover from its own digital dawn…
2002: Streaming
With 24/7 internet accessibility expanding thanks to mobile, developers and entrepreneurs saw the opportunity for something big: The possibility of listening to, and discovering, new music without having to actually download files or purchase songs.
Additionally, streaming platforms aimed to (hopefully) make digital music a sustainable business model for everyone involved. In many ways it has, but there’s still a long way to go.
The release of the iPhone in 2007 is what really caused streaming and internet radio’s popularity to skyrocket. Apps that were previously desktop-only were now available in the palm of your hand.
Spotify launched the following year with an ad-supported model. Users have two choices: listen for free with ads, or pay a monthly fee for unlimited, uninterrupted streaming.
Streaming apps filled the creeping demand for non-physical access to music and ushered in our current chapter of formats: Dematerialized music.
For better or worse, every music format played its role in the march towards streaming. While streaming hasn’t made every other format obsolete, there’s no denying that it’s the format that’s leading how we access music. For now, at least…
What Now?
If this list proves anything, it’s that nothing is forever—especially in music.
So what’s next on the horizon? Maybe we’ll all listen to music while our autonomous cars drive us around? Or is there a renaissance for video’s role in music on the horizon (I’m looking at you Hype)?
No matter what the future holds, the format that matters most is the one that your favourite artist is releasing on.
If you’re a fan, support the artist and buy their music in the formats they distribute. Do your research and find the format that fits you, and them, the best!
We can all do our part to help the music industry by paying for music—and the formats it’s released on.
Support independent musicians, support small labels and support your local record store!
The post From Discs to Digital: The Odd History of Music Formats appeared first on LANDR Blog.
from LANDR Blog https://blog.landr.com/music-formats-history/ via https://www.youtube.com/user/corporatethief/playlists from Steve Hart https://stevehartcom.tumblr.com/post/165981480674
0 notes
Text
The Computer Tips In This Post Are Priceless
I am sure you use a personal computer at work or at home. On the other hand, computers do not last a lifetime, so it pays to know how to get an excellent deal when searching for a desktop. That's why this article was made. This advice will assist you in buying a nice desktop.
Anti-virus software is vital for your computer. Without it, you might pick up a virus that slows your machine down. There are plenty of programs that can run scheduled scans and repair any problems they find.
You should clean the dust out on a weekly basis to ensure your computer runs as efficiently as possible. The outer casing is usually easily removed, and you can spray the dust away with compressed air or a similar product. This keeps your computer clean and cool, and keeps the fan working properly.
If you would like to purchase a new Mac but use PC software, Parallels for Mac can help. This software allows you to run an instance of a PC system live on your Mac, so you'll be able to run whatever PC app you want. Remember that the PC operating system has to be purchased separately.
If you wish to get a gaming PC and like playing games online, you need to keep a few things in mind. Your system needs a solid video card, no less than 4 GB of memory and a high-resolution display. You can also buy special keyboards and controllers to boost your play.
If you plan to transfer any large video files, a writable DVD drive is a necessity. A CD drive may be insufficient for larger multimedia files. If that's the case, it is worth stepping up to a DVD drive. It costs more but saves you money and trouble in the long run.
Whenever you are looking for a PC, ensure that all software is legal. You should receive the license key along with the CD so that you don't run into difficulties or find yourself unable to get essential software updates.
Don't obsess over price drops. Some people watch for deals before buying a computer, but never buy one because they're always waiting on a better bargain. There is usually not much of a price difference between good deals, so you should act quickly once you find one.
Do not cheap out on your next computer. In most instances, you get what you pay for. Purchase from reputable computer stores or online sellers, and go with brands you know. Any price that appears too good to be true probably is. Some sellers might advertise what look like great deals, but the machines may not be as advertised or may require repairs.
You usually cannot get an original warranty with a used desktop computer, and hardly any manufacturers will transfer a warranty to a new owner. If you decide a used computer is the purchase for you, do so without counting on the warranty as part of the deal, or you may be let down.
Check prices at a number of shops. Computers aren't cheap, but if you shop around you will find great deals. Be aware of a computer's hardware: it is always vital that you get good value together with good performance.
Buying a combo of a desktop, printer and monitor used to be the typical way to make your purchase. Don't! Computer monitors are becoming increasingly hard to find because a lot of flat-screen televisions can serve this function as well.
Do not forget that if you're satisfied with your current mouse and keyboard, there's no need to get new ones. As for hard drives, there are two main choices, each with several variations. HDD drives are the standard for the majority of people, while the SSD represents the newer technology. The SSD is the faster option, but it is also more expensive. You must be prepared when shopping for your computer. Use these tips while you shop, and adhere to the advice found above to help find a computer that fits your requirements.
0 notes
Text
Can't Figure Out Desktop Computers? Read This!
You might initially be excited when buying a brand new desktop computer. Your excitement may turn to fear once you start looking at all the choices. How are you going to find a machine that will suit you well? The following tips will help you learn what you need to know to make the best choice.
You should have an anti-virus program installed on your computer. Without this software, you can easily pick up a virus designed to steal your personal information. There are quite a few applications that will run scheduled checkups to keep your desktop safe.
Find people who are getting rid of their desktop computers. Many people decide to purchase a laptop and will sell their desktop at a very reasonable price. These computers are usually in fine shape, but before you make an offer, make sure everything works okay.
A boot check is important if your desktop is operating slowly. Run "ms config" from the "start" menu. This program lets you see which programs are opened at start up. Find applications that you do not use much, and set them to not start on boot. This will make your system run faster.
When building your own desktop computer at home, pay attention to the types of products you use. Some motherboards are only compatible with particular processors. Some RAM units only work with particular motherboards. Make sure all the products are cross compatible. This can save a considerable amount of time and headaches when you build your own computer.
Look at technology sites for computer reviews before purchasing, to get a good idea of what you should buy. The number of choices can be overwhelming, but reviews will make things easier.
If you play games, make sure the computer is equipped with a high-quality video card, a display with high resolution and a minimum of 4 GB of memory. You may also want special controllers and keyboards to boost the experience.
The equipment you need will depend on what applications you run on a regular basis. Gamers need different options on a computer than those who are just browsing.
If you will be storing a lot of substantial-sized videos on your desktop computer, you probably want a writable drive. A typical CD drive may not do enough for bigger multimedia files. You will need the extra space that DVD drives provide. It will cost a bit more, but it ends up saving money and headache.
The computing world has seen a lot of changes in recent years, and a desktop computer is now cheaper than a lot of laptops. You can often find a desktop for around $400 in many computer stores. Just make sure that wherever you are purchasing your new computer from has a solid reputation.
Mini PCs offer a great green alternative, using less electricity. They operate with a low power draw, but have the processing power most people need. This is a great choice if you do little more than read and send email, create basic documents, and shop online.
Do not miss out on your dream computer because you're waiting for the price to drop significantly. Some individuals watch for deals but never act, thinking a better deal is always around the corner. Typically, the difference between good deals is very slim, so you should grab one soon after you find a deal that's right for you.
Keep the peripherals in mind when desktop shopping. You are going to want a keyboard, a mouse and speakers. You may also find a printer useful, and you will most likely need an Internet modem. Which other hardware might you require?
Don't go too cheap when it comes to a PC purchase. You generally get the quality you pay for. Choose brands you know and stores that have good reputations.
Most manufacturers won't transfer warranties to different owners.
In past years, people bought a combo of a desktop, monitor, keyboard and mouse in one box. Today you may be able to use your flat-screen TV as a monitor instead, and keep in mind that your old keyboard and mouse will still work.
Read customer reviews before buying. You should never buy a computer blindly just because of its price. Many times you will find that cheap computers come with a variety of problems.
Think about what you will use a computer for. Make a list of the things you think you'll be using it for, and make this list as comprehensive as possible.
Is there a specific operating system you would like? If you are a fan of Windows 7, that doesn't necessarily mean Windows 8 will suit you, too.
Don't underestimate the importance of build quality when selecting a new computer. You need to be sure the case can take some abuse. If the case feels flimsy or cheaply made, consider buying something better.
Look at what software your computer comes with. Don't assume you will have a word processor or other commonly used products. Lots of computers only include free trial versions of popular programs. This will make the computer cheaper, but buying the software separately will usually end up costing you even more.
It isn't easy to buy a new computer, but a bit of knowledge goes a long way. Relax, and start putting this new information to work for you. This will help your upcoming shopping trip end up with you getting the perfect machine.
0 notes
Text
Informative Tips On Finding A Great Desktop Computer For A Great Deal
Buying a new computer is an exciting event. However, these excited feelings can quickly turn to confusion and fear. What can you do to ensure you buy the right machine? The following tips can help you learn what you need to know about buying a desktop computer.
Do a boot check if your computer is slow. You will be able to find this information in the start menu. This program lets you view the different programs that automatically start up when your computer does. If you see any programs in the given list that are not ones you need, disable them. This should make your system run faster.
Be wary of the types of products you use when you are building your very own desktop computer. Some processors will only work with certain motherboards. Also, make sure that you get the appropriate RAM unit. When buying parts, check out the compatibility factor. This will save you a lot of time, money, and headaches when building your own desktop computer.
To make sure your computer is at its most efficient and that your fan is properly cooling the components, dust the inside of the computer every week. The external casing is usually easily removed, and then you can simply spray the dust away using a compressed air dispenser or other product made especially for this purpose. This way, the computer will remain clean, and the fan can operate properly.
Check out review sites to learn all you can. This will give you a much better idea of what computer fits your needs. It is vital that your next desktop computer purchase comes with a warranty. This helps if something messes up on your computer. Generally, you'll be able to get repairs done, or replace the entire computer if necessary.
There are certain types of computers for gamers. Your system needs a solid video card, no less than 4 GB of memory and a higher resolution display. Also, you can purchase special controllers and keyboards to boost your play.
To find the desktop for you, write down what tasks you wish to perform on it. The kind of computer you will need depends on how you use it. If you are a gamer, your requirements will be different from a user who just checks email and shops online.
In order to transfer big video files, be sure a desktop has a DVD optical drive that is writable. CD drives do not have the capacity to store larger media files. You will need the extra space that DVD media provides. Determine whether you need or want the extra space that a DVD optical drive provides over standard CD drives, in order to assess whether it makes sense to incur the extra cost of a desktop computer with this feature.
Make sure any software you buy with a computer is legal. You need to be given both the CDs and keys for all installed software to ensure you can reinstall it if you must.
Things have changed in the world of computers, and now a ready-made desktop is typically less expensive than many laptops. You can often find a reliable desktop computer for around 400 dollars at many computer stores. Confirm that the vendor is reliable before you purchase.
If you want an energy-saving alternative, consider a mini PC. They operate with less electricity usage, and normally have enough power to get many tasks accomplished. If you primarily use a computer for Internet and office use, then a mini PC might be right for you.
Shop around for your next desktop. Computers can actually be quite expensive. That said, there are many deals available if you know what you want. Make sure you know about the hardware.
You should figure out which computer gets you the best bang for your buck. Be aware of the two common hard drive types when you are making choices for a new computer. Most people have an HDD hard drive in their systems, but there is a newer type called the SSD. The SSD does not store as much information and it does cost more, but it is much faster than standard drives. It can be tough to make a computer purchase, but sound tips can really help. Relax, and start putting this new information to work for you. Doing so will prompt the proper computer purchasing experience for you.
0 notes
Text
Getting Started with Ansible a.k.a. how to Automate your Infrastructure
After going through this tutorial, you’ll understand the basics of Ansible - an open-source software provisioning, configuration management, and application-deployment tool.
First, we’ll discuss the Infrastructure as Code concept, and we’ll also take a thorough look at the currently available IaC tool landscape. Then, we’ll dive deep into what is Ansible, how it works, and what are the best practices for its installation and configuration.
You’ll also learn how to automate your infrastructure with Ansible in an easy way.
Table of contents:
Understanding the Infrastructure as a Code concept
Why was Ansible created?
What is Ansible?
How to install Ansible
Ansible setup, configuration and automation
Creating an Ansible playbook
Understanding Ansible modules
Running our Ansible playbook
What to use Ansible for
Okay, let's start with understanding the IaC Concept!
What is Infrastructure as Code?
Since the dawn of complex Linux server architectures, the way to configure servers was either by using the command line or by using bash scripts. However, the problem with bash scripts is that they are quite difficult to read, and, more importantly, they are a completely imperative way of working.
When relying on bash scripts, implementation details or small differences between machine states can break the configuration process. There's also the question of what happens if someone SSHs into the server and configures something through the command line, and later someone runs a script that expects the old state.
The script might run successfully, simply break, or things could completely go haywire. No one can tell.
To alleviate the pain caused by defining our server configurations in bash scripts, we needed a declarative way to apply idempotent changes to the servers' state, meaning that no matter how many times we run our script, it should always result in reaching the exact same expected state.
This is the idea behind the Infrastructure as Code (IaC) concept: handling the state of infrastructure through idempotent changes, defined with an easily readable, domain-specific language.
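To make the contrast concrete, here is a minimal sketch (the "deploy" user name is purely illustrative). The imperative command errors out the second time it runs, while the declarative task describes an end state and can be applied any number of times:

# Imperative: fails on the second run, because the user already exists
useradd deploy

# Declarative (Ansible): "deploy should exist"; re-running changes nothing
- name: Ensure the deploy user exists
  user:
    name: deploy
    state: present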
What are these declarative approaches?
First, Puppet was born, then came Chef. Both of them were responses to the widespread adoption of using clusters of virtual machines that need to be configured together.
Both Puppet and Chef follow the so-called "pull-based" method of configuration management. This means that you define the configuration, using their respective domain-specific languages, and it is stored on a server. When new machines are spun up, they need a configured client that pulls the configuration definitions from the server and applies them to itself.
Using their domain-specific language was definitely clearer and more self-documenting than writing bash scripts. It is also convenient that they apply the desired configuration automatically after spinning up the machines.
However, one could argue that the need for a preconfigured client makes them a bit clumsy. Also, the configuration of these clients is still quite complex, and if the master node which stores the configurations is down, all we can do is to fall back to the old command line / bash script method if we need to quickly update our servers.
To avoid a single point of failure, Ansible was created.
Ansible, like Puppet and Chef, sports a declarative, domain-specific language, but in contrast to them, Ansible follows a “push-based” method. That means that as long as you have Python installed, and you have an SSH server running on the hosts you wish to configure, you can run Ansible with no problem. We can safely say that expecting SSH connectivity from a server is definitely not inconceivable.
Long story short, Ansible gives you a way to push your declarative configuration to your machines.
Later came SaltStack. It also follows the push-based approach, but it comes with a lot of added features, and with them a lot of added complexity, both usage- and maintenance-wise.
Thus, while Ansible is definitely not the most powerful of the four most common solutions, it is hands down the easiest to get started with, and it should be sufficient to cover 99% of conceivable use-cases.
If you’re just getting started in the world of IaC, Ansible should be your starting point, so let’s stick with it for now.
Other IaC tools you should know about
While the four mentioned above (Puppet, Chef, Salt, Ansible) handle the configuration of individual machines in bulk, there are other IaC tools that can be used in conjunction with them. Let's quickly list them for the sake of completeness, so that you don't get lost in the landscape.
Vagrant: It has been around for quite a while. Contrary to Puppet, Chef, Ansible, and Salt, Vagrant gives you a way to create blueprints of virtual machines. This also means that you can only create VMs using Vagrant, but you cannot modify them afterwards. So it can be a useful companion to your favorite configuration manager, to set up the client or SSH server it needs to get started.
Terraform: Vagrant comes in handy before you can use Ansible, if you maintain your own fleet of VMs. If you're in the cloud, Terraform can be used to declaratively provision VMs, set up networks, or do basically anything you can handle with the UI, API, or CLI of your favorite cloud provider. Feature support may vary depending on the actual provider, and providers mostly come with their own IaC solutions as well, but if you prefer not to be locked in to a platform, Terraform might be the best solution to go with.
Kubernetes: Container orchestration systems are considered Infrastructure as Code as well: with Kubernetes especially, you have control over the internal network, the containers, and a lot of aspects of the actual machines; it's basically more like an OS in its own right than anything else. However, it requires you to have a running cluster of VMs with Kubernetes installed and configured.
All in all, you can use either Vagrant or Terraform to lay the groundwork for your fleet of VMs, then use Ansible, Puppet, Chef or Salt to handle their configuration continuously. Finally, Kubernetes can give you a way to orchestrate your services on them.
Are you looking for expert help with infrastructure related issues or project? Check out our DevOps and Infrastructure related services, or reach out to us at [email protected].
We’ve previously written a lot about Kubernetes, so this time we’ll take a step back and look at our favorite remote configuration management tool:
What is Ansible?
Let’s take apart what we already know:
Ansible is a push-based IaC, providing a user-friendly domain-specific language so you can define your desired architecture in a declarative way.
Being push-based means that Ansible uses SSH for communicating between the machine that runs Ansible and the machines the configuration is being applied to.
The machines we wish to configure using Ansible are called managed nodes or hosts. In Ansible’s terminology, the list of hosts is called an inventory.
The machine that reads the definition files and runs Ansible to push the configuration to the hosts is called a control node.
How to Install Ansible
It is enough to install Ansible only on one machine, the control node.
Control node requirements are the following:
Python 2 (version 2.7) or Python 3 (versions 3.5 and higher) installed
Windows is not supported as a control node, but you can set it up on Windows 10 using WSL
Managed nodes also need Python to be installed.
RHEL and CentOS
sudo yum install ansible
Debian based distros and WSL
sudo apt update
sudo apt install software-properties-common
sudo apt-add-repository --yes --update ppa:ansible/ansible
sudo apt install ansible
MacOS
The preferred way to install Ansible on a Mac is via pip.
pip install --user ansible
Run the following command to verify the installation:
ansible --version
Ansible Setup, Configuration, and Automation
For the purposes of this tutorial, we’ll set up a Raspberry Pi with Ansible, so even if the SD card gets corrupted, we can quickly set it up again and continue working with it. Setting it up by hand would mean the following steps:
Flash image (Raspbian)
Login with default credentials (pi/raspberry)
Change default password
Set up passwordless SSH
Install packages you want to use
With Ansible, we can automate the process.
Let’s say we have a couple of Raspberry Pis, and after installing the operating system on them, we need the following packages to be installed on all devices:
vim
wget
curl
htop
We could install these packages one by one on every device, but that would be tedious. Let Ansible do the job instead.
First, we’ll need to create a project folder.
mkdir bootstrap-raspberry && cd bootstrap-raspberry
We need a config file and a hosts file. Let’s create them.
touch ansible.cfg
touch hosts   # file extension not needed
Ansible can be configured using a config file named ansible.cfg. You can find an example with all the options here.
Security risk: if you load ansible.cfg from a world-writable folder, another user could place their own config file there and run malicious code. More about that here.
Ansible searches for the configuration file in the following order:
ANSIBLE_CONFIG (environment variable if set)
ansible.cfg (in the current directory)
~/.ansible.cfg (in the home directory)
/etc/ansible/ansible.cfg
So if we have an ANSIBLE_CONFIG environment variable set, Ansible will ignore all the other files (2, 3, 4). On the other hand, if we don't specify a config file, /etc/ansible/ansible.cfg will be used.
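For instance, to point Ansible at a specific config file for a single run, you can set the environment variable inline (the path here is illustrative):

ANSIBLE_CONFIG=/path/to/custom.cfg ansible-playbook pi-setup.yml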
Now we’ll use a very simple config file with contents below:
[defaults]
inventory = hosts
host_key_checking = False
Here we tell Ansible to use our hosts file as the inventory, and not to check host keys. Ansible has host key checking enabled by default. If a host is reinstalled and has a different key in the known_hosts file, this results in an error message until corrected. If a host is not initially in known_hosts, it results in an interactive confirmation prompt, which is not favorable if you want to automate your processes.
Now let’s open up the hosts file:
[raspberries]
192.168.0.74
192.168.0.75
192.168.0.76

[raspberries:vars]
ansible_connection=ssh
ansible_user=pi
ansible_ssh_pass=raspberry
We list the IP address of the Raspberry Pis under the [raspberries] block and then assign variables to them.
ansible_connection: Connection type to the host. Defaults to ssh. See other connection types here
ansible_user: The user name to use when connecting to the host
ansible_ssh_pass: The password to use to authenticate to the host
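Before writing any playbooks, it's worth checking that Ansible can actually reach the hosts. Assuming the inventory above, Ansible's ad-hoc mode with the ping module does exactly that:

ansible raspberries -m ping

Each Raspberry Pi that is reachable over SSH (and has Python installed) should answer with "pong".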
Creating an Ansible Playbook
Now we’re done with the configuration of Ansible. We can start setting up the tasks we would like to automate. Ansible calls the list of these tasks “playbooks”.
In our case, we want to:
Change the default password,
Add our SSH public key to authorized_keys,
Install a few packages.
That means we’ll have three tasks in our playbook, which we’ll call pi-setup.yml.
By default, Ansible will attempt to run a playbook on all hosts in parallel, but the tasks in the playbook are run serially, one after another.
Let’s take a look at our pi-setup.yml as an example:
- hosts: all
  become: 'yes'
  vars:
    user:
      - name: "pi"
        password: "secret"
        ssh_key: "ssh-rsa …"
    packages:
      - vim
      - wget
      - curl
      - htop
  tasks:
    - name: Change password for default user
      user:
        name: "{{ item.name }}"
        password: "{{ item.password | password_hash('sha512') }}"
        state: present
      loop: "{{ user }}"
    - name: Add SSH public key
      authorized_key:
        user: "{{ item.name }}"
        key: "{{ item.ssh_key }}"
      loop: "{{ user }}"
    - name: Ensure a list of packages installed
      apt:
        name: "{{ packages }}"
        state: present
    - name: All done!
      debug:
        msg: Packages have been successfully installed
Breaking down our Ansible Playbook Example
Let's break this playbook down piece by piece.
- hosts: all
  become: 'yes'
  vars:
    user:
      - name: "pi"
        password: "secret"
        ssh_key: "ssh-rsa …"
    packages:
      - vim
      - wget
      - curl
      - htop
  tasks: [ … ]
This part defines fields that are related to the whole playbook:
hosts: all: Here we tell Ansible to execute this playbook on all hosts defined in our hostfile.
become: yes: Execute commands as sudo user. Ansible uses privilege escalation systems to execute tasks with root privileges or with another user’s permissions. This lets you become another user, hence the name.
vars: User-defined variables. Once you’ve defined variables, you can use them in your playbooks using the Jinja2 templating system. There are other sources vars can come from, such as variables discovered from the system. These variables are called facts.
tasks: List of commands we want to execute
Let’s take another look at the first task we defined earlier, without addressing the user module’s details. Don’t fret if this is the first time you’ve heard the word “module” in relation to Ansible; we’ll discuss modules in detail later.
tasks:
  - name: Change password for default user
    user:
      name: "{{ item.name }}"
      password: "{{ item.password | password_hash('sha512') }}"
      state: present
    loop: "{{ user }}"
name: Short description of the task making our playbook self-documenting.
user: The module the task at hand configures and runs. Each module is an object encapsulating a desired state. These modules can control system resources, services, files or basically anything. For example, the documentation for the user module can be found here. It is used for managing user accounts and user attributes.
loop: Loop over variables. If you want to repeat a task multiple times with different inputs, loops come in handy. Let’s say we have 100 users defined as variables and we’d like to register them. With loops, we don’t have to run the playbook 100 times, just once.
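As a minimal illustration of a loop, separate from our playbook, this task would print each element of a list variable, one iteration per element:

- name: Print each package name
  debug:
    msg: "{{ item }}"
  loop: "{{ packages }}"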
Understanding the Ansible User Module
Zooming in on the user module:
user:
  name: "{{ item.name }}"
  password: "{{ item.password | password_hash('sha512') }}"
  state: present
loop: "{{ user }}"
Ansible comes with a number of modules, and each module encapsulates logic for a specific task/service. The user module above defines a user and its password. It doesn’t matter if it has to be created or if it’s already present and only its password needs to be changed, Ansible will handle it for us.
Note that Ansible will only accept hashed passwords, so either you provide pre-hashed characters or - as above - use a hashing filter.
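If you'd rather pre-hash the password yourself instead of using a filter, a one-liner like the following generates a suitable SHA-512 crypt hash on Linux (the crypt module ships with Python, though it has been deprecated in newer Python versions):

python3 -c 'import crypt; print(crypt.crypt("secret", crypt.mksalt(crypt.METHOD_SHA512)))'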
For the sake of simplicity, we stored our user’s password in our example playbook, but you should never store passwords in playbooks directly. Instead, you can pass variables as flags when running the playbook from the CLI, or use a password store such as Ansible Vault or the 1Password module.
Most modules expose a state parameter, and it is best practice to explicitly define it when it’s possible. State defines whether the module should make something present (add, start, execute) or absent (remove, stop, purge). Eg. create or remove a user, or start / stop / delete a Docker container.
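To illustrate the same state idea with a different module, here is a hedged sketch using the docker_container module (the container name is made up): setting state to absent ensures the container is stopped and removed, regardless of whether it was running, stopped, or never existed.

- name: Ensure the old application container is gone
  docker_container:
    name: my-app
    state: absent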
Notice that the user module will be called at each iteration of the loop, passing in the current value of the user variable. The loop is not part of the module; it's on the outer indentation level, meaning it's task-related.
The Authorized Keys Module
The authorized_keys module adds or removes SSH authorized keys for a particular user’s account, thus enabling passwordless SSH connection.
- name: Add SSH public key
  authorized_key:
    user: "{{ item.name }}"
    key: "{{ item.ssh_key }}"
The task above takes the specified key and adds it to the specified user’s ~/.ssh/authorized_keys file, just as you would by hand or by using ssh-copy-id.
The Apt module
We need a new vars block for the packages to be installed.
vars:
  packages:
    - vim
    - wget
    - curl
    - htop
tasks:
  - name: Ensure a list of packages installed
    apt:
      name: "{{ packages }}"
      state: present
The apt module manages apt packages (such as for Debian/Ubuntu). The name field can take a list of packages to be installed. Here, we define a variable to store the list of desired packages to keep the task cleaner, and this also gives us the ability to overwrite the package list with command-line arguments if we feel necessary when we apply the playbook, without editing the actual playbook.
The state field is set to present, meaning that Ansible should install the package if it's missing, or skip it if it's already present. In other words, it ensures that the package is present. It could also be set to absent (ensure that it's not there), latest (ensure that it's there and is the latest version), build-deps (ensure that its build dependencies are present), or fixed (attempt to correct a system with broken dependencies in place).
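For example (a sketch, not part of the original playbook), swapping the state to latest and refreshing the package cache first would keep the same list of packages at their newest available versions:

- name: Ensure the listed packages are at their latest versions
  apt:
    name: "{{ packages }}"
    state: latest
    update_cache: yes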
Let’s run our Ansible Playbook
Just to reiterate, here is the whole playbook together:
- hosts: all
  become: 'yes'
  vars:
    user:
      - name: "pi"
        password: "secret"
        ssh_key: "ssh-rsa …"
    packages:
      - vim
      - wget
      - curl
      - htop
  tasks:
    - name: Change password for default user
      user:
        name: "{{ item.name }}"
        password: "{{ item.password | password_hash('sha512') }}"
        state: present
      loop: "{{ user }}"
    - name: Add SSH public key
      authorized_key:
        user: "{{ item.name }}"
        key: "{{ item.ssh_key }}"
      loop: "{{ user }}"
    - name: Ensure a list of packages installed
      apt:
        name: "{{ packages }}"
        state: present
    - name: All done!
      debug:
        msg: Packages have been successfully installed
Now we’re ready to run the playbook:
ansible-playbook pi-setup.yml
Or we can run it while overriding the configuration from the command line:
ANSIBLE_HOST_KEY_CHECKING=False ansible-playbook -i "192.168.0.74, 192.168.0.75" -e "ansible_user=john ansible_ssh_pass=johnspassword" -e '{"user": [{ "name": "pi", "password": "raspberry", "state": "present" }] }' -e '{"packages":["curl","wget"]}' pi-setup.yml
The command-line flags used in the snippet above are:
-i (inventory): specifies the inventory. It can either be a comma-separated list as above, or an inventory file.
-e (or --extra-vars): variables can be added or overridden through this flag. In our case we are overriding the configuration laid out in our hosts file (ansible_user, ansible_ssh_pass) and the user and packages variables that we previously set up in our playbook.
What to use Ansible for
Of course, Ansible is not used solely for setting up home-made servers.
Ansible is used to manage VM fleets in bulk, making sure that each newly created VM has the same configuration as the others. It also makes it easy to change the configuration of the whole fleet together by applying a change to just one playbook.
But Ansible can be used for a plethora of other tasks as well. If you have just a single server running at a cloud provider, you can define its configuration in a way that others can read and use easily. You can also define maintenance playbooks, such as one that creates new users and adds the SSH keys of new employees to the server so they can log into the machine too.
Or you can use AWX or Ansible Tower to create a GUI based Linux server management system that provides a similar experience to what Windows Servers provide.
Next time, we’ll dive deeper into an enterprise use case of Ansible with AWX.
Getting Started with Ansible a.k.a. how to Automate your Infrastructure published first on https://koresolpage.tumblr.com/
0 notes
Text
From the Archives: The Nintendo Niino and Super Smash Bros. N
I have a rather large folder of ideas I'd come up with, liked enough to bother recording, but didn't or couldn't actually pursue. Some of them are worth looking at again, particularly when intervening events cast it in a new light. (And if this post is received well, it might become a regular feature.)
In this case, it's an idea for a Nintendo console, which turned out to be surprisingly similar to the Switch, even though, with a timestamp of September 27, 2014, it predates any public knowledge of the Switch, and possibly even predates the Switch development project. While I rewrote it into a more readable format, I have not changed any of the actual details - preserving the state of the idea as it was when I conceived it. It's not a detailed plan, more an executive-summary proposal.
2014 was a lackluster year for Nintendo - sales were plummeting, the Wii U had fizzled, and the New 3DS was flailing. Even fans could tell the company was struggling.
I made a key insight, though I don't claim it was a unique one: the gaming console market is under pressure from two directions. Going back to the 90s, the cheaper handheld consoles served as the entry point for new gamers, with home consoles offering a superior experience at higher cost, and gaming PCs being yet more expensive. But the rise of smartphones has devoured the low end - an iPhone or Galaxy or Nexus is an objectively worse gaming platform than any handheld console, but since they're essentially free as gaming devices (since consumers will be buying a smartphone anyways for communication), they offer the cheapest entry point. The rise of cheap gaming PC hardware (itself an effect of slowing desktop sales) has pressured the high end, driving home consoles to lower price points. This prediction turned out to be fairly true, although I also thought "nanoconsoles" like the Ouya would contribute to the demise of the two-tier console market, which completely failed to happen.
The logical conclusion was that handheld and home consoles will need to merge or displace each other. Sony simply gave up on their portable line, focusing their in-house developers on the home console. Nintendo could easily have given up on their home console line, throw everything onto the 3DS or its successor, and hope to compete with Sony and Microsoft on the merits of portability and game design rather than technical specifications.
But that's only the obvious way to do things. There are problems with that approach - Nintendo has a much longer history than Sony of maintaining both form factors, and there are many Nintendo fans who would be angered if Nintendo chose to abandon one or the other. A non-obvious solution is needed - and a non-obvious solution is what we got, with the Switch. But let's take a look at the idea I came up with, because it took a third approach.
The console I came up with was to be named the "Niino", punning off "Nintendo", "Nano", and the double-i pattern used in the Wii, Mii and Amiibo. The name is kind of dorky, I'll admit, but it's better than "Wii" and that thing sold like hotcakes.
The core principle was that you couldn't make a handheld that worked as a home console as well as a dedicated home console would, and vice versa. The two need to be fully software-compatible, but even just making an ergonomic controller fights against being able to put it in your pocket. So I didn't go as far as Nintendo ultimately chose to - I still had different hardware for home and handheld use. But they were to be different SKUs of the same console, not separate consoles - all games would run on both, down to using the same cartridges.
I called the two versions the "Niino Home" and "Niino Pocket". I specified that both were to use the same architecture. An eight-core ARM-64 CPU was specified - either K12 or Denver, depending on whether AMD or Nvidia was offering a better deal. I further specified a fully-unified memory architecture, with 16GB of GDDR6 memory. (I will note that I also wrote that it would be released in 2020, so my use of stuff that doesn't even exist three years later isn't completely groundless). Games would be stored on internal flash memory (512GB on the Home, 128GB on the Pocket), or on removable cartridges. As a minor twist, the cartridges would be partially-writable - patches would get downloaded and stored, and your game saves would be on the cartridge itself, in a special R/W memory segment.
The Niino Pocket was spec'd with a 4.5" 1920x1080 screen, featuring capacitive touchscreen capabilities (aka multitouch). The Niino Home was specified to target 3840x2160 output resolution (fed over Mini-DisplayPort or HDMI), with an expectation that it would normally downscale to 1920x1080 (getting some free antialiasing in the process). To give the Home enough processing horsepower to render the same game at quadruple resolution, I gave it twice as many GPU compute units, running at twice the clockspeed, and bumped the CPU clocks up by 50% (while keeping the CPU core count the same). That math checks out if the CPU on the Pocket spends 25% of its time on controlling the GPU, and game logic does not scale with resolution, which are generally reasonable assumptions.
For controls, I didn't do much unusual. Two analog sticks - full thumbsticks on the Home controllers, smaller PSP-style "nubs" on the Pocket. A D-Pad. Four main face buttons, presumably A/B/X/Y. Four secondary face buttons, two per side - Plus, Minus, Home and Share. I had two analog triggers, with a digital "click" at full travel, like the Gamecube did, as well as two bumpers. And to finish it off, I listed dual accelerometers - they aren't all that useful, but they're so cheap now, why not include them?
Games would be required to run well on both the Pocket and Home, as part of certification. Thanks to the identical architecture, OS, and even screen aspect ratio, the programming to support both would be minimal. Assets could be authored for the Home, and downscaled for the Pocket, or authored for the Pocket and simply reused, not taking advantage of the extra power. Some UI elements might need to be redesigned to work better on the Niino Pocket's physically smaller screen, or more particles might spawn on the more powerful Niino Home, but I was aiming for it to be easier than developing a smartphone app that runs well on multiple screen sizes. I even mandated that the cartridges themselves be the same - if you for some reason buy both a Home and a Pocket, you would still only buy the game once, and play it on both. With savefiles being stored on the cartridge itself, that could be a very useful way to play - almost as seamless as the Switch ended up being, although significantly more expensive if you want that capability.
That unified architecture would allow Nintendo to stop splitting their development resources across two consoles, which would in turn allow Nintendo to develop a more robust library of first-party games. As Nintendo would offer the only viable handheld console, it would make them a more attractive target for third-party devs.
But Nintendo consoles sell because of Nintendo games, so that's what I had plenty of. I listed two pack-in titles, four additional launch games, another four titles within the first year, and five in the second year, as well as as many third-party titles as possible. My third-parties list is amusing from today's viewpoint - I correctly predicted that there would still be annual Call of Duty, Assassin's Creed, and Monster Hunter titles, but I seriously missed the mark by listing "Sonic Boom 3".
The Niino would have two pack-in titles. Nintendo Sports was supposed to be a deeper follow-up to Wii Sports, adding campaign modes, character customization and even some map-making (on the Golf game). But that was really just there for the sake of having a "complete" pack-in game. In retrospect, this was a horrible idea - it would either miss entirely what made Wii Sports a system-seller, or would require compatibility with Wiimotes, which makes it a horrible way to show off the new system.
The other pack-in game was "Super Smash Bros. N", which was a free-to-play Smash... kind of. The pack-in version would include only a minimum of characters, items, and levels (I listed Mario, Link, Donkey Kong and Pikachu, along with a Mii-based custom fighter). But any Niino game could add more - first-party or third-party. That game's developers would be responsible for all the assets and initial programming, although balance patches would come from Nintendo's Smash team, and certification would make sure it worked right and at least came close to being balanced.
So when you bought the obligatory Mario launch title, you automatically get (for example) Peach and Luigi, along with a stage and some items, added to Smash. Spla2oon (I am still surprised that's not what Nintendo's actually calling it) would add Inkling. Metroid: Paralysis would add Samus (don't ask for details on the games themselves, I was just making up titles and one-sentence concepts). Call of Duty would presumably add one of the Captains Price. Even Virtual Console games could get in on it - some people would totally spend another $15 to buy Final Fantasy VI again if it gave you a Terra assist trophy (I think full playable characters are too much to expect for a VC game).
As additional ways to get that content (it is kind of scummy to lock it behind a game you may not want, if you're just a hardcore Smash player), they could also be sold separately, as normal DLC, or bundled with an Amiibo, assuming those continue to sell.
I really, really like this idea. It solves two problems with the Smash series - first, it makes it possible for characters from games released after Smash to appear in the current version, instead of waiting for the next console, and second, it allows the game to eventually have the kind of mammoth size that made Brawl such a wonder. I remember the Brawl spoiler season - the hype was unimaginable. Just when you thought they were done, they dropped more on you. And it will even have more good effects - it acts as a portal to game discovery, and gives a bump to the Niino version of multiplatform games. If you're playing Smash, either at a party or online, and you encounter a character you've never seen before, that might spark an interest in the game they came from. And if you're a hardcore shooter player looking at which new console to buy, maybe the free Smash characters would be enough to tip you towards the Nintendo platform.
Would the Niino have been a better console than the Switch? Maybe. It probably wouldn't have sold as well, because the messaging would be more complicated. And is one do-it-all console better than two single-purpose consoles that do their job better? That would depend on how much better the specialized hardware works, which could only be determined by actually building the things.
Would Super Smash Bros. N have worked well as a system-seller? I think it would, although it would require a permanent support team at Nintendo, something they don't seem to do. It might anger the hardcore Smash fans at first, but it would make for an overall larger game (for free!), and half of them are still playing Melee anyways.
What are your thoughts?
0 notes