zigotica · 9 years
Text
rsync backup into Synology NAS
I love my Synology. Especially the Download and Audio Station apps. SickRage is awesome as well. But when it comes to the main purpose of a NAS, making backups is F.R.U.S.T.R.A.T.I.N.G.
Time Machine backups work pretty well… until they fail and you're told to start anew, which happens about once every 3 or 4 months. No, it's not a quota issue. Not a permissions issue either. It stops working for no reason and OS X tells you to create a new backup from scratch. This makes the whole backup system unreliable.
The official DS Cloud app is another solution, but as far as I know it only accepts one source folder (à la Dropbox), which lacks any kind of flexibility.
rsync to the rescue! I feel dirty owning a Synology and having to resort to rsync to make proper backups, but…
Setting up Synology for network backups
Basically, we will be opening network connections to a specific user for a specific volume in the NAS.
Enable network backup service
Tick the Enable Network backup service box, found in Main Menu > Backup and Replication > Backup Services > Network Backup Destination. This action creates a NetBackup shared folder.
rsync user
Create an rsync user and configure the following three tabs:
User groups: rsync should be in the System default group.
Permissions: rsync should be granted read-write access only to the NetBackup shared folder.
Applications: rsync should be allowed only for the Network Backup Destination.
Fixed IP
To make things easier, disable DHCP in Control Panel > Network > Network Interface and give the NAS a fixed IP. This IP will be used inside the bash script described below.
Setting up OS X to rsync data to NAS
Now that the NAS is ready, we will set up an rsync job for each of the source folders we want to back up, and then make the system run it on a daily basis.
List of folders to be backed up
Create a file that will hold your list of folders to be backed up. The file is named after your user name in the system; to get that name, just open a terminal and type id -un. If the username was username, the file would be username.txt. Its contents can be:
delete,/Users/username/.atom
delete,/Users/username/.bash_profile
delete,/Users/username/.config
delete,/Users/username/.gitconfig
keep,/Users/username/Documents/projects
delete,/Users/username/Documents/Dropbox
As you see, each line has two parts, a keyword and a path. The keyword is either delete or keep; this lets me configure whether I want rsync's --delete-during option or not for that specific path.
Please note: do not use ~/ in paths since bash will take / as a base and you would end up with something like /Users/username/~/….
And yes, paths here can have spaces, but please do not use spaces between keyword and path.
Password
In order to make automation easier (more on that later), we will save the rsync password you previously set up on the NAS into a file. We could have set it in an environment variable called RSYNC_PASSWORD in your .bash_profile, but that was causing problems when running the script from crontab.
So we create a file, for instance /Users/username/.config/.rsyncpwd, and write the password there. Then set the correct permissions with chmod 600 /Users/username/.config/.rsyncpwd.
Bash script to rsync to NAS
Create a file with .sh extension. For instance, I have placed this file in /Users/username/.config/rsyncnet_backup.sh.
Now the script magic:
#!/bin/bash
# Take user name from system so the script won't overwrite data
# at destination for other users using the same script.
USERNAME=`id -un`
# save script folder to a var
FOLDER=/Users/username/.config/
# path to password-file
PSWRD=/Users/$USERNAME/.config/.rsyncpwd
# save destination to a var
DEST=rsync@192.168.1.___::NetBackup/$USERNAME/
# reads a file named after your username
# each line of the file represents a flag,path
OIFS="$IFS"; IFS=$'\n'; lines=($(<$FOLDER/$USERNAME.txt)); IFS="$OIFS"
echo START `date` > $FOLDER/backup.log
zync() {
for i in "${lines[@]}"
do :
  # each line represents a deletable flag and path to be backed up
  # do not use ~/ in paths since bash will take / as a base
  OIFS="$IFS"; IFS=$','; line=($i); IFS="$OIFS"
  what=${line[0]}
  SOURCE=${line[1]}
  if [ "$what" == "delete" ]; then
    DEL="--delete-during"
  else
    DEL=""
  fi
  echo
  echo
  echo ----------------------------------------------------
  echo Syncing $SOURCE
  STARTTIME=$(date +%s)
  caffeinate -s rsync --password-file=$PSWRD -a --stats "$SOURCE" $DEST $DEL
  ENDTIME=$(date +%s)
  echo "Elapsed $(($ENDTIME - $STARTTIME)) seconds"
  echo
done
}
zync ${lines[@]}
echo END `date` >> $FOLDER/backup.log
The script is set up to log basic stats and elapsed times for each process, and to write the start/end date to a log file.
I recommend testing this with small folders and the --dry-run option at the beginning, then set up the correct folders once everything works as you wish.
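For instance, a first manual test could look like this (the test folder is a placeholder, and the destination uses the fixed IP you set earlier):
caffeinate -s rsync --password-file=/Users/username/.config/.rsyncpwd -a --stats --dry-run /Users/username/Documents/test rsync@192.168.1.___::NetBackup/username/
With --dry-run, rsync lists what would be transferred without writing anything to the NAS, so you can verify source and destination before the real run.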
Scheduling the backups
Once you have run the backup script at least once, and your folders are copied into the NAS, you can schedule it for easier automation. Remember no backup system is a real backup system if you have to do it manually.
We can use iCal, launchd/plist, or crontab. Supposedly, crontab is deprecated and not recommended on OS X, in favor of launchd or running the script from iCal. I had serious problems running the script from launchd (maybe permissions or env variables not properly set), and the iCal Open file custom alert is completely borked when calendars are shared through iCloud (your iPhone cannot run bash). If there were a way to have both local and iCloud calendars, this would be a no-brainer, though.
crontab
So I chose the crontab way.
First we will create a ~/.crontab file and include something like:
00 03 * * * /Users/username/.config/rsyncnet_backup.sh
Obviously, you need to adjust the time (the first two numbers, minute and hour) and path to your bash script.
Now we will tell cron daemon to get this file:
crontab ~/.crontab
crontab -l
The last command should return the contents of the crontab file you created; otherwise (if there was any problem) it will return something like crontab: no crontab for username.
Your mac is now ready to run your bash script every day at 03:00… or maybe not just yet. Keep reading.
Wake system from sleep
If your system goes to sleep after some time of inactivity and is not awake at the time the task is programmed, the task will not run. Thus, we need to schedule a wake-up of the system just 2 minutes before that time: open System Preferences and, under the Energy Saver options, schedule a startup or wake shortly before the backup time (e.g. 02:58 for our 03:00 cron job).
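If you prefer the terminal, pmset can set the same schedule (a sketch, with the time matching the 03:00 cron job above; see man pmset):
sudo pmset repeat wakeorpoweron MTWRFSU 02:58:00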
That's it: now, it-just-works™.
You have successfully set up a proper backup system for your Synology NAS.
zigotica · 9 years
Text
Pretty print JSON data
Quick trick to pretty print JSON data received from an API endpoint.
We use cURL to get the content and Python to format it. In the terminal:
curl http://www.endpoint.com/resource.json | python -mjson.tool
You can also send it to a file:
curl http://www.endpoint.com/resource.json | python -mjson.tool > pretty.json
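If you want to see the formatter in action without a network call, pipe any one-line JSON string through it:
echo '{"name":"zigotica","tags":["json","curl"]}' | python -mjson.tool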
zigotica · 9 years
Text
Git diff and patch
In the past months I've been silently updating my old post Git not-so-used commands. Today I am adding a new trick I learnt this week (ht ozke) that makes it much easier to apply the same changes to different files.
Change whatever you need to change in one of the files.
For instance, we modify forms/a.js and commit the changes.
Create the patch file using git diff
git diff from-commit to-commit > forms.diff
If we modified several files and we just want the diff for one of them, we can pass the file path as in:
git diff from-commit to-commit file.ext > forms.diff
Use the following Linux command to apply the patch to the other file:
patch forms/b.js forms.diff
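Putting it all together, a minimal session could look like this (using HEAD~1 and HEAD as example commits):
git diff HEAD~1 HEAD forms/a.js > forms.diff
patch forms/b.js forms.diff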
zigotica · 10 years
Text
Git not-so-used commands
I know there are thousands of posts about Git basics, tricks and whatever, but I always end up using terminal aliases for common tasks (commit, push, pull, status, checkout) or going to a physical notebook to see less-used commands. I have them, but I hardly ever check my bookmarks (or google). So I decided to move that tricks sheet here on the internets. Nothing new, just for convenience.
Create a branch and checkout in one step
git checkout -b branch_name
Push all local branches to origin
git push --all
Force upload of branch to remote as master
git push -u origin branch_name:master
Do not ever do that to your team members. I needed it sometimes for my own server, before I had this hook to save branches into different folders.
Revert a commit
git revert commit_id
Change commit message (before push)
git commit --amend
Download a branch from remote
git fetch origin
git checkout branch_name
Remove a branch from remote
git push origin --delete branch_name
Create and push tag with long description
git tag tag_name -a # opens editor window to write desc
git push --tags
Remove remote tag
git tag -d tag_name
git push origin :refs/tags/tag_name
Create branch from a previous commit
git checkout -b branch_name commit_id
Create branch from unstaged changes
git stash
git checkout -b branch_name
git stash apply
Pack last n commits into one
git rebase -i HEAD~n
This opens an editor where you can select the commits you want to keep as separate (p for pick) or squash into the previous one (s for squash), and even use a different commit message.
Remove files from last commit (before pushing)
Imagine you use an alias to add all files and commit at the same time, and you did not want all changed files in your last commit. To remove those files you need to:
git reset --soft HEAD~1
git reset HEAD files/you/dont/want/to/commit
Find commit where bug was introduced
This literally saved my ass many times. We first need to know a past commit_id where the feature was working. Then:
git bisect start HEAD goodcommit_id
This works in cycles, checking out a detached commit halfway between that good commit and the present, so we can test whether it is good or bad. When a commit is checked out, we test and let bisect know whether that specific commit is good or bad:
git bisect good
or
git bisect bad
This cycle repeats until you find the offending commit. Finally, reset to go back to HEAD:
git bisect reset
Search for a string through past commits
git grep --heading -n "string" $(git rev-list --all)
This gives you a list of files whose content contains the string, including past commits, showing line numbers.
See commits that changed a file
git log --follow path/to/file
This includes deletions, so it can be used to see which commit deleted a file by adding another argument (-1 limits the result to one commit):
git log -1 --follow path/to/file
List commits in branch B not part of branch A
git log B ^A
For instance, when you want to know which commits belong exclusively to a branch (commits that were not merged into the other branch yet).
Squash all commits in a branch
git reset $(git commit-tree HEAD^{tree} -m "msg")
That is actually cleaner and faster than an interactive rebase, not to mention you avoid having to mark individual commits as squash or pick. It basically creates a commit from a detached HEAD and resets the branch to that commit.
Cherry pick several commits
git cherry-pick A B C
If you have many consecutive commits you can also use this, where A is the first and B is the last commit
git cherry-pick A^..B
Checkout all branches after a clone
git branch -a | grep -v HEAD | perl -ne 'chomp($_); s|^\*?\s*||; if (m|(.+)/(.+)| && not $d{$2}) {print qq(git branch --track $2 $1/$2\n)} else {$d{$_}=1}' | csh -xfs
zigotica · 10 years
Text
Git post-receive hook to save branches into different folders
Saving a repository content into a server public folder is quite easy with a post-receive hook, see this gist for an example.
But what if we want to show demos to clients and, at the same time, be able to push other branches to the same repository and serve them from hidden folders? For instance, you want one public folder for master (even renamed to /current) and then one for each development branch.
I created this for my gitlab setup:
#!/bin/sh
# modified by zigotica from
# http://www.ekynoxe.com/git-post-receive-for-multiple-remote-branches-and-work-trees/
# way more flexible now!
while read oldrev newrev ref
do
  branch=`echo $ref | cut -d/ -f3`
  if [ "master" = "$branch" ]; then
    folder="current"
  else
    folder="${branch}"
  fi
  ROUTE="/usr/share/nginx/somewhere"
  REPO="reponame"
  FINAL="${ROUTE}/${REPO}/${folder}"
  git --work-tree=${FINAL} checkout -f $branch
done
You will need to give your git user permissions to write to those folders, for instance:
chown git.git branchpath
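Also remember that the hook file must be executable and the destination folders must exist, something like this (paths follow the example above):
chmod +x hooks/post-receive
mkdir -p /usr/share/nginx/somewhere/reponame/current
chown git.git /usr/share/nginx/somewhere/reponame/current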
zigotica · 10 years
Photo
I hope you will love this one as much as I do
[Five photos]
Sooo, like two months ago I promised myself that, whenever I went outside, I’d bring my beloved 5-year-old GF1 camera with me (“I’m going to take photos all the time”, I said to myself). I can’t show you pictures of the moment I broke my oath because, of course, my beloved 5-year-old GF1 camera stayed at home the whole time, unaware of my betrayal.
A couple of days ago, however, after successfully bringing my photographic device to some places and making click click click several times, I decided to give it another try.
I also created a photo diary, because I like failing bigger every time.
I hope you like it.
zigotica · 11 years
Text
GitLab Repository Management (book)
I've been reading GitLab Repository Management lately [disclaimer: courtesy of packtpub]. It's a nice book, well organised, with easy-to-follow instructions to set up your environment, plus some basic details for a complete workflow. The book is well worth it. If I had to choose weak points I would say:
More examples. Never enough of these.
More on hooks. I'm a bit biased here, ok. I've had some trouble with the post-receive hook (somewhere between GitLab 4 and 6 the default post-receive hook was lost in newly created repos; I don't know who to blame here, but I expected the book to at least acknowledge the fact when mentioning the post-receive hook).
More info on Sidekiq. Or rather, any info on background processes; there is none.
As I said, the book is well worth reading. Good work and thanx for sending the book for review!
zigotica · 11 years
Link
A protip by zigotica about javascript, css3, transition, and transitionend.
zigotica · 11 years
Text
Sass, Compass and source maps in Webkit Devtools
If you want to use the awesome source maps for Sass in Devtools, you will have to:
Go to chrome://flags/ and Enable Developer Tools experiments, then restart Chrome.
Open Devtools and check Enable source maps in General tab and Support for Sass in Experimental tab.
Install the latest Sass prerelease from the console: gem install sass --pre (you might need sudo)
Just adding sass_options = {:sourcemap => true} to config.rb in Compass won't work (just yet), and you can even get Compass conflicts with the aforementioned Sass alpha version. In my case, Compass 0.12.2 (Alnilam) does not get on very well with Sass 3.3.0.alpha.103 (Bleeding Edge), though it might be some other gem conflict.
For that reason, we will have to watch changes using Sass directly, with the sourcemap option:
sass --watch --sourcemap sources/compass:public/css
This generates a .map file for each source, which is the information Devtools uses to let you trace/edit the original Sass files directly in the browser.
zigotica · 12 years
Text
Checklist to install multiuser development environment in OS X Mountain Lion
This list is not exhaustive, just my bare minimum for frontend purposes. I'm not dealing with Rails, PostgreSQL, MongoDB or other server tools since it's not my day-to-day use. Just yet. It's surprising that you need so many console tools these days to do frontend, but things fall apart, and most of them really improve your performance in many ways.
Installation for one or multiple users would be the same, but there are tasks we will perform as the admin user so that we can share part of the environment across multiple users, who will then only need to configure personal data (Git, $PATH, Sublime or another editor, …). We are using Mountain Lion, but the following should work the same on older systems, down to at least Leopard or Snow Leopard.
Developer environment installation by admin user:
If we are going to use the complete Apple developer setup, including iOS simulators, we need to install Xcode (from the Mac App Store), then open it and, in Preferences > Downloads, install the command line tools and the needed simulators.
If we don’t need iOS simulators, just go to https://developer.apple.com/downloads and download, then install the Command line tools for your system. It’s around 200Kb only.
Open a Terminal window and check if you have properly installed build tools:
which gcc
(it should return /usr/bin/gcc)
Then we will install homebrew, a nice package manager for OS X, from the terminal:
ruby -e "$(curl -fsSL https://raw.github.com/mxcl/homebrew/go)"
We are going to install Git now to have version control in our projects without the need of a centralised repository:
brew install git
Update homebrew now:
brew update
Install rbenv to manage different versions of Ruby:
brew install rbenv
To use Homebrew's directories rather than ~/.rbenv add to your profile:
export RBENV_ROOT=/usr/local/var/rbenv
Then
brew install ruby-build
We are going to install Compass, a CSS framework that uses Sass stylesheets (make sure you code your Sass with 'compass watch' running in a terminal tab):
sudo gem install compass
Install node; several development tools are distributed as npm packages these days:
brew install node
We recommend prepending the following path to your PATH environment variable to have npm-installed binaries picked up: /usr/local/share/npm/bin
We love Grunt, a Javascript task runner. We use it to build our frontend files, minify, run JSLint and several other tasks. From version 0.4 on, the command line tools are not included; you need to install them separately:
npm install -g grunt-cli
Now you can install grunt locally in each of your project folders (even a specific version for each project). The best approach is to follow the official Getting started guide: create a package.json, then use npm install --save-dev to install npm modules locally, as shown below.
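For illustration, a minimal package.json can be as simple as this (name and version are placeholders):
{
  "name": "myproject",
  "version": "0.1.0"
}
After that, npm install grunt --save-dev downloads Grunt into node_modules and records it under devDependencies.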
PhantomJS is used to create files from Javascript in the terminal. It powers one of our favorite tools, Grunticon, which allows us to throw SVG icons into a folder and convert them into data URIs inside your CSS, with PNG fallback. No more sprites :-)
brew install phantomjs
Install nanoc, a nice static site generator
sudo gem install nanoc
If we need nanoc's preview server, we need to install it:
sudo gem install adsf
We can use other preview servers, like Apache from our system, MAMP or even a LiveReload server.
Install MAMP, which lets you have Apache, MySQL and PHP environment in a second.
We are going to configure the /etc/hosts file according to this tutorial: How to use vhosts in Wordpress development; see the example entries below.
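For illustration, the resulting /etc/hosts entries look something like this (hypothetical local domains):
127.0.0.1 mysite.local
127.0.0.1 clientproject.local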
Install Go2Shell (from the Mac Store), which opens a terminal window to the current directory in Finder. We just need to drag the app to the icons bar in Finder.
We install Chrome now, with LiveReload, Web Developer, YSlow plugins.
If you're going to use Wordpress, I recommend installing wp-cli so you can perform admin tasks from the terminal. You will need to add a symlink for mysqldump:
sudo ln -s /Applications/MAMP/Library/bin/mysqldump /usr/local/bin/mysqldump
Now you can go to your Wordpress project folder in the terminal and, for instance, perform a database backup by just typing:
wp db export
Developer environment config for each non-admin user:
We have most of the environment set up; we just need some personal configurations. These will be performed by each non-admin user of the machine:
Open Finder and customize sidebars
We head to our personal folder ~/ and create an Applications folder. We are going to use this to install our own programs, since /Applications will be used by programs installed by the system or the admin for all users.
Install Dropbox (disclaimer, this is a recommendation link; both you and I will get extra space). Dropbox will let you have some Gb of your data backed up in the cloud and synchronized across your computers. It will require admin permissions to the folders in order to be installed.
Open /Applications folder and drag the app to the icons bar in Finder.
If you’re like me and prefer your Finder to reveal hidden files, open Terminal and type:
defaults write com.apple.finder AppleShowAllFiles -bool YES
killall Finder
Copy your public/private SSH keys from an older computer. If it’s the first time you need them (for Git, for instance), just create your keys.
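Creating a new key pair is a one-liner (use your own email as the comment):
ssh-keygen -t rsa -C "[email protected]"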
To configure your Git user, we can copy ~/.gitconfig from an older computer into ~/ or:
git config --global user.name "Your name"
git config --global user.email [email protected]
Install SublimeText2 into our personal ~/Applications. You can buy it too! Or use Textmate, vim or whatever.
We enable Sublime access from the terminal by pointing a symlink in our bin to it (beware, the link must point to the Sublime we just installed in our ~/Applications):
ln -s ~/Applications/Sublime\ Text\ 2.app/Contents/SharedSupport/bin/subl /usr/local/bin/subl
If we open a new terminal window or tab (cmd+t) we should be able to open Sublime by typing
subl
Install Sublime user prefs from an old computer, or use mine if you like them https://gist.github.com/zigotica/4954546
Install the package with color and typography settings (I keep a copy in Dropbox), usually found in
“~/Library/Application Support/Sublime Text 2/Packages/Color Scheme - Default”
Install Package Control from View > Show console, following these instructions
We can now install these packages from Package Control: Emmet (previously known as Zen Coding), SuperCalculator (nice for px/em conversions), GitGutter (awesome, to see changes from the last Git commit in the gutter)
Install your .bash_profile from an old computer into ~/ or https://gist.github.com/zigotica/4523081
Now check that your .bash_profile includes the PATH to our bin, grunt, and other tools:
export PATH=~/bin:/usr/local/share/npm/bin:/usr/local/share/npm/lib/node_modules/grunt/bin:/usr/bin:/bin:/usr/sbin:/sbin:/usr/local/bin
Alternatively, a much better approach is to separate each path into a specific var (thanx @happywebcoder for the tip):
export MY_BIN_PATH="~/bin"
export PATH="$MY_BIN_PATH:$PATH"
export NPM_BIN_PATH="/usr/local/share/npm/bin"
export PATH="$NPM_BIN_PATH:$PATH"
export GRUNT_BIN_PATH="/usr/local/share/npm/lib/node_modules/grunt/bin"
export PATH="$GRUB_BIN_PATH:$PATH"
Open Chrome and authenticate with your login/password in order to start sync'ing our favorites, plugins, and so on.
If we are going to develop responsive sites (or sites adapted to be viewed in mobile gadgets) you will want to install Adobe Edge Inspect which lets you synchronize what you see in your browser to your mobile device, instantly, and even inspect your device for a debug session.
Other programs you may like:
I don’t work nor get anything from the following, I just think they are awesome apps:
We love TotalFinder, your Finder with Tabs! Buy it, it’s worth it.
CloudApp is an awesome service that stores screen captures and other files in the cloud.
Buy and install Transmit FTP, but don’t forget to use Git locally :-)
Buy and install Divvy, a cool app that lets you manage your window sizes very easily with shortcuts.
zigotica · 12 years
Text
iOS phone number styling
iOS has an automatic feature that detects phone numbers and links them to a phone call when clicked.
The problem with this is twofold:
Many false positives 
Style for these numbers is really ugly (ugly as in default link style).
OPTION 1
If you want to avoid this feature you can add the following meta in your pages:
<meta name = "format-detection" content = "telephone=no">
which will prevent the phone from making calls to ALL phone numbers in your page. To reactivate this key functionality for some of the numbers, you need to manually add a link around each telephone number in your page, using tel: at the beginning of the href, as in:
<a hef="tel:1234567890">1234567890</a>
Then we just need to style the special links:
a[href^="tel:"] {color: inherit !important; background-color: inherit !important;}
OPTION 2
Since in some projects you will not be able to wrap numbers manually, and you still want to style what iOS detects as phone numbers, you can do it the other way around: let the OS add the special link for us (we will not add the special meta that disables this feature) and style how it looks. First you need to know how iOS links the numbers to the phone. Before:
1234567890
After:
<a hef="tel:1234567890" x-apple-data-detectors="true" x-apple-data-detectors-result="1"> 1234567890</a>
As you can see iOS wraps the number into a link with special attributes. Now you can just style them using attribute selector:
a[x-apple-data-detectors=true] {color: inherit !important; background-color: inherit !important;}
zigotica · 12 years
Text
Add custom poster frame to youtube video without the API.
The trick is adding the iframe inside an HTML comment. Javascript reads the comment contents and saves the iframe definition to a var. When the user clicks on the image, javascript overwrites the container's innerHTML using the iframe defined in the comment.
https://gist.github.com/4438876
This code also works in browsers that do not support window.postMessage (the official API uses postMessage).
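A minimal sketch of the idea (markup, ids and video URL are hypothetical; see the gist for the full version):
<div id="player">
  <img src="poster.jpg" alt="Play video">
  <!--<iframe width="560" height="315" src="//www.youtube.com/embed/VIDEO_ID?autoplay=1" frameborder="0" allowfullscreen></iframe>-->
</div>
<script>
var container = document.getElementById('player');
// find the comment node that holds the iframe definition (nodeType 8 = comment)
var iframeHTML;
for (var n = container.firstChild; n; n = n.nextSibling) {
  if (n.nodeType === 8) { iframeHTML = n.nodeValue; break; }
}
// on click, replace the poster image with the real embed
container.addEventListener('click', function () {
  container.innerHTML = iframeHTML;
});
</script>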
zigotica · 12 years
Link
amazing compilation
Books
Mindstorms: Children, Computers, And Powerful Ideas: a must read.
Computer Science Unplugged
Lauren Ipsum, “A story about computer science and other improbable things.”
Projects / Platforms
Mozilla Hackasaurus: “Hackasaurus makes it easy to mash up and...
zigotica · 12 years
Link
Awesome post by mrmamba:
It is always necessary to have backups of our sites, but sometimes it is costly and managing it is a real chore: ssh here and there, etc.
I am sharing with everyone a simple solution to manage the backup of a website and its database using Dropbox, rsync and tasks...
zigotica · 12 years
Text
SpainJS 2012 in a post
Finally, a Javascript event in Spain. I was tempted to organize one with some friends, but in the middle of the process we learned that the organizers of previous Rails conferences had the same idea, and that Teambox was also attempting to organize one in Barcelona. We postponed the idea and, frankly, the results of this first event in Madrid have been excellent. Kudos to the team!
THURSDAY
Workshops were cool. I could only attend the afternoon workshops, but they were really worth it. I was not asked for my registration ticket, which surprised me a lot given there had been quite a lot of tweets about only registered users being allowed in.
Dani Mata (@danimataonrails) talked about using Node.js, Express, Sequelize and Jade to create an API that would serve JSON. It was quite interesting, high level from the first moment, but given the bad wifi connection that made downloading all the dependencies slow, and a mysql problem I ran into later, I was not able to finish the examples. Quite good anyway. Slides and code here: http://danimata.com/spainjs/
Guillermo Gutiérrez (@ggalmazor) led a refactoring dojo. This being my first dojo, I can only say it was fun, worth it and inspiring. I suppose dojos are better with smaller groups (at least not all strangers), but it was real fun, so fun I even went for a stint. Gotta do more of these. PDF here: https://t.co/18N2BhzI
For next year I would suggest the organizers force Camon to provide a better connection; during workshops wifi is more important than at the event talks. It was not on par with the workshop contents.
FRIDAY
The registration process the next day was a bit slow and chaotic. The separation between conference tickets and conference + dinner tickets should have been made before the room entrance; there was enough space to separate those into two different queues. Once inside we faced the problem of getting a decent wifi connection; more on that in the final words.
Vicent Martí (@tanoku) from GitHub talked about GitHub's robot, Hubot, and how cool it is for performing all kinds of tasks: scripts, home-automation integrations, laughing at your coworkers and much more. Not much code was shown (the comic style in the slides was gorgeous), but when it was, it fitted well, explaining in a few lines how every task is a module with a name, a regexp and a callback. Hubot is open sourced, so you should give it a try. The talk was fun and inspiring. Slides here: https://speakerdeck.com/u/tanoku/p/intergalactic-javascript-robots-from-outer-space
Jeremy Ashkenas (@jashkenas), the creator of Backbone and CoffeeScript, talked about the evolution of javascript (specifically, Harmony) and the benefits of using CoffeeScript. I'm not a big fan of compilers, but I have to agree that he knows how to lead a group with his speech: very dynamic, to the point. I liked the splats and soaks examples, and the fact that you can inspect the compiler's inner workings using the --tokens or --nodes options. Slides and code here: http://cl.ly/HwNW
Karolina Szczur (@karolinaszczur) from Nodejitsu talked about simplicity in design, more specifically in CSS writing via frameworks. I don't think I understood the point of the talk. She seems to be a very good designer, but her presentation lacked focus. I know CSS, I use frameworks, I use Compass. She talked about 3 frameworks, leaving many behind; she talked about CSS generators and then about Sass, which are quite incompatible from my point of view. Sorry, I don't buy it. Slides here: https://speakerdeck.com/u/karolinaszczur/p/the-pursuit-of-simplicity
Jakob Mattsson (@jakobmattsson) from Burt started with an unfortunate joke about Berlusconi and Francisco Franco. Really, out of place and not fun at all. Then he talked about something he really knows: writing a RESTful API with Node.js. The speech was nice, with lots of tips, tricks and suggestions of other middleware like Connect, and code examples of how to use node, express, connect and mongoose together, in a lot of code slides too. I wrote down this sentence: “db schemaless is a lie”. Slides here: https://speakerdeck.com/u/jakobmattsson/p/writing-restful-web-services-using-nodejs
Horia Dragomir (@hdragomir) from Wooga was the man of the day (an honour shared with Vicent Martí). He gave a speech on how to create fast mobile UIs, with plenty of code examples, short and to the point: using event bubbling, HTML5 APIs, touch events, viewport scaling, and so on. If that wasn’t enough, he is very enthusiastic and dynamic, making the after-lunch hour very easy to handle. Top. Slides here: https://speakerdeck.com/u/hdragomir/p/fast-mobile-uis
Tomás Corral (@amischol) from Softonic gave the poorest talk on Friday and maybe of the whole event. Language was by far his main struggle: he knows his stuff, but English wasn’t helping. The subject was not very well chosen, either. Minimization of code is important, but I don’t think it gives for more than a 10-minute talk, and the examples were all about semicolons, using variables and concatenating strings. I would have preferred him to talk about javascript architecture, since he knows how to write that (he is the author of hydra.js). Slides here: http://www.slideshare.net/amischol/less-is-more-13574571
Christian Kvalheim (@christkv) from 10gen talked about a Pacman game clone written in Node.js. The presentation was fine, going from the evolution of the original game to the challenges he faced while cloning it for the web and the techniques used (Node.js, Akihabara, SoundJS, websockets and MongoDB), plus what is on the horizon with WebGL and so on. He also demo’ed the game. I am not a big fan of gaming, but the internals were interesting. Slides here: https://speakerdeck.com/u/christkv/p/mongoman-a-nodejs-powered-pacman-clone
The Friday night dinner and party were excellent. Sala Garibaldi was a wise choice, in the center of town, unlike last year's Rails conference, held in Florida Park itself (far from anything else). As usual, dinner was standing, cocktail style, good for talking with anyone. And the Jamón guy was there again. Good. A bit expensive, but I assume you pay for the networking more than the food.
SATURDAY
Ramón Corominas (@ramoncorominas) is a member of the W3C WCAG Working Group. He gave a talk on WAI-ARIA for creating accessible sites. I personally don’t buy the subject, but he also gave terrible examples of javascript being used to connect WAI-ARIA to form inputs. Either he did so to show (doubtfully) simpler code, or he does not know how event handlers and event bubbling work. A pity, since the other stuff seems interesting, but these weighed too much for me. Slides and code: http://ramoncorominas.com/spainjs/#spainjs
Alex MacCaw (@maccman) from Stripe, creator of spine.js, gave an excellent talk on perceived speed and how to improve it for a better user experience, giving real-world examples of how other sites and apps pull it off: twitter, facebook, instagram or a cool blog system named https://svbtle.com/ The three steps to improve perceived speed with an async UI are: render on the client (batch DOM updates), store state and data on the client, and only then communicate with the server asynchronously. Also a nice trick in the questions about unit testing the UI and the REST API separately. Slides here: https://speakerdeck.com/u/maccman/p/asyncronous-web-interfaces
Keith Norman (@keithnorm) was very inspiring. He talked about the process he started with his colleagues at Groupon to share code between Node.js and the browser; that is, writing javascript once and using it both on the client and server sides. He showed us how they were using browserify to convert coffee/backbone code on the server so node+express could run it. Very good, maybe the man of the day, but that one is tight: Saturday's talks were frankly top level, and I might call it a draw between him, Nuria Ruiz, Alex MacCaw and Javier Arévalo. Slides here: http://keithnorm.com/spainjs-pipedream/#/ and a demo https://github.com/keithnorm/SpainJS-Pipedream-Demo
Nuria Ruiz (@pantojacoder) explained the process started at Tuenti to improve performance by changing how they load javascript and templates. The thumbs up for the talk go not only to the solutions they arrived at (which improved the speed of the next Tuenti version by an incredible 500%) but also to the explanations of the process and how they discarded other options to finally choose YUI lazy loading with Handlebars templating. Oh, and keep it on a post-it next to you: measure everything. I had the honor of chatting with her about oceanography and research vessels, remembering old times when we (separately) spent one month on board doing science stuff. Old times, go beat that! :-) Slides here: https://speakerdeck.com/u/nuria_ruiz/p/client_side_rendering_is_not_so_easy
Brian McKenna (@puffnfresh) presented Roy, a javascript compiler. Since I am not a big fan of compilers I took that time to do some networking, so I can not give my opinion on this one, other than my admiration goes for those who innovate, and that seems to be the case. Slides here: http://brianmckenna.org/files/presentations/spainjs-roy/
Javier Arévalo (@TheJare) kicked the youngsters' arses. If you ever thought “viejunos” (older people) should go home and let the 20-year-olds do the job, think twice. This guy knows his stuff. He went deep into the use of different strategies, HTML5 APIs, WebGL, audio, events, performance… A nice talk indeed; more than one attendee paid the ticket just to see him in action, and they were not disappointed. Slides here: http://www.iguanademos.com/Jare/docs/html5/SpainJS2012/
Jonathan Azoff (@azoff) presented tacion, an app that pushes synced content to an audience using websockets and a jQuery Mobile presentation. Very cool idea. He went through some of the code. The nice part is it being open source and using pusher.com, which I did not know about yet. On the slightly negative side, it needs improving for the scenario where the driver of the talk wants to let users browse the content freely but still protect certain pages. Powerpoint is dead. Slides here: http://azoff.github.com/tacion.js/examples/spain.js/#slide=0&step=0
Final words
Overall the event was very impressive: well organized, nice talks, nice people… I would probably force lightning talks to be more technical and less “this is my product, buy it”, but the event is good. Florida Park seems a good place to me; having lunch on the grass is a win. The worst part of the event, in my humble opinion, was this: once there, we found out that people at home were following the event via streaming. Besides being disrespectful to the attendees, it also swallowed a big chunk of the bandwidth. I suggest next year's organizers record every talk on video and upload them online two weeks after the event, edited with sponsors’ logos, improved sound, and so on. Everyone would win. Oh, and please choose another ISP; Movistar clearly is not up to the standards, or they simply laughed at you, and I hope you will not pay the bill. Thank you anyway for such a great event, one of the best I’ve attended, see you next year!
zigotica · 12 years
Link
(I think I will end up sharing all of MrMamba's posts…)
It is always interesting to be able to get as much data as possible about our Linux machine without having to open up its “guts” or reboot and look in the BIOS.
Let's look at some ways to get all the hardware information of our machine.
lshw
for example, we can find out data such as the...
zigotica · 12 years
Link
A while ago a good friend told me he needed a regular backup of his shop's MySQL database, delivered by email. Digging around the internet, I found a fairly elegant way to do it:
#!/bin/bash
fecha=`date +"%Y-%b-%d"`
mysqldump --user root --password=12345...