A list of experiments with technology and other things that pique my interest.
What is sys.stdout.flush()??
For Python developers out there, one day you might come across a script on GitHub or some other repository where the developer used the call sys.stdout.flush(). You might think to yourself: what the hell does this do? So you look it up and you get some jargon about flushing the internal buffer... To understand the call we first need to understand stdout. On Unix systems, stdout (standard output) is the stream your program writes its output to; by default it goes to your terminal, but it can be redirected elsewhere, such as a log file. This is the stream your print statements use. Okay, so now what does sys.stdout.flush() do?
Say you have a program running that generates a list of integers, and these integers are going to be printed to the terminal on one line. We iterate through the list with a print statement and voila, we print out the whole line. What is really happening is that each integer is placed in an internal buffer (to simplify, it is stored in the stream) and only later written out to the terminal. To help understand, let's run this in the terminal:

$ python
>>> import sys, time
>>> for i in range(12):
...     print i,
...     time.sleep(.1)
...
You will see the whole sequence of numbers appear at once, only after the loop finishes, because the output sat in the buffer.
Now try it with sys.stdout.flush():

>>> import sys, time
>>> for i in range(12):
...     print i,
...     sys.stdout.flush()
...     time.sleep(.1)
...
See the difference? This time each number appears as it is printed, because we forced the data in the internal buffer to be written out on every iteration.
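As an aside, these examples are Python 2. If you are on Python 3 you can get the same effect without calling sys.stdout.flush() yourself, since print() takes a flush argument. A minimal sketch:

import time

for i in range(12):
    # flush=True forces the buffer to be written out on every print
    print(i, end=' ', flush=True)
    time.sleep(.1)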
Another stream available for use is stderr - check it out to see how it works. Feel free to comment below. Thanks for reading!
2-factor authentication for IoT.
Recently I took part in the DeveloperWeek Hackathon in San Francisco. I had wanted to take part in a hackathon for a long time, and I was finally going to be in town and had someone to work on a project with. We decided to build a 2-factor authentication system for the Internet of Things because we had both seen the recent news story about a dollhouse ordered through Amazon Alexa. The goal was to build an Alexa Skill and a Python module that would abstract all of the 2-factor authentication away from the skill's code.

SMS Messaging

For the Python module, we used the Flowroute Messaging SDK to send the verification texts for our 2-factor authentication system. It provides nice helpers and controllers to easily manage your service. We set up a developer account, picked a number for our service, and we were ready to go. It didn't take much work on the Flowroute side to get up and going, and if you are interested there are plenty of tutorials available online for using their service. Once we had the Flowroute messaging working, we added the 2-factor logic of generating a code and exposing it as a method. We could then add this to the Alexa Skill we were going to build.

Alexa Skill

Regarding the Alexa Skill, we wanted to create a fairly simple skill to show the use case we intended for 2-factor authentication. We built a small information portal that would generate sales data for you. To manage our Alexa skill we used Flask-Ask, a Flask extension that helps you easily develop an Alexa skill in Python. We followed this tutorial to help get ourselves started and then customized our skill. The code for our skill is available on GitHub here.
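To give a flavor of the 2-factor logic described above, here is a minimal sketch of generating and checking a one-time code. The send_sms helper standing in for the Flowroute SDK call and the six-digit format are assumptions for illustration, not the actual hackathon code:

import random

def generate_code():
    # generate a six-digit one-time code (assumed format)
    return '{:06d}'.format(random.randint(0, 999999))

def start_verification(phone_number, send_sms):
    # send_sms is a hypothetical helper wrapping the Flowroute Messaging SDK
    code = generate_code()
    send_sms(phone_number, 'Your verification code is ' + code)
    return code

def check_code(expected, supplied):
    # the skill compares the code the user reads back with the one we sent
    return expected == supplied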
Result
We made a GitHub page and hosted our code in a repository for others to use or reference. https://bypass-auth2.github.io/
https://github.com/bypass-auth2/bypass_auth
Salesforce and Quickbooks Integration
I recently started a business and the premise is that we can integrate everything and automate everything. This is in fact the goal and the driving force behind the business. So to start off, why not take on a project that requires integrating an application with two monoliths, QuickBooks and Salesforce. The client wants a customer portal that shows QuickBooks data and Salesforce data for the customer that is signed in. I played around with two potential designs:

1. Download all the client data and seed a database that we control and maintain. A daemon (or similar) polls QuickBooks and Salesforce for updates to the database.

2. When the user logs in, make an API call to QuickBooks and Salesforce using the user data we store in a database. The results of the query are then stored as JSON with the account name as the key and a timestamp, sort of like an extended cache. After a certain amount of time passes, the stored query is considered expired and a new query is made the next time the user logs in.
We investigated both of these options and spoke more with the client about their expectations, and what we found was that:

1. the QuickBooks API cannot be used without getting the application certified and registered with them,
2. there will not be a lot of traffic or many updates to the data in Salesforce or QuickBooks, and
3. the client expects a tight turnaround (under 2 weeks).
In the end, the decision was made to go with option 2. There were a lot of hurdles; here is a rundown of how we made it work with the following technologies: Google App Engine (more on this here), Python Flask, Firebase (more on this here), AWS S3, Excel spreadsheets, and the Salesforce API + SOQL.

Python and Flask are used to manage the backend of the service. My previous post discusses the use of AJAX and jQuery to interact with the backend. After a user is successfully authenticated using Firebase (blog post here), the backend has information about the user's email, name and their token. This is stored in a claims object, but the claims object does not have information about the user's Salesforce id, QuickBooks id, region, etc., because this is information the user does not know and will not sign up with. We had to make the connection between the two ourselves. So the first hurdle is managing the user data needed for QuickBooks and Salesforce. We created a Google Datastore model of the user's information and use the user's email to query it:

# [START User]
class User_ext(ndb.Model):
    # NDB model class for a user, which contains their information
    sf_id = ndb.StringProperty()
    qb_company_name = ndb.StringProperty()
    qb_emails = ndb.StringProperty()
    account_owner = ndb.StringProperty()
    account_email = ndb.StringProperty()
    region = ndb.StringProperty()
# [END User]

Now when a user logs in we can get their sf_id and qb_company_name, which we will use to fetch their information. This information can easily be added or updated using an admin backend like flask-admin, or added programmatically by seeding the database. Now that we have the user information, we can make our queries to Salesforce and QuickBooks.
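Before moving on to those queries, here is a minimal sketch of how the lookup by email can work against the User_ext model above; the 'email' claims key is an assumption about the Firebase claims object, not the production code:

def get_user_ext(claims):
    # look up the extended user record (the User_ext model above) by email
    email = claims.get('email')  # assumed key in the Firebase claims
    return User_ext.query(User_ext.account_email == email).get()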
Salesforce
Salesforce ended up being the easier of the two. Using a Python module called simple-salesforce, I could easily access the Salesforce instance and make queries. There are some particulars of SOQL that make it more difficult to use than vanilla SQL. I am not an expert here, but with the developer tools from Salesforce and some nice walkthroughs online, it is fairly easy to get back what you want. I ended up querying for the opportunity line items for a particular client using their Salesforce id. Here is a very straightforward walkthrough of getting started with simple-salesforce from Salesforce themselves. Once the data was pulled from Salesforce, it was stored in an object that would be sent back to the front end with the other user information. Here is an example code snippet of a query:

sf = Salesforce(username='username', password='password', security_token='this-is-my-token')

# first query for opportunity line items by account id
op_id_payload = sf.query_all("SELECT id, account.name from opportunity where account.id = " + "'" + customer + "'")['records']

# get list of opportunity ids
op_ids = []
for op_id in op_id_payload:
    op_ids.append(op_id['Id'])

# another query for the product information based on opId
data = []
for opId in op_ids:
    op_line_item = sf.query_all("SELECT Id, Name, (Select Id, Quantity, TotalPrice, PricebookEntry.Product2.ProductCode, PricebookEntry.Product2.Name From OpportunityLineItems) From Opportunity Where Id = " + "'" + opId + "'")
Quickbooks
QuickBooks quickly became another story... After playing around with a few different Python modules and learning how to query QuickBooks, it became apparent that we couldn't actually go this route, because releasing a production app for QuickBooks would take too long and too many resources. We had to find another way. The solution: have the client upload two reports generated from QuickBooks every week or so. The reports would be parsed based on the customer information and would essentially act as a datastore. The questions then became: what was the easiest and fastest way to let our users upload the required documents, and how should they be parsed, given all the mismatched information between inventory on hand, company names and products on invoices? A fairly robust method was needed to match this information.
Upload
S3 to the rescue. Using S3 and flask-admin, we could create an admin page that lets the users upload the documents to an S3 bucket. This is easy to set up with tutorials on S3 and a flask-admin plugin.
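For a sense of what the upload side looks like, here is a minimal sketch using boto3 to push an uploaded file into the bucket. The bucket name and key are placeholders, and this is an illustration rather than the flask-admin plugin's actual code:

import boto3

def upload_report(file_obj, filename):
    # push the uploaded report into the S3 bucket (placeholder bucket name)
    s3 = boto3.client('s3')
    s3.upload_fileobj(file_obj, '<bucket-name>', filename)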
Download
Based on the workflow we decided on, when a user logs in the request first goes to the database; if nothing exists or the record is too old, the information is pulled from the latest uploaded files (transactions and inventory) with the correct name in the S3 bucket.
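As a rough sketch of that freshness check (the one-day TTL and the helper names are assumptions, not the production values):

import time

CACHE_TTL = 60 * 60 * 24  # assumed: stored records older than a day are stale

def get_report_data(record, fetch_from_s3, customer):
    # record is the stored query result with a 'timestamp' field, or None
    if record is None or time.time() - record['timestamp'] > CACHE_TTL:
        # nothing cached or too old: re-read the latest uploaded files
        return fetch_from_s3(customer)
    return record['data']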
To read the uploaded files, our Flask backend requests the S3 resource and then reads the Excel document. Below is an example of reading an Excel file from an S3 bucket with xlrd and urllib:
import urllib
import xlrd

def read_excel(filename, header_row=0, start_row=1, start_col=0):
    # Fetch the workbook from the S3 bucket
    opener = urllib.URLopener()
    myurl = "https://s3-<region>.amazonaws.com/<bucket-name>/" + filename
    myfile = opener.open(myurl).read()
    # Open the workbook from the downloaded bytes
    xl_workbook = xlrd.open_workbook(file_contents=myfile)
    # List sheet names, and pull the first sheet by name
    sheet_names = xl_workbook.sheet_names()
    sheet = xl_workbook.sheet_by_name(sheet_names[0])
    # Pull the headers from the header row by index
    headers = [sheet.cell(header_row, col_index).value for col_index in xrange(sheet.ncols)]
    # Make a dictionary for each data row, keyed by the headers
    dict_list = []
    for row_index in xrange(start_row, sheet.nrows):
        d = {headers[col_index]: sheet.cell(row_index, col_index).value
             for col_index in xrange(sheet.ncols)}
        dict_list.append(d)
    return dict_list
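For example, assuming the transactions report was uploaded as transactions.xlsx (a hypothetical filename), the rows come back as a list of dicts keyed by the header row:

rows = read_excel('transactions.xlsx')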
Parsing
Pandas, numpy and Google App Engine. I usually use pandas and numpy to manipulate data in CSV or Excel formats. It's quick and easy and saves a lot of time. Unfortunately, Google App Engine only allows numpy==1.6.1, which does not play well with pandas, which requires numpy>=1.7.1. So this meant I had to do it the long way, which is probably the better method anyway since it has fewer dependencies. Part of this is in the example listed above.
The next hurdle was matching the names. There was one variation of the company name in our database of customers, and many variations of the name in the transactions spreadsheet as well as the inventory spreadsheet. I used a combination of regex, fuzzy string matching (thank you fuzzywuzzy), tuning and unambiguous name-portion matching to create a ranking algorithm that decides whether names are a "match" or not. The matches are then returned as objects which are stored in the database of records with a timestamp and then returned to the front end with other user information.
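To give a flavor of the fuzzy part of that ranking, here is a minimal sketch using fuzzywuzzy to score a spreadsheet name against the canonical customer name. The threshold of 85 is an assumed tuning value, and the real algorithm also layered regex and name-portion rules on top of this:

from fuzzywuzzy import fuzz

def is_probable_match(canonical_name, candidate_name, threshold=85):
    # token_set_ratio is tolerant of word order and extra tokens in the name
    score = fuzz.token_set_ratio(canonical_name.lower(), candidate_name.lower())
    return score >= threshold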
Conclusion
After working through the various hurdles, the conclusion is that the QuickBooks API is a forest that is very difficult to find your way through, and their method of sandboxing the API does not make for easy integrations for small businesses. Although there are some integrations available on Zapier etc., they do not have search functionality and are very limited. Salesforce's API is much easier to work with, and their developer tools allow for a much easier integration.
Using Google's Firebase for Authentication
Over the past 2 weeks I have been working on an application for a client. The application will allow the client’s customers to log in to a portal and see all of their forward contracts, payments, invoices and the current inventory on hand.
The customer data comes from two separate sources, Salesforce and QuickBooks, which I write about here. The client had a tight timeframe, so I didn't want to waste any time on authentication, which is not my expertise, and I also didn't want to open up any security holes. So I decided to try out one of the authentication services out there. I chose Firebase because I was already planning on using Google App Engine and it seemed like a nice fit, plus the easy extension to an app seemed neat to me as well. Firebase uses JSON Web Tokens, a JSON-based open standard (RFC 7519), which is essentially a protocol for generating, signing and verifying web tokens. This system works well when you have an application with a separate identity provider and service provider, or perhaps many services using one identity provider. I won't go into all the details here.
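As a toy illustration of the JWT idea (not the Firebase flow itself), here is a minimal sketch of signing and verifying a token with the PyJWT library; the secret and the claims payload are made up:

import jwt

SECRET = 'not-a-real-secret'  # made-up signing key

# the identity provider signs a claims payload into a compact token
token = jwt.encode({'email': 'user@example.com'}, SECRET, algorithm='HS256')

# a service verifies the signature and gets the claims back
claims = jwt.decode(token, SECRET, algorithms=['HS256'])
print(claims['email'])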
The Good
Firebase comes with a very easy Quickstart that gets you going with a few lines of code and some minimal configuration. This is awesome because you essentially have a secure authentication service up and running in a matter of minutes.
The Bad
Customization is not very easy, either for user pools or for the UI. Most of the UI is tucked away and requires a lot of digging to get at. If you are making apps, this is a great authentication service. I enjoyed the ease of use, but I will most likely try to roll my own JWT setup in the future so I can control more of the process, and because I feel the business web app use case is not what Firebase was created for. I may try AWS Cognito instead.
AJAX and jQuery - Part 2
I have spent some more time working with AJAX and jQuery since my last post. In particular, I have been working on an app that uses Python with the very customizable Flask framework, backed by Google's Firebase backend-as-a-service. The application is a customer login portal for a company that sells commodities via forward contracts. The app displays a user's dashboard page and then lets them view information about their forward contracts and the current inventory situation. The information is served from the backend via an AJAX request to a Python Flask route.
So how does this work? First off, my application is organized as follows, where main.py contains the routes and the logic to retrieve data from the database, and main.js and index.html provide the front end logic:

root
--backend
----main.py
----other stuff
--frontend
----main.js
----index.html
The flow goes something like this. When the user visits www.example.com they are redirected to an authorization service (Firebase), and after authentication they are permitted to view their dashboard at index.html. The dashboard contains some event notifications, ads, and the option for the user to load information about their contracts and inventory. The user can click on the inventory button or the contracts button, and this is where we get into the AJAX. The following jQuery code makes the AJAX request on a button click:
$('#button').click(function(event) {
  event.preventDefault();

  /* request fwd data from backend, passing auth token */
  $.ajax(backendHostUrl + '/fwds', {
    headers: {'Authorization': 'Bearer ' + userIdToken}
  })...
});
So what happens here is that on the click of the HTML element with id="button", an AJAX request is sent to <backendHostUrl>/fwds with the Authorization header set to the token. What happens next is we go to the backend, which in development may be set to localhost and in production to some app.com. This brings us to a Python Flask route for '/fwds' that looks like this (pseudo code):
# [START list_fwds]
@app.route('/fwds', methods=['GET'])
def list_fwds():
    # Verify Firebase auth.
    id_token = request.headers['Authorization'].split(' ').pop()
    claims = google.oauth2.id_token.verify_firebase_token(
        id_token, HTTP_REQUEST)
    if not claims:
        return 'Unauthorized', 401

    # get customer id from the database
    customer = query_db_for_customer(claims)

    # request info from external api
    fwds = get_fwds(customer)

    return jsonify(fwds)
# [END list_fwds]
What is happening here is that our AJAX GET request is handled by this route thanks to Flask's app.route decorator. We then verify the token to prevent malicious GET requests and, after authentication, get some customer information and use it to query an API. An external function handles the API request and returns us a list of dictionaries, which we return as JSON using Flask's jsonify method. At this point, we receive some data back from our AJAX request. Back in the frontend, we can handle that data and print it to the console or display it as HTML. Here is an example:

$.ajax(backendHostUrl + '/fwds', {
  headers: {'Authorization': 'Bearer ' + userIdToken}
}).then(function(data) {
  if (console && console.log) {
    console.log(data);
  }
});
Now we have our object displayed to the console, so we can verify that our request is working :)
AJAX and jQuery - Part 1
Being rather new to web development, it was high time I learned about AJAX and JavaScript so I could understand and modify the JS scripts I was dropping into my websites. What is AJAX?
AJAX stands for Asynchronous JavaScript And XML. Most often today it is used to retrieve JSON (JavaScript Object Notation), which has become a common format for transferring data. What does AJAX mean to me? It is a way to use JavaScript to retrieve information and display it without making a full HTTP request and reloading the page. The event used to trigger the JavaScript function can be as simple as moving the mouse down a page; there is no clicking needed! This makes for a better user experience IMHO. According to www.w3schools.com, the process that drives AJAX is the following:

1. An event occurs in a web page (the page is loaded, a button is clicked)
2. An XMLHttpRequest object is created by JavaScript
3. The XMLHttpRequest object sends a request to a web server
4. The server processes the request
5. The server sends a response back to the web page
6. The response is read by JavaScript
7. Proper action (like a page update) is performed by JavaScript

One limitation of AJAX is the same-origin policy, a security feature that limits how a script or document loaded from one origin can interact with a resource from another origin. This can cause errors and limitations and is often overcome with CORS. So how do we use AJAX? AJAX can be used with vanilla JavaScript, but is more commonly used with jQuery. But what is jQuery?
jQuery is a high-level JavaScript library that lets you interact with a website's DOM (Document Object Model), which is an application programming interface (API) for valid HTML and well-formed XML documents (source: w3c). jQuery allows you to easily use AJAX and JSONP, bind to element-based events, dynamically update CSS, attach data to elements, iterate over data, search a tree structure, and extend objects, to name a few. More functions can be found here.
How do we use jQuery?
In order to use jQuery we have to link our HTML page to the library using a <script src="something.js"></script> tag that points at the source file. Then one must choose an event to trigger the function by specifying when to run the code. This can be done with:

window.onload = function() {
    alert("welcome");
};

or

$( document ).ready(function() {
    // Your code here.
});

The first method waits for all images etc. to load, and the second runs as soon as the document is ready to be manipulated.
One note is that the jQuery library exposes its methods and properties via two properties of the window object called jQuery and $. $ is just an alias for jQuery. (source: jQuery) For more information on AJAX and jQuery, try the API Docs and Mozilla.
init
This is a blog about the many experiments of my life.
An experiment, in my opinion, is any opportunity to learn something new. Often I pair my experiments with making or building something and I have found that writing about it is a good way to recall what you learnt.