harrysmemo
Harry's Memo
216 posts
Cloud - Mobile - Windows - HoloLens Apps
harrysmemo · 8 years ago
Text
Doctor Who Marathon
January 8th, 2017: Doctor Who or Star Trek?
I watched Doctor Who the whole weekend. How many episodes of the same show can you watch in one go? :)
I started watching "Modern Doctor Who" several years ago when it was first showing on Amazon Instant Video (I think). It was a very interesting show, reminiscent of Star Trek in many ways. The Doctor, in case you don't know, is a time traveler who goes through adventures with traveling companions. When I started watching, The Doctor was played by Christopher Eccleston and then David Tennant, and the traveling companions were Rose Tyler, Martha Jones, and Donna Noble. (Yes, you do remember all these names. ;))
Time travel has always been possible. In dreams. -Madame Vastra (Doctor Who)
This weekend, I watched the more recent episodes with Matt Smith and Peter Capaldi as The Doctor. Traveling companions were Amy Pond and Clara Oswald. I think I'm now in the middle of series 9.
Doctor Who is science fiction. I see a lot of similarities with another favorite show of mine, Star Trek: The Next Generation. Doctor Who has a few twists that originate from time travel, like paradoxes and parallel universes, etc. But, at the end of the day, both shows are about encounters with different cultures and civilizations, and with different kinds of living beings, in different corners of spacetime.
Both shows feature different types of fictional life forms and different cultures (based on our, or the writers', imagination), which we find rather interesting (otherwise, we wouldn't watch them).
But, the most interesting thing about these shows is that they are not really about aliens. They are really about humanity. Stories often unfold on remote planets with non-human creatures. Despite these fictional settings, the essence of the stories is really about us, the people who inhabit the Earth right here.
Star Trek: The Next Generation was not so much about space travel as a study in humanity. For example, throughout the series, the question was posed: what does it mean to be human? Is Data human? Can an AI life form like Data "feel"? The episodes featuring the Borg were rather interesting as well, asking questions like individuality vs. collectivism. Encounters with different civilizations always provided an opportunity for us (or, the crew of the Starship Enterprise) to reflect on ourselves more than anything else. (BTW, will there be another Star Trek? There were a few different spinoffs, but none were as good as The Next Generation. I think it's about time we had a "Modern Star Trek".)
Doctor Who is rather similar in that regard, albeit to a somewhat lesser extent. My brain is kind of mushy now ;) after staring at the TV screen non-stop for hours, but I always enjoy shows like this that make you think. To the point where your eyes start to hurt. :)
Don't blink!
(Incidentally, I just remembered an episode from Portlandia, where the two main characters binge watch Battlestar Galactica for days, without eating, without sleeping, and even without going to work. ;))
-- Learning Lua
January 7th, 2017: One more programming language?
Lua is an interesting language.
It is a dynamic scripting language, similar to Python or Ruby, but designed primarily to be embedded. That is, you don't write and run a "main()" program; Lua code runs within a "host" (usually a C program; even the standalone Lua interpreter is itself just a small C host).
A lot of why I do something is just the novelty of the experience. -Edward Norton
Lua is a very small language with a very small runtime. It is as if it were created just to interface with the underlying C runtime: you can call C functions easily, and that's about it. All other features seem to be there just to support this single purpose.
Of course, I am exaggerating a bit. Lua is a very interesting language, with constructs like "tables" and "metatables", etc. (Nearly everything in Lua is a table.) It also supports features like coroutines, which are not generally found in other "modern languages". Lua has been around for some time (in fact, it was created over 20 years ago), but it is only recently that it has been getting popular as an embedded scripting language among game developers (e.g., via third-party plugins for Unity3D or UE4). (You can also use Lua with game frameworks like Love2D. You can even do Web development using frameworks like Sailor.)
I took a second look at Lua partly because of my recent interest in coroutines (they did not make it into C++17, as it turns out), and in game (AR/VR/MR) development, among other things.
If you are interested, go to the Lua home page and download the pre-built binaries for your platform. Here's my sample code illustrating the use of coroutines. You can use the standalone Lua interpreter to easily test Lua scripts, including this sample code.
--[[ Sample program: "Is Lua good or bad?" ]]

-- Number of repetitions.
limit = 10

-- Initialize the RNG.
math.randomseed(os.time())

co1 = coroutine.create(function()
  for i = 1, limit, 1 do
    print("- Lua is good")
    local r = math.random(1, limit)
    if i >= r then coroutine.yield() end
  end
end)

co2 = coroutine.create(function()
  for i = 1, limit, 1 do
    print("* Lua is bad")
    local r = math.random(1, limit)
    if i >= r then coroutine.yield() end
  end
end)

repeat
  -- print("a: " .. coroutine.status(co1))
  coroutine.resume(co1)
  -- print("b: " .. coroutine.status(co2))
  coroutine.resume(co2)
until coroutine.status(co1) == 'dead' and coroutine.status(co2) == 'dead'
100 Days of “Sobriety”
January 3rd, 2017: One day at a time...
1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99, ...
100 days!
Every day I feel is a blessing from God. And I consider it a new beginning. Yeah, everything is beautiful. -Prince
Celebrating a small victory today...
To another 100 days!
Unreal Engine: Choosing between C++ and Blueprint
January 2nd, 2017 (Day 99): Should I start game development with C++ API or Blueprint visual scripting for Unreal Engine 4?
I've been looking into Epic's Unreal Engine for the last few weeks as an alternative game engine to Unity 3D.
There are a lot of similarities between UE4 and Unity. Obviously, they are both game engines. You can use either of them to develop games, or VR and AR apps these days. Hence, they provide essentially identical functionality. There are, however, some differences as well, mostly minor.
The biggest difference between the two game engines is their programming models.
(BTW, Unreal Engine 4 is free to use. When you publish apps/games, and if you start making money above a certain threshold, then you have to pay 5% of the revenue to Epic as a royalty. This pricing model is very different from Unity's.)
Reality is wrong. Dreams are for real. -Tupac Shakur
In Unity, you create a game using the UI (Unity Editor), and you script various game objects. The scripts can be written in a few different programming languages like JavaScript, but C# is the most popular and widely used one.
In Unreal Engine, the general game creation process is the same. You use the UI (Unreal Editor) to create a game. But, the scripting model is slightly different. You can use C++ to create essential game objects (which inherit from Unreal base classes like AActor or UActorComponent, etc.), and you can then use these custom objects in Unreal Editor. In Unreal, actual "scripting" is all done via Blueprint visual scripting. The phrase "visual scripting" is somewhat ironic since it's not really scripting, at least not in the traditional sense. You manipulate various objects and control their relationships, etc., through a GUI without having to write a single line of code/script.
When I started looking into Unreal Engine 4 (UE4) for the first time, I did not know this. I thought C++ and Blueprint were alternative, and (almost) mutually exclusive, ways of programming in UE4. There are a lot of articles on the Web comparing these two different ways of programming in UE4. (E.g., "C++ vs Blueprint in Unreal", etc.) Even the Unreal official docs and tutorials do not make this clear. But, they are all wrong, or at least rather misleading, in my view. (The answer to the title of this post, "how to choose between C++ and Blueprint", is therefore "you do not choose". You use both to create a game.)
The Unreal game creation process starts from creating base objects with C++ and using them in Blueprint visual scripting. In certain projects, you may end up doing more C++ programming, and vice versa, but that's the basic model. In some projects, you may forgo the C++ part completely and create an entire game using only Blueprint and Unreal's built-in base classes (I presume many people do), but the idea is still the same. (On the flip side, however, I don't think you can create an entire game using C++ only. Blueprint is an essential part of game creation in Unreal.) It's just a matter of choosing how much C++ you need, or want, to use in a given game project.
Note that this game, or app, development model of Unreal is rather different from that of Unity. Although it's probably possible in Unity as well (using plugins and whatnot), you don't usually create derived classes of GameObject (at least based on my limited experience with Unity). You generally change the behavior of a GameObject by adding "script components" (typically in C#). I think Unity probably implements this internally through composition or delegation. In Unreal Engine, the programming model dictates that you use inheritance: you create derived, or more specialized, AActors or UActorComponents (with custom event handling, etc.) and use them in visual scripting. (Both game engines seem to rely heavily on reflection to support their programming models or frameworks.)
Although Unity has a huge lead in market share, especially among indie game developers, and it's the safer bet when you have to choose between these two engines, I think it's still worthwhile to take a serious look at Unreal Engine 4. UE4 can provide more benefits than Unity3D (A) if you are a seasoned C++ programmer, on one end of the spectrum, or (B) if you have little or no coding experience, on the other.
Two Thousand and Seventeen
January 1st (Day 98): New start. New beginning.
The sun rose, on the first day of the year 2017.
Clouds come floating into my life, no longer to carry rain or usher storm, but to add color to my sunset sky. -Rabindranath Tagore
2016 was a mixed blessing for me. Among the things that had been bad for me, besides all the personal problems I had, was that... I lost my focus.
Sometimes, I think about the fact that people live for about 100 years, give or take. We don't live 1000 years. Then again, we do not generally die only after 10 years. It's not too long, but it's not too short either.
We come to this world from nothing, and return to nothing. Some people live good lives, and some people don't. Some people leave something behind, and some people don't. Some people are remembered, and some people aren't.
When I was young, that 100 years seemed so long. Or, at least, more than "enough". Now, I feel like it's too short. I feel like I am already running out of time.... After all, how many "second chances" can you get?
I think I will have to make some changes in 2017.
Down But Not Out
December 31st, 2016 (Day 97): One more year...
It's a cliche, "down but not out". But, I feel like that's me.
After a long and arduous stretch of non-stop tries and failures over the last several years, 2016 felt like a bit of a break. Not an easy year, by any measure, but I feel like I'm not completely "down and out".
I still have hope.
Promise yourself: "Never, never, never give up." -Winston Churchill
Although the "year", and the calendar in general, is a man-made invention (based on the period of the earth revolving around the Sun), particular days of a year always seem to have certain special meanings to us. Birthday, Christmas day, New Year's Day, ...
Is 2017 going to be any different from 2016? Or, from the years before that?
I don't know. But, I sure hope so.
I sometimes feel like I'm a person in somebody else's dream in the movie Inception. I cannot remember what they were called, but they are trained professionals who go into other people's dreams and try to achieve certain objectives, often acting against the interest of the dreamer (the "owner" of the dream?). The people, or bystanders, in that dream are generally neutral to these intruders. But, they are always on the side of the dreamer, and they become hostile once they realize someone is an intruder. Such a mind-boggling fiction. A dream inside a dream inside a dream? :) Anyways....
I sometimes feel like I am not in my own dream. It feels like I am in somebody else's dream.....
Bye bye, 2016!
LED Pollution
December 30th, 2016 (Day 96): Random thoughts, with only a few hours left in 2016.
The LED was invented decades ago. In fact, LEDs have been in commercial use for over 50 years. LEDs, short for "light-emitting diodes", are used for all sorts of purposes. Recently, I see them becoming very popular, say, as a replacement for light bulbs.
Black-and-white always looks modern, whatever that word means. -Karl Lagerfeld
Initially, LEDs were primarily used as illuminated indicators or symbols, e.g., in digital watch displays, etc., rather than as a light source.
I don't know if people remember, but traffic lights actually used to use light bulbs. It's only rather recently that they all started using LED-based lights. LEDs, being solid-state components, last longer than, say, incandescent light bulbs, and they consume less energy. I remember reading a story about an inventor who made this idea a reality. Obviously, it did not happen automatically. There were people who made it happen, through sweat and persistence. An interesting thing about LED traffic lights is that the lights are not constantly on. They are intermittent, like blinking. The human eye just cannot see it. But, as you can imagine, this saves even more energy compared to other light sources.
I recently started noticing that LEDs are becoming even more popular. If you go to a hardware store to buy light bulbs, you can hardly even find incandescent bulbs these days. You can see some halogen bulbs, which were popular for some years. But, most light bulbs are now LED-based. It feels like it happened almost overnight. I am not sure what happened. LED lights were always more energy efficient than the alternatives. But, the current trend is rather surprising.
It looks like LED components are becoming ever cheaper. This probably has something to do with the ever more popular use of LEDs, like their use in HDTV screens. The price goes down as they are mass-produced in greater numbers. As the price goes down, they find more uses, which in turn drives the price down further. That's the wonder of the modern economy that started with the industrial revolution.
Anyways, one thing I don't like about LEDs is that they are everywhere. How do you know if your TV is on or off (not just sleeping)? There is an LED indicator. How do you know if your coffee maker is on? There is an LED light. How do you know if your heater is on in the dark? There is a small LED indicator. What temperature is it set to? There is an LED display. Is your wireless modem currently functioning well? There is a bunch of LED indicators for that. Your cell phone charger? Of course, it has an LED indicator. Even some batteries have built-in LEDs to indicate their current charge level. You name it. Better yet, try to think of one electronic device these days that does not use an LED or two.
Urghhhh...
Some lights, you cannot even turn them off....
Urghhh....
LEDs are everywhere.
“Seventeen”
December 29th, 2016 (Day 95): Is it really magic?
David Blaine is one of a kind. He is a magician. He is an illusionist. He is a street performer. He is.... I don't know what he is.
An interesting David Blaine special premiered on Netflix a couple of months ago, titled "David Blaine - Real or Magic" (originally aired a few years ago on ABC). It was OK, nothing spectacular, but there were a couple of interesting things that caught my eye.
There's no such thing as "perfect murder". That's an illusion. -Columbo
First, there was an interesting story about a guy who uses his stomach to carry water. He drinks water and stores it in his stomach. When needed, he spits the water out. I don't know if that is a true story, or whether such a thing is really possible. But, I thought it was incredible if it was real.
According to the show, David Blaine tracks him down somewhere in the world and learns his trick. He uses this trick to start a fire and then puts it out, by spitting out a flammable liquid first and then water. (He drinks both liquids, but, you see, the flammable liquid, presumably lighter than water, floats on top of the water in his stomach, and hence he spits the liquid out before the water. At least, that was the explanation. Or, at least, that is how I remember it now.)
Obviously, it's a "magic", an illusion. Hence, we have to take it with a grain of salt. We will never know if that was what he actually did.
But, the more interesting thing was his last act. He visits President George W. Bush and performs a guess-a-number game. This is one of the oldest magic tricks. In fact, I was just watching an episode of Columbo, and it was part of the story, where a magician turns out to be a murderer.
In this David Blaine show, he asks the president to think of a number between 1 and 20, and he guesses it to be 17. President Bush acknowledges that he guessed right.
Wow, interesting, you might think. But, that's not the whole story. When David Blaine asked the president to think of a number, I thought of a number too. A random number between 1 and 20.
That random number I picked was also seventeen.
How is that possible?
The fact is, he did not guess. He somehow seeded the number 17 into our brains. Something like the power of suggestion. Or, something like the movie "Inception". (I wonder how many people who watched the show thought of 17?) I couldn't figure out how he did it.
But, I think that is magic.
C++/WinRT (for UWP App Development)
December 28th, 2016 (Day 94): How to create UWP apps using ISO C++ on Windows 10.
I've been trying out many different C++ libraries with the vcpkg tool. Here's a collection of test apps I've tried (on GitHub): VC++ Packages.
Coming back to C++ development after a long, long hiatus, this has been a good opportunity for me to learn the commonly used C++ libraries these days. (BTW, I just learned, or re-learned/recalled, that long in C++ is not necessarily 64 bits. It is only guaranteed to be at least 32 bits; long long is guaranteed to be at least 64 bits. Albeit a trivial example, this shows the kind of issues we need to deal with in "cross-platform" languages like C++.)
One thing I've discovered from this vcpkg trial experience is an open-source library by Microsoft that wraps UWP/WinRT APIs in C++, called C++/WinRT.
The very essence of the creative is its novelty, and hence we have no standard by which to judge it. -Carl Rogers
This is, first of all, a rather interesting library. It's rather confusing to decipher all the mumbo-jumbo of Microsoft technologies these days, but as far as I can tell, WinRT is not actually built on top of .Net: it is a native, COM-based API surface that sits above the Win32 layer, and .Net languages reach it through language projections. You generally use C++ to write native apps against Win32.
Now, this C++/WinRT library projects the WinRT APIs directly into standard C++, with no managed layer in between.
Obviously, a C++ app targeting UWP via C++/WinRT will not run as a classic Win32 desktop app. It will only run in the UWP sandbox (the WinRT runtime environment). So, what's the point of this C++/WinRT library then?
When writing a UWP app, you have had to use a language with a WinRT projection. C# is an example (it compiles to the CLR/.Net). You can also use JavaScript. C++/CX, Microsoft's set of non-standard extensions to C++, is another. As I claimed before, C++/CX is not really C++ despite the (almost) identical syntax. UWP APIs have been available only through these languages.
What C++/WinRT allows you to do, I think, is to write a UWP app in standard C++ (aka "real C++"), with no language extensions, which otherwise would not have direct access to the UWP/WinRT APIs. With the C++/WinRT projection, you can now create a UWP app using WinRT APIs with ISO C++. I think that's the idea, at least...
Check out the C++/WinRT project page for more information. I'll post some sample code or tutorials once I gain more experience with this library.
Static Library over Dynamic Library?
December 27th, 2016 (Day 93): Some random thoughts on "software libraries"
I always preferred dynamic libraries (.dll or .so, etc.).
Although there is no bright-line criterion for deciding which option is better, static vs. dynamic, and it's really case by case, dynamic libraries tend to have more advantages in general.
I believe life is an intelligent thing: that things aren't random. -Steve Jobs
If you are a library vendor, for instance, then a dynamic library makes more sense. Suppose that you use MFC (or whatever the common and popular framework is these days) in your application; then using it as a dynamic library is almost a must. A user's system will have one copy of the MFC DLL (or a set of particular versions of it), say in a system folder, and all apps that use MFC can share (a certain version of) the MFC that is already on the system. The apps that use MFC can number dozens or even hundreds on any given computer. This is why virtually all system libraries are built as dynamic libraries, regardless of the operating system.
On the other hand, if you develop libraries that tend to be used only by your own apps or by a few others, static libraries seem to make more sense, unless you have a rather complex library structure. A reusability requirement (reusable by code that hasn't been written yet, for example) generally favors dynamic over static. And yet, most libraries we write tend to have a rather small set of users (e.g., other developers in a team, or across teams in the same organization, etc.), and I think static libraries suit that case better.
The problem with static libraries is the potential duplication/bloating of code. Suppose that your app uses two libraries (dynamic or static), X and Y. X and Y, in turn, use libraries A, B, C and B, C, D, respectively. If these libraries are linked statically into X and Y, your app will end up containing two copies of B and C (or, of whatever object files you end up using in your app). I haven't actually tested this theory, but I think that's how linking is supposed to work. On the other hand, if you used dynamic libraries A, B, C, and D, then you won't have that kind of code duplication problem.
When I create libraries, I almost always create dynamic libraries. The primary point of writing a library (rather than including everything in one app/exe) is to make it reusable (by my other apps or by others). Hence, it's in general a better option to create a dynamic library.
Recently, I've been thinking about this, and I'm gradually realizing that maybe a static library should be the default option. You should in general create a static library unless you have a strong reason to prefer a dynamic library. Just a hypothesis...
Modern platforms, especially mobile operating systems, are rather different from the ones I was accustomed to when I started programming. On Android, for instance, apps are completely siloed. Sharing libraries between apps is not even an option. The same goes for newer platforms like universal Windows (UWP). There are certain exceptions in which you can share certain code between apps from the same publisher, etc., but it is rather limited. I hear (although I have little experience with iOS development) that you cannot even create a dynamic library on iOS. (Depending on how you define these terms, that is also true for Java/Android, .Net, and UWP. This distinction, dynamic vs. static, seems to make sense only on "native" platforms.)
Obviously, dynamic libraries provide other benefits as well. For example, suppose that your app uses 10 dynamic libraries. At any given moment, only 3 may have been loaded into memory. Dynamic libraries can be loaded and unloaded while the app is running, reducing the memory footprint. Static libraries, on the other hand, are linked into the app; they are just part of the app. There is no way to unload parts of a static library at runtime. On the flip side, only the parts of a static library that are actually used get included in the app during the link process, which is an advantage over dynamic libraries. A dynamic library, by its very definition, has to include everything, because there is no telling in advance which classes or functions a client app might need.
One other advantage of dynamic libraries (among the many I am not going to mention in this post) is dynamic versioning. In some cases, you can release a new version of a dynamic library (which, say, fixes a bug, or increases performance) without having to recompile the app(s).
Despite all this, I recently had an epiphany of sorts, and I started preferring static libraries. First of all, many advantages of dynamic libraries may not be that great in your particular case. Unless you are a library vendor, the "potential" benefit of using dynamic libraries is relatively small. You are already getting many benefits just by using a library, dynamic or static, which can be reused (either at compile time or at runtime).
Using static libraries has some advantages of its own. For instance, simplicity of deployment: no (runtime) dependencies, etc. Depending on the runtime, this can be a huge advantage. Everything you need is included in your own exe, and you do not need to worry about where to find dependent library modules. (Just to be clear, it's really the runtime's responsibility to ensure this, but when your app does not run on a particular user's computer, it does not matter whether it's the runtime or you that's to blame.)
While experimenting with Microsoft's new package management system, vcpkg, I also realized how much easier it is to just link static libraries.
Anyways, just a random thought... I doubt that I will switch to static libraries exclusively any time soon.
Just a random thought... on a cold December night...
Back to C++ (And, “Value Type Based Programming”)
December 26th, 2016 (Day 92): Getting back to C++ development after a long hiatus, and feeling nostalgic...
My first real programming language was Fortran. As an engineering student, you were supposed to use Fortran (short for "Formula Translation"). If you were a computer science major, you might have started with the C language. (It was before Pascal, before Java, and before Python.) I don't know if people still use Fortran these days (I am sure they do), but I have not used it (like, "never") for almost 30 years. BTW, when I first learned Fortran, I used it on a mainframe with punch cards. The Fortran syntax owes a lot to this punch card-based input mode (e.g., 80 columns), I think. (As a freshman, I didn't have the privilege of using "teletype terminals".) You create a stack of punch cards and submit it to the computer center. Then, after a few hours, you get a bunch of printouts back with your program's output. That's how we programmed in the early days. "Dumb terminals" became more accessible in the early 80's, at least in the places where I learned programming.
(BTW, some of our favorite programs to write at the time were "ASCII art". There was no such thing as a graphical user interface at the time. And, it was pretty fun to print out all sorts of "low-res images" made of characters on dot matrix printers.)
Every act of rebellion expresses a nostalgia for innocence and an appeal to the essence of being. -Albert Camus
Fortran was the first high-level language to see wide adoption (invented by John Backus at IBM in the early 50's). Its success prompted the invention of a plethora of languages like Cobol and C, and all the other languages that followed. I started using C in the late 80's simply because it was new (to me). C (the language after "B") was originally developed by Dennis Ritchie at Bell Labs, primarily as an implementation language for Unix. I was doing mostly scientific computing (like numerical simulations), and C did not provide many advantages over Fortran. If anything, all the existing code and libraries (that I had to reference or borrow) were written in Fortran, so C had significant disadvantages. But, I liked C. I liked its "simplicity". I studied and learned C from the original "The C Programming Language" book by K&R. It was like my bible. C was very different from the languages I knew at the time, like Fortran and Basic, and I found it rather difficult to learn. I considered it a challenge to become a good C programmer.
C's success was clearly due in part to its access to low-level system features like memory, which turned out to be a curse and a blessing at the same time. (I am not sure whether the "pointer" was first introduced in C, but it was certainly new to me.)
C++ was invented, initially, as a "better C", in the 80's. I didn't know about it until the second edition of Bjarne Stroustrup's book "The C++ Programming Language" came out (around 1990). At the time, object-oriented programming was such a hot buzzword, and C++ took off like a rocket. Virtually all companies started converting their C-based codebases to C++. A lot of system software, even operating systems, originally written in C was converted to C++, or wrapped in C++ APIs. All "new" software was written in C++, or exposed a C++ interface, like X Window (with which I spent quite a bit of time learning GUI programming). Of course, C++ was not the only OOP language invented, but backward compatibility with C was such a big advantage, and in my view it largely accounted for C++'s unique success. Bjarne's book became my next "bible".
When I started using C++, there were no good C++ libraries. You just used the C libraries (other than maybe some basic libs like iostream). (That's also the brilliance of C++: you could just use C's existing infrastructure.) I don't remember when the STL was first created, but it was pretty early. However, there were no good STL implementations (covering containers and algorithms, etc.) for a while; I think good STL libraries became available only in the late 90's. I used C++ as my primary language for over a decade. From the mid 90's, I started using Java more and more, and Java became my primary language around 2000. (I also started dabbling with many different programming languages at the time, like Tcl/Tk, C#, etc.) The last time I used C++ for any serious project was 2003 or 2004, if I remember correctly. Since then I have mostly used Java, C#, and Python, and other "modern programming languages". Even PHP and Perl, but not much C++. My C++ knowledge was really from pre-C++98, and hence, although I used C++ for over ten years (probably something like 15 if I include C) and I thought I was rather good, I have not considered myself a C++ programmer since then.
C++ was a dead, and ugly, language in my mind. (Like, "Fortran" was a dead language to me.) I am relatively comfortable using various "modern" (therefore, somehow "better" in my mind) languages at this point. Why would I ever go back to C++? Ever?
Well, you never know.
Because of projects at work, I just started using C++ again. I've been using C++ for about a month or two now, almost exclusively. (Or, a little bit longer, if I include my "hobby" projects that somehow rely on C++, which I've been working on, or merely thinking about working on, for almost a year now.)
In my view, C++ is still, in terms of ecosystem and infrastructure, a bit behind other modern programming languages. But the language itself, say, in terms of grammar, has been rather "modernized", with recent innovations like C++11 and C++14, and the upcoming C++17. Having used C++ a bit for the last several weeks, I now feel like you can really write clean and safe code in C++. Don't get me wrong. C++ is an extremely complicated and messy language with a lot of historical "baggage" (much of which it inherited from its parent, "C").
But, if you really understand the essence of "Modern C++", it shouldn't be too difficult to use C++ to create "modern software", if you are a reasonably good developer.
One thing to note is that C++ is a very different language from other "modern C-style languages" like Java and C#, although they borrowed many concepts from C++. Obviously, that practice goes both ways, C++ adopting constructs from other languages, and vice versa. The end result is that these languages are getting more and more similar to each other. (For example, all these languages support generics, etc.)
But, there is one fundamental difference.
Most modern "safe" languages like Java and C# primarily use reference types. In Java, only the built-in primitive types are value types. In C#, although you can create custom value types (struct), the language system is optimized for reference types. In case you are not familiar, objects of a reference type are created on the heap, and typically the memory is managed by the system (e.g., through garbage collection, etc.). On the other hand, instances of value types are created on the stack. (Just to be clear, a type in C++ can be used like a value type or a reference type or both at run time depending on how you have designed the type. This is not the case with C#, for instance, in which the distinction is fixed at coding/compile time.)
C++ is a fundamentally value type based language. To reiterate, the language itself does not impose any restrictions. You can create a value type or a reference type (with the help of some language constructs, and following some conventions, etc.), or even something in between. But, fundamentally, C++ is meant to be used primarily with value types. If you are not a C++ programmer, or if you are a so-so C++ programmer who does not know what you are doing :), then this may sound a bit counter-intuitive. But, C++ has always been a value type oriented programming language (with copy constructors and copy assignment operators, etc.). The "modern" variant of C++ makes it even more so (with the addition of move semantics, etc.). Are you still using raw pointers? Shame on you. Well, clearly, there are times when you will need to use raw pointers, and new and delete, but these are largely obsolete. Most of the time, you should write your code in terms of value types, which makes it unnecessary to use raw pointers in the first place. (Without going into too much detail, C++ provides "references" in addition to pointers. (That is, you don't need to use pointers for polymorphic behavior.) And, even for pointers, we now mostly use smart pointers. BTW, there are some subtle differences in terminology: In C++, a reference is really like an alias (ironically, not limited to reference types), whereas in other reference semantics languages like Java and C#, the term "reference" really refers to a pointer, or a safe version of a pointer managed by the system. You cannot directly access the memory location using these references (unlike C pointers), but they are nonetheless pointers conceptually. A reference ("&") in C++ is a fundamentally different concept. C++ supports both types of references (pointer/reference and &-reference). The same with C#, to a certain extent. In C#, you can pass arguments "by reference", for instance, just like in C++.)
In Java, a programming style of primarily using value types is impossible. The language simply lacks the facility for it. (Just to be clear, however, I haven't been following recent changes in the Java programming language. So, things might be a bit different now in this regard, although I highly doubt it.) In C#, a programming style using value semantics is hard. Very hard. (For various reasons, which I don't want to get into in this post.) Regardless of which style is better, you are limited in those languages in terms of your choice. I am not sure how easy it is to do functional programming with Java, for instance, but the language is not designed for such a programming style, and hence we can safely assume that it won't be easy (again, with the caveat that my knowledge of Java is a bit outdated). It's rather hard to do true OOP using JavaScript, as another example, since the language was not created for such a programming style in the first place.
C++ supports both paradigms, reference semantics and value semantics programming styles. You can choose one or the other, or anywhere in between. As stated, although C++ has primarily been used with value types, you can also use it primarily, or even exclusively, with reference types. You can use C++ almost like Java or C#. (I say "almost" because doing so in C++, especially in an "old" style (say, pre-C++11), is generally not "safe", unlike in Java or C#.)
But, why would you?
In my view, a reference type based programming style is messy at best. If you don't realize this, it is probably because you are so used to such programming styles. Virtually all modern programming languages encourage this style of programming as if it were a good thing. (BTW, the design of this type of language, like Java, C#, Python, etc., is largely influenced by the concern for resource leaks. When you create a reference type object, you don't have to worry about memory leaks, etc. in these languages. Garbage collection, for instance, takes care of object cleanup on behalf of the developers. This can be a good or bad thing depending on your perspective. I think it's mostly a bad thing. Imagine (and, try visualizing) a heap littered with dirty objects. The heap is really a "garbage can", literally. Developers throw all this garbage into the heap without thinking, and they do not (and cannot) clean it up. Do you still think it's a good programming style? BTW, just to be clear, I am not claiming that you can program in C++ without using the heap. Far from it. But, in C++, resources are managed via "scoping", a fundamentally different way from those used by other modern "safe" languages. A much, much cleaner way, if you ask me.)
Modern C++ now has many language facilities that help developers deal with resource management, etc., and you could use a primarily reference semantics based programming style. You could. But, still, why would you? Unless you are doing a certain low-level type of programming where value semantics is not appropriate, I think the primarily value type based programming style is fundamentally a "better" way.
And, C++ is probably the only language that allows you to do it.
For this reason alone, I think you should consider modern C++ as your next programming language.
(BTW, butchered C++ like Microsoft's C++/CX is not a true C++ language in this regard despite the fact that its syntax is closely related to ISO C++. .Net is a fundamentally reference-based system, and the only way you can use any language on .Net/CLR, C++ or C# or JavaScript or whatever, is to abide by the rules imposed by their primarily reference-based runtime. All CLR languages are essentially the same except for their superficial syntactic differences.)
If you don't know what I am talking about, or if you don't know what the "value semantics based programming" is (which by the way is my invention, because there is no standard term for it, as far as I know), then you should learn modern C++ just to understand what that is. :)
harrysmemo · 8 years ago
Text
Columbo
December 25th, 2016 (Day 91): Merry Christmas!
One of my favorite TV shows I watched when I was growing up was "Columbo", a detective story.
Maybe Christmas, the Grinch thought, doesn't come from a store. -Dr. Seuss
The fictional character, Lieutenant Columbo, played by Peter Falk, is a homicide detective at LAPD.
The interesting thing about Columbo is that the murders in the story are generally committed by the rich and famous, and powerful. Like, celebrities, politicians, and business moguls, etc. Many criminals in Columbo think that they are smart enough to fool the police. Many of them are over-confident, to the point of arrogance. Of course, the story does not usually end in their favor. I don't know if that is indeed true in the real world, but at least that's the theme in Columbo.
A humble police detective in old clothes driving an old car beats these smart, rich, and powerful people, many of whom think that they are above the law.
I cannot believe that it's been almost five decades since the first episode of Columbo aired. (Incidentally, the first regular-season episode was directed by Steven Spielberg.)
Anyways, you can watch all seven seasons on Netflix until the end of this year. A few more days... Ready for binge watching? :)
Merry Christmas!
harrysmemo · 8 years ago
Text
The Little Cookie Girl
December 24th, 2016 (Day 90): Girl Scout cookies and child labor, etc.
I was at a coffee shop a few days ago, reading, and minding my own business.
Then, I heard a small mumbling sound, and I looked up. There was a tiny little girl holding what appeared to be boxes of cookies. She was selling them. She seemed about 6 or 7 years old. She could have been a bit older, or just small for her age. I don't know. In any case, she didn't look old enough to even cross a street without adult supervision.
And yet, she was there in a coffee shop selling cookies.
Blessed are they that mourn: for they shall be comforted. -Matthew 5:4
It's not uncommon to see girls (more like, teenagers) sell cookies on the street, or sometimes door to door, in the name of fundraising. I am not going to pass judgements. It is, in many ways, part of American culture. There are pros and cons to such practices. I can see both sides of the argument.
But, a child, a little child, selling cookies to strangers?
Is this really acceptable?
There was a news story a few days ago, in which several people were arrested for keeping a small child (3 years old) inside a horrible bug-infested box: "I've seen people treat their animals better."
What's the difference? An adult, probably the girl's parent(s), was forcing her to sell cookies, which is essentially panhandling. Is this acceptable?
In some third world countries, children have to go out and work in order to eat. If you don't work, you cannot eat. People even sell their kids for money. Unfortunately, poverty makes people do horrible things. I am not going to pass judgements. But, here in America?
When I saw the little girl selling cookies, at first I didn't know what to make of it. I was in shock. I had never seen anything like that in my life. It took me a few minutes to realize what was happening, and by the time I decided I should talk to her parent, or whoever was forcing her out on the street like a circus monkey, it was too late. They were gone.
I remembered a story I read when I was little. It was such a sad story.
The Little Match Girl, by the Danish writer Hans Christian Andersen.
God bless the little cookie girl...
harrysmemo · 8 years ago
Text
Finally, VCPKG!
December 23rd, 2016 (Day 89): New Visual C++ package management system on Windows.
C++ is an "old" language with a long history. And, it has a lot of baggage. The modern C++ dialect (e.g., C++11 and C++14, often dubbed, what else?, "Modern C++") has many improvements, but it still has a lot of problems as well. One major issue, in my view, is that it has no unified package management system.
All modern programming languages have some type of package management system. Python has pip. Ruby has RubyGems. C# (and .Net in general) has NuGet. Virtually every language you use, or know of, these days, Java, Node.js, you name it, has a central package management system. Even Perl has CPAN (which has been around probably longer than almost any of them).
C++ does not have one.
We cannot solve our problems with the same thinking we used when we created them. -Albert Einstein
C++ is a truly cross-platform language. (All modern languages run on a single platform/runtime, strictly speaking, which is ported to different operating systems, etc. This is a bit different from the case with C or C++.) This in some sense makes a single unified package management system much harder to achieve (although not impossible). The binaries built for Windows, for instance, are not compatible with those built for Mac.
This lack of a package management system, I think, is a major obstacle to wider use of C++. How much time are we, collectively, wasting building the same software again and again (e.g., by different developers)? How much time are we wasting figuring out which libraries depend on which libraries, etc.? It's completely unnecessary (most of the time), and it's just wasted time and resources, ultimately increasing the cost of developing C++-based software.
To be fair, C++ has package management systems, sort of, different ones on different platforms. If you use Mac, then you will probably use Homebrew. If you use Linux (say, Debian variant), then you will probably use apt-get, etc.
If you use Windows, then ... huh? What?
Windows has been the worst platform when it comes to developing C++ software. It's ironic. Microsoft has been a pioneer in C++ dev tools. I remember when I started using "Microsoft C 1.0" decades ago, which subsequently became Visual C++. I used MFC, arguably the best C++ toolkit/library at the time, for many years. Thousands of developers still use Visual C++. I use Visual Studio every day. And yet, there was no good package management system for C++ on Windows.
Recently, Microsoft tried extending NuGet, which was originally created for the .Net platform, to support C++. I created some C++ NuGet packages myself. But, tooling support for creating C++ NuGet packages is virtually non-existent, and NuGet still lacks widespread adoption in the VC++ community for various reasons.
Then, I just learned that the Visual Studio team started a new initiative for C++ package management, called vcpkg, a few months ago. It's targeted at Visual C++ on Windows only at this point, but I see the potential for it to become a truly cross-platform package management system some day, depending on how committed Microsoft is. (BTW, tools like Homebrew and apt-get, and Chocolatey on Windows, are general purpose software management systems, used to install/manage system software and applications, etc. They are not really developer tools, if you are picky.)
Currently, many widely used software libraries have already been "ported" to vcpkg, and I see no reason not to give it a try at this point if you are a C++ developer on Windows. (Or, if you do cross-platform development using cmake on Windows.) Obviously, in some sense, it competes with NuGet, and there is a risk. It may be "dead" before we know it. But, alternatively, if you think about it, there is room for two package management systems. NuGet can be used exclusively for .Net and .Net Core, and for native software development using C++, we may end up using vcpkg (possibly across different platforms). We will see.
There is always a cost in living on the bleeding edge, but if you are brave enough, then you can start using vcpkg by cloning the GitHub repo:
Microsoft/vcpkg
Once you build the vcpkg executable following the simple instructions on the GitHub page, try
.\vcpkg help
to get started.
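For example, a typical first session might look something like this (run from the vcpkg folder in PowerShell; `zlib` is just an example package name, check `search` output for what is actually available):

```shell
# Look up a library in the catalog, then build and install it.
.\vcpkg search zlib
.\vcpkg install zlib

# Make the installed libraries automatically visible to Visual Studio projects.
.\vcpkg integrate install
```

After `integrate install`, headers and libs from installed packages are picked up by MSBuild projects without any per-project configuration.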
harrysmemo · 8 years ago
Text
Visual Studio Tool for CMake (VS 2017 RC)
December 22nd, 2016 (Day 88): Have you tried Visual Studio 2017 yet?
The upcoming Visual Studio 2017 release includes an interesting feature where you can build, and remotely debug, Linux builds from Visual Studio on Windows. I haven't tried it yet (and I am not entirely sure if it is really what I think it is), but considering Microsoft's recent strategy to make Windows a universal dev platform, I wouldn't be surprised if that is the case. I'll give it a try some day. Maybe.
What's more interesting is that VS2017 now includes support for cmake-based development (for C++).
The true sign of intelligence is not knowledge but imagination. -Albert Einstein
In case you don't know what it is, cmake is a cross-platform build system generation tool. For example, from a single cmake project file, you can generate makefiles for Unix and Visual Studio solution files for Windows, etc. CMake currently supports all major build tools and platforms, including Cygwin on Windows and Xcode on Mac, etc.
CMake-based development is a two-step process. You generate a set of build files first (for the target platform), and then use the generated build files to configure, compile, install your software. This is not the most convenient dev process, especially in the initial phase of a project where things are in constant flux.
The new Visual Studio tool for CMake allows you to generate build files and to build the code within the IDE (VS2017) on Windows, thereby hiding this two-step process from the developers.
Is it a big deal? I think so.
Now you can directly open a cmake-based codebase in Visual Studio (without VS solution or project files) and start developing as if it were a Visual Studio project. All the conveniences of VS, such as IntelliSense, etc., come with this setup. What's more, you are not tied to targeting Windows only. The software you build can be cross-platform (at least, in terms of build and configuration management). And, you do all this within Visual Studio, arguably the best IDE there is.
C++, despite many improvements in recent years, is not the best, or the easiest, programming language for development, in many respects.
Built-in support for cmake in Visual Studio is, I think, clearly a big step in the right direction.
Here's what you have to do, if you want to try it.
Install Visual Studio 2017 RC, and select the C++ Desktop app development feature. (Make sure that the VS Tool for CMake option is checked.)
Open a folder containing cmake-based project(s).
Use the menu to generate cmake cache, build projects, and run and debug the apps, and so forth. Note that you can use CMakeSettings.json, in addition to CMakeLists.txt to set build variables, etc.
That's all there is to it. Check out the series of recent VC++ blog posts if you are interested in more detail.
harrysmemo · 8 years ago
Text
“The Earth is Flat”
December 21st, 2016 (Day 87): Back to blogging and ranting...
One thing I can never figure out is how to communicate with strongly opinionated, and narrow-minded, people. How do you reason with people who believe that the earth is flat, for instance?
A lot of times, there seems to be a strong correlation between narrow-mindedness and incompetence, inexperience, and/or lack of intelligence in general. How do you convince people who believe that the earth is flat otherwise?
"And yet it moves" -Galileo Galilei
I am often advised that I should just "ignore" them. As they say, don't argue with fools. Right. But, unfortunately, there are times you cannot ignore them. For instance, when making decisions which can potentially affect me or the people around me, etc.
As the story goes, when Galileo was persecuted for his scientific discovery (e.g., that the earth moves around the Sun, not the other way around, as many people believed at the time based on their religious beliefs), he recanted his claim in open court but later said that "[the earth] does move". Even a great mind like Galileo couldn't come up with a better way to cope with narrow-minded people.
I am not an eloquent person. In fact, far from it. But, still, I have no problem communicating with reasonable people and reaching a consensus, etc. We don't have to agree on everything. We understand each other, and move on.
But, I am having an unusually hard time working, and interacting, with narrow-minded people with strong opinions. I don't know what to do with these people who have an extremely strong belief that the earth is the center of the universe, for instance. The problem is, they don't think that they are narrow-minded. There is not even room for doubt in their minds. How can you think even for a second that the earth, created by God for his children, is not the center of the universe?
How do you deal with such people?
What makes it unusually hard for me is that I am almost the opposite of strongly opinionated people. I have almost no opinions, most of the time. I think that makes it much harder for me to communicate, and interact, with people with strong opinions.
Let's suppose that there are two alternatives, A and B. It can be anything. It can be fact-related, or it can just be subjective views. Suppose that I think it's more likely A than B. Say, 70% A and 30% B (although we don't really think like this, in terms of likelihood). Now I meet a person who strongly believes that it is B. He has no doubt. How do you discuss this matter with this person and reach a reasonable conclusion? If you think about it, there is no way. He believes that it's B. I cannot argue that it's not B because, in my mind, there is always a certain possibility that it could be B. On the flip side, I cannot strongly argue that it is A since I have doubt in my mind and I am not entirely sure that it is A. There is no room for discussion and there is no compromise. It has to be B. How do you convince this person that it may not be B, without having a strong conviction yourself that it is not B?
I don't know...
Again, what makes this type of situation very difficult, at least for me, to deal with is that most strongly opinionated people often try to impose their strong views and ideas on other people. I cannot stand it. What's even worse, they are often wrong. They just insist that the earth is flat because they don't know any better. And, there is no easy way to convince these people otherwise.
They just insist and insist and insist, and I don't know what to do....
harrysmemo · 8 years ago
Text
Installing OpenCV3 on Bash on Windows
October 28th, 2016 (Day 33): Unfortunately, Bash on Windows is not ready for primetime (yet).
I mentioned that I was rather happy with Bash on Windows in an earlier post, Test Driving Bash on Windows. I have no problem using it as a simple shell (as a replacement for the Windows command window) for day-to-day tasks. But, as it turns out, there are a lot of limitations. I just tried to build and run OpenCV on Bash on Windows, and it didn't work:
error while loading shared libraries: libopencv_core.so.3.1: cannot enable executable stack as shared object requires: Invalid argument
There are definitely problems. This specific problem is not due to system call issues, but I hear that only about half of the Linux syscalls have been implemented on Bash on Windows so far, implying that a lot of Linux programs will not run properly on Bash on Windows (regardless of how accurate that information is).
In any case, that's the price you pay for living on the bleeding edge.
We must accept finite disappointment, but never lose infinite hope. -Martin Luther King, Jr.
If you still want to try and build OpenCV on Bash on Windows, the steps are exactly the same as when you build OpenCV on Ubuntu 14.04 LTS (or, more or less on any Linux system).
First, install all build tools and necessary libraries:
sudo apt-get install -y build-essential cmake
sudo apt-get install -y zlib1g-dev libjpeg-dev libwebp-dev libpng-dev libtiff5-dev libjasper-dev libopenexr-dev libgdal-dev
sudo apt-get install -y libdc1394-22-dev libavcodec-dev libavformat-dev libswscale-dev libtheora-dev libvorbis-dev libxvidcore-dev libx264-dev yasm libopencore-amrnb-dev libopencore-amrwb-dev libv4l-dev libxine2-dev
sudo apt-get install -y libtbb-dev libeigen3-dev
Next, clone OpenCV from the official GitHub repo.
git clone https://github.com/opencv/opencv.git
The most recent release was v3.1, and the current codebase is based on v3.1, obviously. If you want to build v3.1, then you can download the v3.1 archive from https://github.com/Itseez/opencv/archive/3.1.0.zip. Or, you can just use the "3.1.0" tag:
git checkout 3.1.0
Then, run cmake. This is often done in a "build" directory:
mkdir build
cd build
cmake -D CMAKE_BUILD_TYPE=RELEASE -D CMAKE_INSTALL_PREFIX=/usr/local -D WITH_TBB=ON -D WITH_V4L=ON -D WITH_FFMPEG=OFF -D WITH_OPENGL=ON -D BUILD_EXAMPLES=ON ..
where the ".." indicates that the build dir has been created within the repo root folder in this example. You can use a different set of flags depending on your needs. For example, if you plan to use Qt, then you can set -D WITH_QT=ON, etc.
Next, you can build OpenCV using the generated makefiles:
make -j4
sudo make install
Build outputs, libs and include files, are installed under CMAKE_INSTALL_PREFIX. Add the lib directory to your library search path and refresh the linker cache. For example,
sudo /bin/bash -c 'echo "/usr/local/lib" > /etc/ld.so.conf.d/opencv.conf'
sudo ldconfig
That's it. But, when you try to run a sample program, you will realize that things aren't actually working, as mentioned in the beginning:
harry@MAUI:/..$ ./cpp-example-facedetect ../data/lena.jpg
./cpp-example-facedetect: error while loading shared libraries: libopencv_core.so.3.1: cannot enable executable stack as shared object requires: Invalid argument
This error appears to be fixable using execstack:
sudo apt-get install prelink
sudo execstack -c /usr/local/lib/*opencv*.so*
However, you still cannot run these examples, because most of the sample code requires a GUI, which Bash on Windows does not support.
Well, it's the first step. Hopefully, we can run all Linux programs on Windows some day.