I think a lot of people don’t even start learning to code because they think they need a four-year degree in computer science to be useful.
This is false. Five minutes of programming now can save you hours of mundane grunt work later. More »
When I enrolled as a freshman in college, I registered as a linguistics major, but I had a notion that I would minor in computer science. Computer science seemed interesting and well-paying, and I didn’t even know computational linguistics was a thing at the time. I just liked computers. I never had a problem switching between Macs and PCs. I liked to peek inside computers and replace the RAM and things like that. I had poked around with HTML editors. The classes on things like graphic design and artificial intelligence seemed really cool.
I looked up the prerequisites and found that to minor in CS you had to actually get pretty far in math, at least through Calculus C and one or two courses of Linear Algebra. So, naturally, I signed up for Calculus A my fall term.
When I built my computer, I had no problems at all. I mean, sure I was confused or puzzled by a few steps, and I was certainly challenged by the process, but I was never frustrated. All in all, it was a very empowering experience. I learned a lot and I was really successful.
And then I decided to upgrade. When I bought my graphics card, I made the mistake of not researching the compatibility of graphics card drivers with Ubuntu. It caused a whole weekend of headaches, troubleshooting, and frustration before I got it working again.
So did I learn my lesson about doing research before making major changes to my computer? Apparently not.
I bought one of the best GPUs I could get for under $100. I determined this by using the benchmarks-per-dollar chart here and by watching Black Friday sales closely. Beyond that, the only things I took into account were the games I wanted to play and whether it fit my motherboard. Okay, so I did my research; I just didn’t take into account that I’m running Linux and not Windows.
Big mistake.
I’m finally getting into GitHub, partially thanks to Coursera’s Data Science specialization, which requires it. Anyway, I blogged about my twitter bot, @AllTheLanguages, here and here, and now you can download, fork, watch, star, or whatever it is that kids do with code on GitHub, here.
A few weeks ago, I posted about how to build a twitter bot. I wish it stopped there. Unfortunately, all code has bugs (ahem, I mean, features). There are two bugs in my code. The first one I understand, and could probably fix if I tried, but I haven’t because it’s probably more trouble than it’s worth. The second one I don’t understand, but I do know how to fix it. More »
I started reading a book about artificial intelligence. It’s an older book, but only $4 and it came highly recommended as a starting point, since a lot of the basic concepts are still the same. Based on the things I was reading and this xkcd, I figured it might be within my capabilities to write a program that plays Tic Tac Toe. And I have. Sort of. You can play it here. Kinda.
See, the concepts of artificial intelligence and the basics of programming aren’t so hard. What’s hard is making it work in the “real world.” More »
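The post this excerpt links to isn’t included here, but to illustrate the kind of thing it’s describing, here is a minimal sketch of a Tic Tac Toe player using minimax — a standard beginner AI technique, not necessarily the approach the linked post takes. The board representation and function names are my own assumptions, not code from the post:

```python
# Minimal minimax Tic Tac Toe sketch (illustrative, not the post's code).
# The board is a list of 9 cells: "X", "O", or " " for empty.
WIN_LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),   # rows
             (0, 3, 6), (1, 4, 7), (2, 5, 8),   # columns
             (0, 4, 8), (2, 4, 6)]              # diagonals

def winner(board):
    """Return "X" or "O" if someone has three in a row, else None."""
    for a, b, c in WIN_LINES:
        if board[a] != " " and board[a] == board[b] == board[c]:
            return board[a]
    return None

def minimax(board, player):
    """Return (score, move) from `player`'s view: +1 win, 0 draw, -1 loss."""
    w = winner(board)
    if w:
        # The previous player just completed a line.
        return (1 if w == player else -1), None
    moves = [i for i, cell in enumerate(board) if cell == " "]
    if not moves:
        return 0, None  # board full: draw
    opponent = "O" if player == "X" else "X"
    best_score, best_move_found = -2, None
    for m in moves:
        board[m] = player
        score, _ = minimax(board, opponent)
        board[m] = " "  # undo the trial move
        score = -score  # the opponent's best outcome is our worst
        if score > best_score:
            best_score, best_move_found = score, m
    return best_score, best_move_found

def best_move(board, player):
    """Pick the optimal move for `player` on the current board."""
    return minimax(board, player)[1]
```

On a 3×3 board the full game tree is tiny, which is exactly why Tic Tac Toe is the classic first AI exercise: the whole “intelligence” fits in a few dozen lines, and the hard part really is the “real world” plumbing around it.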
A few months ago, I created a bot on Twitter. @AllTheLanguages tweets a new language from the Ethnologue database once every hour or so, and will do so for about a year. Give or take. Sometimes the bot goes down and I have to reboot it. And there are some other bugs too. But more on that in another post…
When I tell people that I made a twitter bot, the first thing they ask (after “why?”) is “how?” Well, today, I’m going to answer that! Why? Because it was fun! How? Well, it’s complicated… More »
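The “how” lives in the linked post, but the core idea of a bot like this can be sketched in a few lines: loop over a list of language names and post one per hour. This is my own minimal sketch, not the bot’s actual code — the message wording is invented, and the posting function is injected as a parameter (in production it could be something like tweepy’s `API.update_status`):

```python
import time

def run_bot(languages, post, interval=3600, sleep=time.sleep):
    """Post each language name in turn, roughly once per `interval` seconds.

    `post` is any callable that publishes one status string (e.g. a
    tweepy API method). Injecting `post` and `sleep` keeps the loop
    testable without touching the real Twitter API.
    """
    for name in languages:
        # Hypothetical message format, not the bot's real wording.
        post(f"Language of the hour: {name}")
        sleep(interval)
```

With roughly 7,000 languages in the Ethnologue database and one tweet an hour, a loop like this runs for most of a year — matching the “about a year, give or take” lifespan mentioned above, minus the reboots.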
Boo is boolean, apparently.