
Hackanooga

Over the past weekend (September 14–17, 2012) I had the opportunity to attend a hack-a-thon in Chattanooga, Tennessee, aptly named “Hackanooga”. The event spanned 3 days and brought developers from across North America together to utilize Chattanooga’s 1 Gbps network. Various teams were formed (~12 I think) based on interests and technical strengths, and the hacking began shortly after. It was an amazing, hacking-filled weekend, full of great food, awesome people, and great demos from every team. It’s worth mentioning how well organized the event was: everything went off flawlessly, interruptions were kept to a minimum, and everyone tried to stick to the “less yack, more hack” motto. In terms of what I got done over the weekend, I had the opportunity to work on a few different things: writing a small patch for Andor Salga’s point-stream project, integrating Popcorn Maker into the Big Blue Button project, and helping Alan Kligman get his demo ready using Popcorn.js (the GPS data seems to be a tad off, but it’s still really cool!).

The patch I wrote for Andor solved an issue he was having with limiting the amount of point-cloud data he was pushing across his node.js server. He wanted to allow the client to change the amount of data they received (which would in turn provide a clearer image) in case they were on a slow connection. By being able to specify the amount of data, the user could experience the project even if they weren’t on a fast network, and the site would still seem responsive instead of lagging in an attempt to keep up with the amount of data being received. The patch essentially took only a percentage of each point-cloud frame (randomized, as the data is originally stored from the top of the image down) and pushed it to the client. I set up a small slider on the client side to let the user control the amount of data, which updated a server variable via socket.io. Below you can see a small screencast of it working, and you can find the code here:

(https://github.com/dseif/bigbluebutton).
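The downsampling idea can be sketched roughly like this. This is a hypothetical reconstruction, not the actual patch: the function name and data shape are made up, and the socket.io wiring is only described in the comment.

```javascript
// A socket.io handler on the server would update `fraction` whenever the
// client-side slider moves, e.g.:
//   socket.on( "set-quality", function( f ) { fraction = f; } );

// Take a random subset of a point-cloud frame, so the reduced image is an
// even sprinkling of points rather than just the top rows (the raw data
// is stored from the top of the image down).
function sampleFrame( points, fraction ) {
  var clamped = Math.max( 0, Math.min( 1, fraction ) ),
      keep = Math.floor( points.length * clamped ),
      sampled = [];
  for ( var i = 0; i < keep; i++ ) {
    sampled.push( points[ Math.floor( Math.random() * points.length ) ] );
  }
  return sampled;
}
```

The server would then push `sampleFrame( frame, fraction )` to the client instead of the whole frame, so a slow connection simply gets a sparser image.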

In addition to Andor’s project, I mainly focused on integrating Popcorn Maker into the Big Blue Button project, which took up the majority of the weekend. I had the chance to speak with Fred Dixon of Big Blue Button before I went down to Chattanooga, and he was aware of my experience with Popcorn.js and Popcorn Maker. He asked me if I would be interested in working with him and his team to allow their clients to remix published meetings from BBB (Big Blue Button). I began by modifying the BBB exported project template to include the attributes and scripts necessary for it to work in Popcorn Maker. So that the Popcorn Maker template wouldn’t affect existing projects, I created a second template, remix.html, that would live alongside the current template and only be used when editing the meeting with Popcorn Maker. Another hurdle I had to deal with was the fact that all of the exported meetings had their events stored in XML files and had no usable plugins for Popcorn.js (they were all using a code plugin, which runs an arbitrary piece of code at a given time). I went back and refactored these pieces into reusable plugins, but still had an issue interacting with the XML data, as Popcorn Maker doesn’t currently have a way to handle data in that format. In an effort to get things working quickly (we only had ~36 hours in total!) I produced a dirty hack that would XHR the data when parsing a saved-data.json file. I added a property to the saved-data.json to notify Popcorn Maker that we needed to parse an XML file for this project. Another issue was that each project’s XML data was stored in a different folder location on the server, so I had to modify Popcorn Maker to take a meeting id as a query-string parameter and find the appropriate data on their server. On top of all this, logging in and saving using our node.js server was a headache.
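The query-string part of that hack might look something like this. To be clear, the parameter name `meetingId` and the path in the comment are illustrative guesses, not the real BBB or Popcorn Maker code:

```javascript
// Pull a meeting id out of the query string so Popcorn Maker can locate
// that meeting's events XML on the BBB server.
function getMeetingId( search ) {
  var match = /[?&]meetingId=([^&]+)/.exec( search );
  return match ? decodeURIComponent( match[ 1 ] ) : null;
}

// e.g. on the remix.html page:
//   var id = getMeetingId( window.location.search );
//   then XHR something like "/recordings/" + id + "/events.xml"
//   (hypothetical path) and convert the XML events into trackevents.
```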

The main reason interacting with our node.js server was a problem was that the Big Blue Button server served up files on a different port and location than our node.js server. This meant that whenever I tried to log in, I would get an error letting me know that the domains didn’t match because of the port mismatch, and therefore I couldn’t access anything on the Popcorn Maker node.js server. I fought with this for a long time, but eventually, around 3am, figured out that I needed to alter the response headers to accept cross-origin requests. On Sunday morning I was in pretty good shape and managed to modify our embed files for Popcorn Maker to work with the BBB template. The exported page used the same controls we currently use in Popcorn Maker, overlaid on the published meeting HTML that the user modified. In addition to the two custom plugins (chat and slide), various other Popcorn Maker plugins were kept in, such as webpage, image, and Google Maps. The project seemed to be a pretty big success, as Fred and the BBB team have plans to integrate it into their project. Below is a short screencast of what I created.
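The header change boils down to something like the following. This is a minimal sketch rather than the actual server code, and the origin value is a placeholder for wherever the BBB pages were served from:

```javascript
// Without these headers the browser blocks the login/save XHRs, since
// the BBB pages and the Popcorn Maker node.js server sit on different
// ports (which counts as a different origin).
function allowCrossOrigin( res, origin ) {
  // A specific origin is required here; the "*" wildcard isn't allowed
  // when credentials (login cookies) are involved.
  res.setHeader( "Access-Control-Allow-Origin", origin );
  res.setHeader( "Access-Control-Allow-Credentials", "true" );
  res.setHeader( "Access-Control-Allow-Headers", "Content-Type" );
}
```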

About an hour before we were going to present our demos, Alan Kligman approached me and asked if I wanted to hack together a demo with him using Popcorn.js. After Alan explained what their project was and what they were looking for (they stored various geolocation data while a user took video with their phone, which was then uploaded and played back with Popcorn.js), we began to hack something together. Since Alan had so much time-coded geolocation data, we decided to plot Alan’s location from the video on a map as he walked along a street. Alan pointed me at all of the information I needed, organized it into a few usable scripts, and we were off. It’s funny, because the hardest and most time-consuming part of this whole hack was attempting to get elements positioned correctly on a page using CSS. Go figure that the biggest bottleneck for two programmers was CSS 😐 Take a look at the screencast of the demo below.
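The core of a hack like this is just mapping the video’s current time onto the nearest GPS sample. The sketch below assumes a data shape (`{ time, lat, lng }` samples sorted by time) that I’m making up for illustration; in Popcorn.js terms it would be driven from a `timeupdate` listener or cues:

```javascript
// Find the last GPS sample at or before the video's currentTime.
function positionAt( samples, currentTime ) {
  var current = samples[ 0 ];
  for ( var i = 0; i < samples.length; i++ ) {
    if ( samples[ i ].time <= currentTime ) {
      current = samples[ i ];
    } else {
      break;
    }
  }
  return current; // plot { lat, lng } as the map marker position
}
```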

The weekend overall was a success, which is evident from the attention it got from the media (close to 100 people showed up for the demos, I believe) and the mayor of Chattanooga; it even got featured on the news. It was loads of fun, and a lot of cool projects and demos were produced from everyone’s hard work over the weekend. If you haven’t been to Chattanooga before, I recommend checking it out, as the town is absolutely beautiful and the people there are amazing! Here are some pictures from the weekend that I took:

Popcorn Maker – Mobile Edition

About 2 weeks ago Robert Stanica and I signed up for a presentation at CDOT, as all of the various teams here present weekly on new and exciting things that they have been working on. Seeing as we hadn’t presented yet and it was about 2 months into the summer, I figured it was a good idea to get something out there. Initially I figured we would present on what we had been working on thus far, mostly CSS work from Rob and various JavaScript bug fixes from me. Over the past year I’ve done similar presentations and wanted to go for something different this time, so I proposed to Rob that we get Popcorn Maker working on the iPad ( this was Friday June 15th when I came up with the idea ). Rob jumped at the idea and was as stoked as I was to get working on it. Sadly, because of how busy we have all been with Popcorn Maker and getting 0.5.2 out the door, we didn’t get a chance to work on this until Tuesday of this week, and even that was a half day at best. The real work began on Wednesday.

The morning of Wednesday June 20th we split up the work as best we could; it made sense that Rob would tackle the CSS issues and I would dive into the JavaScript. At this point Popcorn Maker actually didn’t load at all, and we were looking at an empty template, which is a pretty rough place to start. I had a good idea that because we were running on a mobile device we weren’t able to begin preloading the video on page load, so we never actually got any information about the video and in turn couldn’t load any of our UI ( our timeline is built and sized according to the duration of the video ). To get around this, Rob designed a snazzy-looking splash screen that would provide the user with a message and a start button. I created a button that, when pressed, would quickly play and pause the video ( triggering a load ), and the UI would then be built. The video loading has to be initiated by a user action, as this stops sites from eating up users’ bandwidth by playing media in the background. My next main task was to hook up some touch events, which I had no experience with.
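The start-button trick can be sketched as follows. `forceLoad` and `onReady` are names I’m using for illustration, not the actual Popcorn Maker code; the point is that the play/pause pair happens inside a user-initiated click, which is what mobile browsers require before they’ll fetch media:

```javascript
// Called from the splash screen's start button click handler.
function forceLoad( video, onReady ) {
  video.play();   // user-initiated, so the browser starts fetching
  video.pause();  // we only wanted the load, not playback
  video.addEventListener( "loadedmetadata", function() {
    // The duration is finally known, so the timeline can be sized
    // and the rest of the UI built.
    onReady( video.duration );
  }, false );
}
```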

Initially I looked into some JavaScript libraries to handle as much of the touch event work as possible, as I figured it would be quite messy. I found hammer.js and it looked pretty promising; I had access to dragging, swiping, gestures, and clicks. I began trying to use the library but found it to actually be more of a nuisance than a help. There were a few issues here: I couldn’t listen for touch events on the window with hammer, couldn’t destroy any created listeners, and found trying to debug with hammer to be an overall pain. If I had more time this might have been a different story, or if I could have asked someone who has used it a few questions, but we had less than a day at this point, so off I went to write my own stuff. This essentially entailed handling various listeners for `touchstart`, `touchmove`, `touchend`, `gesturechange`, and a few more. Basically touch start, move, and end mapped pretty closely to mouse down, move, and up. I had to implement some hacky workarounds for figuring out the mouse position from touch events and properly handle multiple touches if they existed. Midway through the day I tried to make our scroll/zoom bars work with touch events, but it turned out to be a giant wormhole and I abandoned it after a few hours of trying in order to get more important features in. I was also able to implement a pinching gesture for resizing a trackevent. I went for the pinch instead of using our trackevent handles because I figured it would make the app feel more native on the iPad. By about 7pm I had all our touch events working ( at least the ones we were going to showcase in the presentation; by no means were they all implemented ).
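The touch-to-mouse mapping can be sketched like this. This is a hedged reconstruction of the approach, not the actual Popcorn Maker code; the handler names are illustrative:

```javascript
// Touch events carry no clientX/clientY of their own, so the position
// has to be dug out of the first touch point.
function touchPosition( e ) {
  var t = ( e.touches && e.touches.length ) ? e.touches[ 0 ] : e.changedTouches[ 0 ];
  return { x: t.clientX, y: t.clientY };
}

// touchstart/touchmove/touchend map roughly onto mousedown/mousemove/mouseup.
function bindTouch( el, handlers ) {
  el.addEventListener( "touchstart", function( e ) {
    handlers.down( touchPosition( e ) );
  }, false );
  el.addEventListener( "touchmove", function( e ) {
    e.preventDefault(); // keep the page itself from scrolling mid-drag
    handlers.move( touchPosition( e ) );
  }, false );
  el.addEventListener( "touchend", function( e ) {
    handlers.up( touchPosition( e ) );
  }, false );
}
```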

One big blocker that we ran into was dragging and dropping new trackevents from the tray to the timeline. It’s funny, because we actually almost decided to go without the ability to create new ones for the presentation, but decided at the last minute to implement a hack that creates a new one on click. This meant that you didn’t have to drag and drop to create the event, but could just click on its title and it would create an event at the video’s currentTime with sane defaults. This wasn’t a future-proof solution, but it at least made the app somewhat feature complete.

At about 11pm we had saving, exporting, and login working, and rounded off all of the sharp edges that we created in the process ( it’s FULL of bugs ). We made a bookmark to the iPad’s home screen so it felt more like a native app and removed some of the browser’s UI, which was nice. In the end I am really proud of what Rob and I did in under 24 hours; we worked great as a team, as we each brought our own set of skills to the table and worked in parallel for most of the day and a half. I remember hearing a while ago that a good group/team member is someone you agree with about 70% of the time, and Rob seems to fall into that category. It’s great to have someone there to tell you “Man, that looks crappy” and be able to take what I’ve been working on and add some CSS polish to it. I think Rob and I make a great team and I’m looking forward to seeing what we can create.

Our presentation went pretty well, and I think everyone liked seeing Popcorn Maker run on mobile devices; it’s something that we’ve wanted for a while. After the presentation Rob and I did a bit more work and made trackevent resizing and dragging REALLY smooth by adding GPU acceleration ( Rob blogged about this ). We added GPU acceleration to track movement as well, so dragging a trackevent outside of the current viewing area doesn’t cause the screen to tear anymore.
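The usual GPU-acceleration trick on WebKit of that era was to move elements with a 3D transform instead of `left`/`top`, since a 3D transform promotes the element to its own compositing layer. A sketch of the idea (not Rob’s actual CSS, which his post covers):

```javascript
// Moving with translate3d keeps the drag on the compositor, so the
// screen doesn't tear when a trackevent leaves the visible area.
function moveTo( el, x, y ) {
  el.style.webkitTransform = "translate3d(" + x + "px, " + y + "px, 0)";
}
```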

Moving forward, Rob and I now have to go ahead and file some bugs about everything that we’ve noticed and get some feedback on ideas we have about how to restructure the UI. In the end I think this was a great experience and was a great break from the rush to get Popcorn Maker 0.5.1 and 0.5.2 out the door and ready for Mozilla’s story camp.

Beautifying using JSON.stringify and useCapture

This week I’ve been getting back into the swing of things at work after having a week off due to an injury. Once I got back I began working on two small bugs in Popcorn Maker, bug #519 and bug #619. Both of these bugs involved one- or two-line fixes, but were quite interesting in that I learned a lot about two JavaScript methods I use regularly: JSON.stringify and addEventListener.

Bug #519 was about making sure that trackevent handles ( used to resize a trackevent ) made the trackevent gain focus. Currently, clicking on a trackevent would give it focus, but clicking on the handles ( which are a child element of the trackevent ) would not. After digging into the code a bit, nothing jumped out at me as glaringly wrong, so I figured it was an event bubbling issue. I remembered that the third argument of addEventListener ( the one everyone forgets about ) had something to do with event bubbling, so I went and looked it up on MDN. The third argument, called useCapture, basically specifies whether you want to handle events as they normally bubble, from a child element up to its parent ( which means it would have a value of false, which I’m sure everyone is used to seeing ), or use what is called the capture phase and change the way an event gets triggered. By setting useCapture to true I noticed that events now fire on the parent element first and then on each of its children ( provided they have an event listener for the given event ). This seemed to do the trick: the trackevent handles were now recognizing a click event and everything was being handled as it should. The fix worked because the event must have been getting captured and thrown away somewhere before it bubbled up to the parent element. Doing what I did ensures the event gives priority to the parent first, firing there before it reaches the child that was actually clicked. It was pretty cool playing around with how useCapture works and the difference it makes on events that get fired. I don’t see setting useCapture to true as something I will use very often, as an event that bubbles up from a child is typically what you want, but knowing how to change the flow and use it accordingly is definitely a powerful tool to remember we have in our arsenal as coders.
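The shape of the fix looks roughly like this ( a sketch, not the actual Popcorn Maker code; the function name is made up ):

```javascript
// With useCapture set to true, this listener runs during the capture
// phase — on the way down from the document to the target — so it fires
// even when the actual click target is a child (like a resize handle),
// and before anything lower down can swallow the event.
function focusOnCapture( el ) {
  el.addEventListener( "click", function() {
    el.focus();
  }, true ); // the third argument everyone forgets about
}
```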

The other neat thing I learned this week was that I could use JSON.stringify to beautify JSON. Up until now I have only ever used JSON.stringify to convert a JavaScript object into a string and never even bothered looking up what other arguments it took. For bug #619 I wrote a recursive JSON beautifying function to do what I needed and put it up for review. Bobby Richter then asked me why I didn’t use something built in to do this already, like JSON.stringify. I had no idea what he was talking about, so I looked it up and was pretty amazed at what I found. I learned that JSON.stringify actually takes 3 arguments: the first being the source object, the second a replacer function which lets you customize how various values are handled, and the third a spacer which lets you specify how the JSON is printed out. My fix went from a 25-line function to a 1-line fix once I found this out, which looked like the following:

var beautifiedObject = JSON.stringify( obj, null, 2 );

And that was it: I had beautified JSON in 24 fewer lines of code. This is one of those things where you have to do it the hard way before you can learn the easy way and how awesome it is. Unless someone had told me, I probably would have gone on writing my own JSON beautifier functions until the end of time, so thanks for sharing that knowledge, Bobby!
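For anyone curious about that second argument, here’s a quick made-up example of a replacer function combined with the 2-space indent ( the object and keys are purely illustrative ):

```javascript
var obj = { name: "popcorn", _internal: "hidden", version: 1 };

// The replacer is called for every key/value pair; returning undefined
// drops that key from the output. Here it strips "private" keys that
// start with an underscore.
var pretty = JSON.stringify( obj, function( key, value ) {
  return key.charAt( 0 ) === "_" ? undefined : value;
}, 2 );
// pretty is indented JSON containing "name" and "version" but no "_internal"
```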

This week in general has been great for getting back into the swing of things and refreshing myself with Popcorn Maker; it’s crazy how fast the project is moving nowadays. You don’t realize the crazy amount of bug mail you get until you take a week off and let it pile up. With a project moving as fast as Popcorn Maker, it’s also hard to stay up to date after taking that much time off; the code is changing so much every day. I’m looking forward to next week and getting ready to release 0.5 of Popcorn Maker!

Teaching is hard

This week was the start of a new summer semester at CDOT ( Centre for Development of Open Technology ) and each of the various projects had new hires. Robert Stanica, a fifth-semester student in the BSD program at Seneca, joined our team and will be primarily focusing on getting Butter to work on mobile devices. This means phones and tablets running various OSes, such as Android, iOS, and even Mozilla’s new Boot to Gecko. Having someone focused on mobile development will be great, as we’ve definitely been lacking on the mobile front. Robert, however, is pretty new to JavaScript, so he’s been spending the past few days reading and researching how it works. He’s been going through “JavaScript: The Good Parts” by Douglas Crockford as well as doing Codecademy. Even though these are great resources for learning JavaScript, Robert inevitably still had questions, so I’ve been doing my best to answer them.

Now I’m no teacher, not by a long shot. I tend to stumble over my words when trying to explain things; it ends up coming out as a jumbled mess that I have to piece together again after. Despite this I’ve been doing my best to explain concepts to Rob such as the difference between == and ===, how scoping works in JavaScript, objects, and closures. One thing that I’ve noticed doing this over the past few days is how hard it can be to concisely explain something, and it has really got me thinking about how underappreciated a good teacher can be. One of my bigger pitfalls when explaining concepts, as I mentioned before, is how I tend to stumble through an explanation. I jump back and forth between points, making it difficult to understand what I’m trying to say. This is also evident in the way I give presentations. I tend to ad lib a bit, and whenever I do I tend to ramble on. I’m trying to make a conscious effort to slow down when explaining things and take a brief pause if needed to collect my thoughts. It’s difficult to do ( for me at least ) because I always attribute pauses of any sort to someone not knowing the content, when in reality I probably look worse trying to rush.
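Two of those concepts, as the kind of quick examples I found myself reaching for ( my illustrations, not the book’s ):

```javascript
// == coerces types before comparing; === does not.
var loose = ( "1" == 1 );   // true: the string is coerced to a number
var strict = ( "1" === 1 ); // false: different types, no coercion

// A closure: the inner function keeps access to `count` even after
// makeCounter has returned.
function makeCounter() {
  var count = 0;
  return function() {
    return ++count;
  };
}
var next = makeCounter();
// next() returns 1, then 2, then 3...
```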

I think the funniest moment over the past few days has been when Rob wanted me to explain something from the book he was reading. There was an example about closures that had the following piece of code ( might not be exact but close enough ):

var i = 20;

console.log( i.toString( 16 ) );

At first Rob asked me what arguments toString took. Without even thinking, I’m pretty sure I blurted out “none” and didn’t think anything of it. “But it’s in the book, look.” I didn’t believe him at first, so I had to see for myself. I ended up looking at it dumbfounded for a while, trying to figure out what was going on. My response was “Well, let’s see what it does”. I don’t know if I looked dumb for not knowing that toString took an argument or not, but regardless, this was an opportunity for both of us to learn something together. I busted open my console and began playing with it, and we soon realized that if a number was passed into toString it would convert the value to that base. This meant passing in 16 for our value of 20 returned 14, and passing in 2 gave us 10100. It just goes to show that there is always an opportunity to learn something new, and I think teaching is a perfect example of this.

Being forced to explain your ideas in more depth than normal can be really difficult and truly is an art. I did my best over the past week to help Rob, and I hope some of what I was saying was useful. Teaching someone is really hard, and we should appreciate the teachers we’ve had who were good at explaining concepts and helping us ( everything always seems easy until you try to do it ). This was a great experience for me; I learned a lot about how to communicate more effectively and realized I’ve got some brushing up to do myself. Even though I use most of these concepts daily, it was hard for me to explain things at times, and teaching really forces you to know your content inside and out, as you will always get questions that you weren’t expecting. Being able to teach is an art that we all tend to take for granted and, until you try to do it, never fully appreciate how hard, draining, and complicated it can be.

Hot Hacks!

Over the past weekend the CDOT crew and I were invited to participate in Hot Hacks, a two-day hack-a-thon centered around creating a web-native portion to existing documentaries. This meant the Popcorn team would be paired up with various documentarians, and we would help them create a working demo over the two days that could later be iterated on. The 6 documentaries were as follows:

The Message: the (r)evolutionary power of climate change
“The Message is a multi-platform (book + documentary + web + events) project by Naomi Klein and Avi Lewis.”

Turcot
“Turcot looks at Montreal’s largest highway Interchange, currently scheduled for a complete demolition and rebuild. The interactivity of the documentary project will allow the residents a direct voice, so that their concerns and opinions are heard by others. The voices will build on each other to develop alternative designs and solutions while there is still time to influence the actual Interchange construction.”

Following Wise Men (working title)
“The film will tell an evergreen story about a 50th anniversary reunion road trip with four friends who are astronomers; the interactive project will be an evergreen, searchable, expandable, community-sourced science web site charting astronomers and their discoveries in the context of their professors, mentors and students in an astronomer’s family tree. “

Looking at Los Sures
“We will use an archival documentary (Los Sures by Diego Echeverria, 1984) about the South Williamsburg neighborhood as the inspiration as well as the primary online navigation for a set of new documentary projects that approach the same place and themes, now nearly thirty years later. New short interactive projects created by thirty different artists over three years will annotate and expand on the original film in ways previously unexplored.”

The Last Hijack
“For over 20 years Somalis have faced the horror of famine and war. The Last Hijack is a story about survival in this failed state. It is about the rise of piracy and how it affects the people around it.”

Immigrant Nation

Kate Hudson and I were paired up with Katie McKenna and began by understanding what the project was about and what Katie wanted to accomplish. The nice thing about this experience was that Brett and Ben had all of the teams come up with a concise idea before coming to Hot Hacks, so we had a good idea to begin with and rolled with it. What Katie envisioned was a way to showcase the different perspectives of various speakers regarding our environment. To do this, we wanted to use a parallax effect and separate the project into different sections that the user could scroll through, experiencing each perspective independently with its own unique theme. Another issue that we had to address early on was the fact that Popcorn has a tendency to throw a lot of information at you very abruptly. To combat this we used various transitions and effects to blur content that wasn’t the primary focus at the moment and ease it in when it became relevant. In doing so we created an experience that allows the user to focus on what the speaker is saying but also annotates important information when it makes sense. Another unique aspect of our project was that we utilized scrolling to drive the Popcorn experience throughout the demo. This meant the user could freely explore the page, jump between perspectives seamlessly, and navigate as they saw fit without feeling like we were spoon-feeding them a linear experience.
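The scroll-driven idea reduces to mapping scroll position onto the video’s currentTime. A sketch under assumed names ( the page layout and the Popcorn wiring in the comment are illustrative, not our actual demo code ):

```javascript
// Map how far down the page the user has scrolled onto a time in the
// video, clamped to [0, duration].
function scrollToTime( scrollTop, scrollableHeight, duration ) {
  var fraction = Math.max( 0, Math.min( 1, scrollTop / scrollableHeight ) );
  return fraction * duration;
}

// e.g. on each scroll event:
//   popcorn.currentTime( scrollToTime( window.pageYOffset,
//     document.body.scrollHeight - window.innerHeight,
//     popcorn.duration() ) );
```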

The experience in general was great. Katie, Kate, and I gelled quite well in my opinion and got on the same page pretty fast; Kate and I really got into the groove of things on day two. We were able to collaboratively work off of a single GitHub repo and merge with one another pretty painlessly. In typical hack-a-thon fashion, a lot of ideas were tried, didn’t necessarily work, and were thrown away, which was one of our group’s strengths in my opinion. Since we weren’t afraid to try things, we were able to create a ton of small prototypes and see what worked and what didn’t very fast. Some of the prototypes that we created and threw away were things like pulling in an RSS feed of relevant information and displaying it across the background of a page, a map of the world that lit up various regions as you scrolled through the page, and many more. It was awesome that Katie was on board with all of this as well, as I’m sure it had to seem quite hectic and scary at times!

All in all the weekend was great and the demo was a success. We created what we set out to do for the weekend and have a cool prototype for a web-native version of The Message. I think events like this are an awesome idea, as they bridge the gap between two very different professions. Just like Mozilla Journalism is bridging the gap between journalists and programmers, the Living Docs project and Popcorn are doing the same for filmmakers and programmers, and it’s great to be a part of.

Reflecting on OSD700

For the last 4 months I have been working on various Firefox bugs. A few of them have landed, I’m still working on others, and some were too hard to finish, so it’s been an interesting ride. The tickets that I’ve worked on have been the following:

  • Bug 702161 – videocontrols.xml has anonymous function event listeners that are added but never removed
  • Bug 680321 – Media preload state should reset in resource selection algorithm
  • Bug 708814 – Should fade out videocontrols even if there’s no mouse movement
  • Bug 686370 – Implement video playback statistics proposed in WHATWG
  • Bug 722788 – JSON number parsing is slower than it needs to be
  • Boot to Gecko working on my phone

Overall I’m happy with what I was able to accomplish so far, and I’m still working towards finishing the video statistics ticket. Even though my tickets haven’t been very high profile yet, it’s still a great feeling being able to get some code into Firefox, and it really changes your perspective on how much hard work goes into making a browser. I was working on tickets that involved things such as cleaning up event listeners, making sure controls for video elements got hidden in fullscreen mode, and making sure the preload state reset properly, so I can only imagine how much work goes into other aspects such as security, the JavaScript engine, and so on. It really gave me a new respect for everyone doing this kind of work, and I felt it was a great experience. It’s always funny when you hear someone say something along the lines of “Why doesn’t browser X just fix problem Y, it’s so easy”, because once you get into the code it’s never easy, and it’s never something that you can just write and then ship the next day; there is a ton of behind-the-scenes work that goes into making it a reality. Things like testing always go unnoticed by end users and are a really underappreciated portion of the work that goes into development. I know from attempting to write my own tests on various projects that trying to prove that your fix is correct can sometimes be the hardest part of a patch. In addition to this, I also continue to be amazed by the community and how helpful they can be.

Since I began working on open source projects over a year ago in OSD600, one of the biggest barriers to entry for me has been IRC. It always sounds simple to just go into IRC and ask for help, but a lot of the time it can be a pretty nerve-wracking experience. It can be scary to throw yourself into the middle of a chatroom full of sometimes hundreds of developers better than you and ask what is probably a trivial question. When I first began doing this I ended up being afraid almost every time, thinking I would be labeled an idiot for asking such an easy question and that from that point on no one would take me seriously. In reality my experience has been quite the opposite, in that everyone is friendly and willing to help out a newcomer. I don’t know why this is, but I attribute it to the pay-it-forward mentality: someone helped you in the past, so you are likely to do the same for a newcomer who comes to the channel looking for help. I’ve actually taken this to heart and try to do it when I can. Whether it be in Popcorn.js/Butter.js or Firefox bugs, I try to help newcomers whenever I can, because I know I was once in the exact same position. I’m sure it’s just as hard for them to come into a channel and ask a question as it was for me, and I want to do whatever I can to make that experience more pleasant.

I have learned quite a bit this semester, from being more humble and stepping aside from my ego, to gaining confidence in both my coding and public speaking. It’s taught me some interesting lessons when it comes to coding. Until this semester I had always tried to rush through things, brute-forcing my way through code in a fast trial-and-error approach. I would figure out what the problem was and then begin frantically removing, altering, and adding code as I went along. Most of the time, when I figured out what was wrong, I was left with a warpath of broken and ugly code in my wake, something I’m sure my reviewers in Popcorn.js/Butter.js can attest to, as I’m notorious for leaving things like console.log( “ASDASDASDASDASDFAS” ); in my code. I learned this lesson the hard way in one of the more recent bugs I was working on, 708814, which needed to be fixed up as it had bit-rotted for almost a month. I took a quick glance over my code and from what I could see it looked fine; I mean, I also got an r+ on it, so it can’t be wrong, right? I went on what was a 20+ hour journey in the wrong direction, looking for a bug that didn’t exist. It was a stressful experience to say the least, one where you just want to grab your computer, shake it violently, and yell “WHY AREN’T YOU WORKING”. I asked for help in channel and no one else was able to see a bug either. At this point I was spinning in circles in my chair, looking at the same 8 lines of code over and over again, not understanding what could be wrong. Sort of jokingly, I switched my listener from “mozfullscreenchange” to “mouseover” and changed the if condition to check whether the video was fullscreen or not. This was basically the same approach as before, just in a different fashion. To my surprise it worked: it looked like a fullscreenchange event was being fired before a mouseover event, and this was why it wasn’t working. In the end it boiled down to me having zero patience and rushing into the code.
I didn’t even take a second look at my own code; I began smashing my face into other code and assumed the problem was there. If I had taken 10 minutes to go over my code and think about it again, I probably wouldn’t have done this. The only real reason I figured this problem out was because I am stubborn and didn’t give up, but if I had learned to be patient beforehand I probably could have saved a day’s worth of work. In order to work on a codebase as large as Firefox you are going to need to persevere and be stubborn; the code is going to beat you down over and over again and you need to show it who’s boss. I’ve been pretty good at not giving up this semester, but I didn’t really learn how important it can be to take a step back and be patient with my approach until I saw how far I went in the wrong direction with 708814 ( and it was really far ).

In general the semester was great. I got to write in a language that I am far from comfortable with, engage with an unfamiliar community, and land fixes in Firefox. I continually tried to challenge myself with the bugs I undertook and I learned a lot because of it. I’m working on getting the video statistics ticket up for review by Thursday and then working on review failures from then on. I think being able to say “I wrote feature X for Firefox” is pretty awesome and is definitely another notch on my programming belt, so that’s what I’m striving for.

I had a great semester with everyone from class and I can’t thank you all enough for the great feedback during class presentations and the help on IRC. It was also great to witness everyone else step it up a notch and see what they got done; it definitely motivated me throughout the course to keep going and not give up. I wish everyone who graduated this semester ( I think everyone except me haha ) the best of luck with whatever you have planned for post graduation, and hopefully we can all stay in touch via the Seneca IRC channel. To those who I’ll be working with over the summer at CDOT and Mozilla, I can’t wait to see what we make, it’s gonna be awesome!

Firefox Bug 708814

Over the last few days I have been working on attempting to finish Firefox Bug 708814. The bug is that when the video goes fullscreen the controls are not hidden when there is no mouse movement. I had a patch to fix this just over a month ago but I let it sit for too long before coming back to it and it no longer works.  Currently I’m trying to figure out why it is failing, which is proving to be quite difficult.

One of the new issues that popped up was that Utils was not defined in the following block of code. I briefly asked Jaws about it but he didn’t seem to know why this was happening either, so I am still investigating. I also tried using |this| instead of Utils as they seemed to be the same thing, but I think the context of |this| was being misconstrued as well, seeing as it was coming from inside a setTimeout. After compiling and testing this again it wasn’t complaining about Utils not existing from my listener, but rather from when it was called inside onMouseMove. I did the same thing as before and wrapped the function reference in an anonymous function and called |self._hideControlsFn( self )| ( obviously gross code, but for testing purposes it was fine ). Finally I was seeing some progress! At this point the controls were hiding after a mousemove over the video, but not if there wasn’t one. This was good because I was now back to a state where everything worked the way it did before my patch was applied, so I knew I was headed in the right direction.
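The context problem described above can be sketched like this. Utils and _hideControlsFn are the names from the post; everything else ( the flag, the property values ) is made up for illustration:

```javascript
var Utils = {
  controlsHidden: false,
  _hideControlsFn: function (self) {
    // Take the object as an explicit argument instead of relying on |this|,
    // which is lost when a bare method reference is handed to setTimeout.
    self.controlsHidden = true;
  }
};

// setTimeout(Utils._hideControlsFn, 2000) would invoke the function with
// the timer's |this|, not Utils. Capturing a reference and wrapping the
// call in an anonymous function sidesteps that:
var self = Utils;
var hideLater = function () { self._hideControlsFn(self); };

hideLater(); // works the same whether called directly or via setTimeout
console.log(Utils.controlsHidden); // true
```

It's the same shape as the |self._hideControlsFn( self )| workaround mentioned above: gross, but it makes the broken binding obvious.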

The only thing I figured could be the cause here was if for some reason the scrubber was being dragged, as that was the only check being done inside of _hideControlsFn, so I made sure that we were getting in there, which it turns out wasn’t the issue. Back to the drawing board. I decided to look deeper into what was going on, as the issue obviously wasn’t in _hideControlsFn, so I went looking in the startFade function itself. After reading through startFade a bit it was pretty obvious which block of code I needed to monitor. I threw some console.logs in there to check what the values were and whether anything weird was going on, and it turns out everything still looked pretty normal. The only difference I could spot was that whenever the controls successfully hid, execution would enter the last block inside of the else. I did some more logging before that block executed and got some interesting values. It turns out that the controls were not the only element calling startFade; an element called clickToPlay ( maybe the start button? ) was as well, so I decided to allow either clickToPlay or controls to be the element for the condition to pass, but it didn’t seem to make a difference. Looked like there was no option, I had to go deeper.

After playing around with some stuff for about an hour, I realized that I had missed something super obvious from the get-go: the listener for fullscreen had a check in it for hover, which always came up false initially, which really wasn’t what we wanted. I made the listener listen for mouseover instead and checked whether the video was in fullscreen, and what do you know, it worked! After looking at my fix a bit more, this made sense ( I think ). I’m happy that I finally figured this bug out, as it was a super subtle and tricky one; I got led off track quite a bit and went down a ton of dead ends. As we speak I’m creating a new patch and making sure that it works fine, so fingers crossed!
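A toy model of the ordering problem, assuming nothing about the real videocontrols code, just the shape of the fix:

```javascript
// None of this is real Firefox code; Video and its flags are invented
// purely to show why checking state at mouseover time beats checking
// hover at fullscreenchange time.
function Video() {
  this.isFullscreen = false;
  this.hovered = false;
  this.fadeTimerArmed = false;
}

// Old approach: on fullscreenchange, only arm the hide timer if hovered.
// Because fullscreenchange fires before mouseover, hovered is still
// false at that point and the controls never get scheduled to fade.
Video.prototype.onFullscreenChange = function () {
  if (this.hovered) this.fadeTimerArmed = true;
};

// New approach: on mouseover, check whether we're fullscreen right now.
// The ordering of the two events no longer matters.
Video.prototype.onMouseOver = function () {
  this.hovered = true;
  if (this.isFullscreen) this.fadeTimerArmed = true;
};

var v = new Video();
v.isFullscreen = true;  // the fullscreenchange event fires first...
v.onFullscreenChange(); // ...hovered is still false, nothing happens
v.onMouseOver();        // ...mouseover fires second and arms the fade
console.log(v.fadeTimerArmed); // true
```

The design point is small but general: reacting to one event while reading state set by another makes you dependent on event ordering; reading the state directly when you act does not.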

OSD700 0.9

For my 0.9 release I worked on a few different things, such as revisiting an old patch, more work on the video statistics bug, and even fooling around with Boot to Gecko. It was an interesting 2 weeks as I experienced various ups and downs across the different things I was working on, particularly Boot to Gecko ( B2G ). Initially I wanted to clean up the old code for 708814 and finally get it landed, though it wasn’t as clear-cut as it sounded.

Over the course of the semester I have been working on and off on bug 708814, and I even posted a patch a few weeks back. I got a “feedback looks good” from Jaws and thought I was essentially done, so I didn’t worry about it much. I wanted to finally finish this ticket and get it landed so I could get it off my back. I began by attempting to write some unit tests, but after about 2 days of driving myself nuts I realized, after Chris Pearce mentioned it on IRC, that there really wasn’t a good way to test the feature I added. Awesome, I thought I was done at this point, so I put the patch back up for a formal review and thought I had washed my hands of this ticket, which sadly wasn’t the case. Before I did this, Chris had asked me to rebase my code off of master, as it had been sitting for a while, and then put it up for review. I did this and threw it up, at which point Chris began playing with it. It turns out that a few edge cases had turned up since I initially wrote the patch and the fix was no longer sound, which sucked pretty hard. Through my neglect the code sat long enough that Mozilla Central progressed to the point where my patch no longer fixed what it once did, and I needed to go back to the drawing board. It sucks because through my laziness I actually created more work for myself in the long run; I should have just followed through with this patch from the get-go, as this probably wouldn’t have happened. I am making this my first priority for my 1.0 release and want to get it done ASAP, even before I begin playing with the video statistics again.

The first ticket I was assigned this year was 686370, which was to finish implementing the video playback statistics. I wrote about some of my earlier experiences with this and have been building on that work since. Mathew Schranz has also been helping me on this and has been writing about his experiences implementing the playback-jitter portion of the stats. There is still quite a bit left to do here, and since this is the main feature I am planning to finish by the end of the semester, I am going to have my work cut out for me over the next few weeks.
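As a rough idea of what a page script might do with playback statistics once a feature like this lands. The moz-prefixed property names here are my assumption, not confirmed API, so the video object is faked with plain numbers and the arithmetic is the point:

```javascript
// Hypothetical consumer of per-video frame counters. A frame that was
// decoded but never painted was effectively dropped, so the dropped-frame
// percentage falls out of two counters.
function droppedFramePercent(video) {
  var dropped = video.mozDecodedFrames - video.mozPaintedFrames;
  return (dropped / video.mozDecodedFrames) * 100;
}

// Stand-in for a real <video> element exposing the counters:
var fakeVideo = { mozDecodedFrames: 500, mozPaintedFrames: 480 };
console.log(droppedFramePercent(fakeVideo)); // 4
```

A stat like this is exactly what sites would use to detect a struggling client and, say, switch to a lower-quality stream.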

The last but probably most interesting thing I have been working on is attempting to get Boot to Gecko working on my phone, the Samsung Galaxy S II. Currently my phone is running Android, and I thought it would be interesting to play with the new mobile OS that Mozilla has been developing. Boot to Gecko ( or B2G for short ) is built on top of core Android libraries that control things such as the camera, wifi, and various other essential parts of the phone experience. On top of that, Mozilla has made it so the web is the OS, meaning that everything is written in HTML, CSS, and JavaScript. This was really appealing to me as I work with the web on a daily basis, so being able to create apps without having to play with Java or Objective-C sounded great. The only problem was that my version of the phone wasn’t currently supported, so I knew I was in for quite a bit of work. I wrote about some earlier experiences I had working with B2G, but I never actually got it working, which sort of sucks. The farthest I got into the build was flashing the kernel and having it fail at 98%, which really sucks. I did some more research and analyzed the logs a bit over the weekend and found that when I was creating the ROM it was actually failing in a few spots that I never noticed. I am going to research these issues in the future, but probably won’t be able to revisit my B2G work until after I have finished 708814 and 686370.

The next two weeks are going to be pretty busy and I am going to have to work like crazy in order to get these patches ready for the end of the semester, so it should be pretty interesting.

Installing Boot to Gecko – Part 2

Earlier this week I wrote about my initial experience with attempting to install Boot to Gecko on my Samsung Galaxy S II. Previously I went through setting up the OS, installing the required dependencies, and making a backup of my current installation in case something went wrong ( which is pretty likely in my case ). The installation for the most part went pretty smoothly until I hit a problem when I actually began building.

The problem that I initially ran into was build failures due to the fact that my specific model/version isn’t supported yet. The solution was to modify one of the build scripts to reflect my device, which wasn’t all that bad. What I had to do was add a case for my specific version and account for file names that may be different on my version compared to what is already used. In a lot of cases it just turned out to be modifying if statements or appending an ‘M’ to some file names ( as that is my specific version – the GT-I9100M, and the build – GINGERBREAD.UGKG2 ). After I had done this my |make config-galaxy-s2| was passing, which was awesome.

Now it was time to begin building gonk, which from what I understand is the backend of Boot to Gecko. I built that and it actually went pretty smoothly. I then simply ran |make| after that, and this is when I began seeing some errors. On a side note, it looks just like building Firefox from source, which is pretty cool. The errors were pretty cryptic and about stuff that I didn’t understand, so I did what I normally do under these circumstances and went to ask someone smarter than me. I went into the IRC channel and started asking about the error, and I immediately got some responses. I was told to run |make sync| and to pastebin the output. What make sync does is update all of the git submodules and make sure everything is up to date with master. I ran it and the output looked clean and fine to me, so I pastebinned it to the channel. Someone quickly spotted where I went wrong. In my logs it said that it was updating from my remote, dseif/b2g ( which looked ok to me ), but that was apparently wrong. The problem was that I forked and then cloned instead of just cloning. The reason I shouldn’t have done this is that it apparently screws up the submodule updating, and I wasn’t getting the most up-to-date files. The errors I was seeing were old issues that had already been fixed, and what I needed to do at this point was set my origin’s remote url to the url I forked from. After I did this I re-ran make sync and I started getting updates! I went through the commands that I ran previously and they seemed to be passing now 🙂 The make command took a while to finish, but it did so without any errors.

Now it was time to begin flashing! To do so all I had to do was run |make flash|.  This ended up taking a while and once it was done it rebooted my phone. The moment of truth came and it didn’t boot at all, which scared me a bit 😦 I went back and reflashed my device by running the following:

adb reboot recovery

and then selected the backup that I made earlier and reflashed my phone. Luckily the backup worked just fine and my phone was back in a working state 🙂

Tonight I plan on trying to figure out why my flash was failing and I will hopefully have a running boot to gecko phone by the end of the night 😀


Installing Boot to Gecko – Part 1

For the past month or so Stefan Reese and I have been talking about various things mobile, from what’s new and cool on Android to the various devices coming to market. Stefan is basically my source for everything mobile and keeps a fresh stream of info coming my way. When I heard about Boot to Gecko I obviously had to show it to Stefan, and he seemed pretty stoked about it. Once Mozilla demoed B2G at the Mobile World Congress 2012 and I saw what it was really about, I wanted it pretty bad. I’ve been playing with the idea of installing it on my device ( luckily I have a Samsung Galaxy S II, which seems to be what most of the demos are on ), and when Stefan mentioned he was interested in trying it out as well, I figured I may as well do it.

First things first: it seems to be recommended that you do the build on Linux ( the latest version of Ubuntu seems to be what most guides recommend ), so that’s what I did. Since I was already running Windows on my desktop I used Ubuntu’s Windows installer, which allows you to install Ubuntu right alongside an existing Windows install ( no reformatting/partitioning needed! ). Once I installed Ubuntu I installed the usual things that I would on a fresh install ( flash, vim, git, got all my music up and running, set up thunderbird, etc ) and began following the guides that I found online. One of the great things about the Mozilla community is how well they document things; I was able to follow the steps for the most part with no hiccups. I followed these guides and they seemed to do the trick, so I also recommend them! Just an FYI, I accept no responsibility if you brick your phone by following my steps 😉

Setting up my environment so it was ready to install B2G was pretty simple. It involved getting adb up and running ( which I had prior experience with from some Android development I’ve done ). You can do this by installing the Android SDK ( once installed you may need to run android-sdk-linux/tools/android and update the SDK ). Once that is done, you will need to add adb ( inside android-sdk-linux/platform-tools/ ) to your path ( done in your ~/.bashrc ). You can test this easily by going to your terminal and typing adb; you should see something like the following:

Now I was ready to back up my phone, which required installing heimdall, a cross-platform tool used to flash firmware onto Samsung Galaxy S devices. After installing heimdall you are going to need to make sure it has access to your phone ( which is done via udev ). I did this by editing /etc/udev/rules.d/android.rules ( if it doesn’t exist, create it ) and adding the following line:

SUBSYSTEM=="usb", ATTRS{idVendor}=="04e8", MODE="0666"
After doing this I made sure the file was readable and I was ready to move onto the next step, which was installing a whole slew of build dependencies.
To install the build dependencies I essentially copied some commands that were listed on the wiki, and it seemed to work out just fine for me. The commands were as follows:
sudo apt-get build-dep firefox
sudo apt-get install git mercurial libasound2-dev libcurl4-openssl-dev libnotify-dev libxt-dev libiw-dev mesa-common-dev autoconf2.13
sudo apt-get install ia32-libs gcc-multilib g++-multilib bison flex gperf lib32z-dev lib32ncurses5-dev lib32ncursesw5-dev libidl-dev lib32gomp1 autoconf2.13 ccache libx11-dev lib32readline-gplv2-dev
sudo apt-get install default-jdk
Now it’s time to back up our phone and make sure if shit hits the fan that we have something to fall back on. Make sure your phone is connected to your computer via USB. To do this we are going to use heimdall ( which we installed earlier ).  First what I did was cd into my tmp directory. I then ran the following commands:
wget cmw.22aaf3.com/c1/recovery/recovery-clockwork-4.0.1.4-galaxys2.tar
tar xvf recovery-clockwork-*tar
Next I ran:
adb reboot download
Which puts the device into download mode.  You should know your phone is in download mode if your screen is now showing a green android character and telling you not to disconnect your device because it’s in download mode haha.  Now we are going to flash the phone by running the following:
heimdall flash --kernel zImage
And then boot our phone back into recovery mode:
adb reboot recovery
Follow the on screen directions to create a back up and so on ( use your volume buttons and home button to navigate ). Once you’ve done this back up your phones filesystem onto your computer by running the following:
mkdir sgs2-android
cd sgs2-android
adb pull /system system
adb pull /vendor vendor
My console looked like the following during the pull command:
and looked like the following after the vendor command:
Alright, finally we can clone our git repo! I forked mine from https://github.com/andreasgal/B2G ( which from what I can tell is the main one ) and then cloned into a local repository called B2G.  Once you’ve done this run the following commands ( NOT as root, it was strongly advised not to do so on the guide I used ):
cd B2G ( or wherever you have it as )
make sync ( from what I can tell it updates a ton of submodules that are in the project. Depending on how fast your connection is this may take a while, as it did for me. ).
export ANDROIDFS_DIR=sgs2-android ( should be exported to the same PATH that you updated earlier )
make config-galaxy-s2
Once I ran make config-galaxy-s2 I got some build errors mentioning that it couldn’t find a directory. This was the extent of my night yesterday, and I am going to have to do some more googling or speak to someone in #b2g on irc.mozilla.org. So far this is the only real problem that I have encountered, which is pretty good I guess. Since B2G is still so young it’s expected that I would run into a few issues; who knows, maybe I’ll even get to file a bug or two. Expect another post tonight with hopefully a running version of B2G!