
OSD700 0.9

For my 0.9 release I worked on a few different things: revisiting an old patch, more work on the video statistics bug, and even fooling around with Boot to Gecko ( B2G ). It was an interesting two weeks, with various ups and downs across the things I was working on, particularly B2G. Initially I wanted to clean up the old code for 708814 and finally get it landed, though it wasn’t as clear cut as it sounded.

Over the course of the semester I have been working on and off on bug 708814, and I even posted a patch a few weeks back. I got a “feedback looks good” from Jaws and thought I was essentially done, so I didn’t worry about it much.  I wanted to finally finish this ticket and get it landed so I could get it off my back.  I began by attempting to write some unit tests, but after about 2 days of driving myself nuts I realized that there really wasn’t a good way to test the feature I added, which Chris Pearce confirmed on IRC. Awesome, I thought I was done at this point, so I planned to put the patch back up for a formal review and wipe my hands of this ticket, which sadly wasn’t the case.  Before the review Chris asked me to rebase my code off of master, as it had been sitting for a while, and then put it up. I did this and threw it up, at which point Chris began playing with it.  It turns out that a few edge cases had turned up since I initially wrote the patch and the fix was no longer sound, which sucked pretty hard. Through my neglect the code sat long enough that mozilla-central progressed to the point that my patch no longer fixed what it once did, and I needed to go back to the drawing board. It sucks because through my laziness I actually created more work for myself in the long run; I should have just followed through with this patch from the get go, and this probably wouldn’t have happened. I am making this my first priority for my 1.0 release and want to get it done ASAP, even before I begin playing with the video statistics again.

The first ticket I was assigned this year was 686370, which is to finish implementing the video playback statistics.  I wrote about some of my earlier experiences with this and have been building on that work since. Mathew Schranz has also been helping me with this and has been writing about his experiences implementing the playback jitter portion of the stats.  There is still quite a bit left to do here, and since this is the main feature I am planning to finish by the end of the semester, I am going to have my work cut out for me over the next few weeks.

The last but probably most interesting thing I have been working on is attempting to get Boot to Gecko working on my phone, the Samsung Galaxy S II. Currently my phone is running Android, and I thought it would be interesting to play with the new mobile OS that Mozilla has been developing. Boot to Gecko ( or B2G for short ) is built on top of core Android libraries that control things such as the camera, wifi, and various other essential parts of the phone experience. On top of that, Mozilla has made it so the web is the OS, meaning that everything is written in HTML, CSS, and JavaScript. This was really appealing to me as I work with the web on a daily basis, so being able to create apps without having to play with Java or Objective-C sounded great. The only problem was that my version of the phone isn’t currently supported, so I knew I was in for quite a bit of work. I wrote about some earlier experiences I had working with B2G, but I never actually got it working, which sort of sucks. The farthest I got into the build was flashing the kernel and having it fail at 98%, which really sucks. I did some more research and analyzed the logs a bit over the weekend, and found that when I was creating the ROM it was actually failing in a few spots that I never noticed. I am going to dig into these issues in the future, but probably won’t be able to revisit my B2G work until after I have finished 708814 and 686370.

The next two weeks are going to be pretty busy and I am going to have to work like crazy in order to get these patches ready for the end of the semester, so it should be pretty interesting.


Installing Boot to Gecko – Part 2

Earlier this week I wrote about my initial experience with attempting to install Boot to Gecko on my Samsung Galaxy S II.  Previously I went through setting up the OS, installing the required dependencies, and making a backup of my current installation in case something went wrong ( which is pretty likely in my case ).  The installation went pretty smoothly for the most part, until I hit a problem when I actually began building.

The problem I initially ran into was a build failure due to the fact that my specific model/version isn’t supported yet.  The solution was to modify one of the build scripts to reflect my device, which wasn’t all that bad.  What I had to do was add a case for my specific version and account for file names that may be different on my version compared to what is already used.  In a lot of cases it just turned out to be modifying if statements or appending an ‘M’ to some file names ( as that is my specific version – the GT-I9100M, on build GINGERBREAD.UGKG2 ). After I had done this my make config-galaxy-s2 was passing, which was awesome.

Now it was time to begin building gonk, which from what I understand is the backend of Boot to Gecko.  I built that and it actually went pretty smoothly.  I then simply ran |make|, and this is when I began seeing some errors. On a side note, it looks just like building Firefox from source, which is pretty cool. The errors were pretty cryptic and about stuff I didn’t understand, so I did what I normally do under these circumstances and went to ask someone smarter than me. I went into the IRC channel and started asking about the error, and I immediately got some responses. I was told to run |make sync| and pastebin the output. What make sync does is update all of the git submodules and make sure everything is up to date with master.  I ran it and the output looked clean and fine to me, so I pastebinned it to the channel. Someone quickly spotted where I went wrong: my logs said it was updating from my remote, dseif/b2g ( which looked ok to me ), but that was the problem.  I had forked and then cloned my fork instead of just cloning the upstream repository.  Apparently this screws up the submodule updating, so I wasn’t getting the most up to date files; the errors I was seeing were old issues that had already been fixed.  What I needed to do at this point was set my origin’s remote url to the url I forked from.  After I did this I re-ran make sync and I started getting updates! I went through the commands that I ran previously and they seemed to be passing now 🙂 The make command took a while to finish, but it did so without any errors.

Now it was time to begin flashing! To do so, all I had to do was run |make flash|.  This took a while, and once it was done it rebooted my phone. The moment of truth came and it didn’t boot at all, which scared me a bit 😦 I went back and reflashed my device by running the following:

adb reboot recovery

and then selecting the backup that I made earlier and reflashing my phone. Luckily the backup worked just fine and my phone was back in a working state 🙂

Tonight I plan on trying to figure out why my flash was failing and I will hopefully have a running boot to gecko phone by the end of the night 😀


Installing Boot to Gecko – Part 1

For the past month or so Stefan Reese and I have been talking about various things mobile, from what’s new and cool on Android to the various devices coming to market.  Stefan is basically my source for everything mobile and keeps a fresh stream of info coming my way.  When I heard about Boot to Gecko I obviously had to show it to Stefan, and he seemed pretty stoked about it.  Once Mozilla demoed B2G at Mobile World Congress 2012 and I saw what it was really about, I wanted it pretty bad. I’ve been playing with the idea of installing it on my device ( luckily I have a Samsung Galaxy S II, which seems to be what most of the demos are on ), and when Stefan mentioned he was interested in trying it out as well, I figured I may as well do it.

First things first: it seems to be recommended that you do the build on Linux ( the latest version of Ubuntu seems to be what most guides recommend ), so that’s what I did.  Since I was already running Windows on my desktop I used Ubuntu’s Windows installer, which lets you install Ubuntu right alongside an existing Windows install ( no reformatting/partitioning needed! ).  Once I installed Ubuntu I installed the usual things I would on a fresh install ( flash, vim, git, got all my music up and running, set up Thunderbird, etc ) and began following the guides that I found online.  One of the great things about the Mozilla community is how well they document things; I was able to follow the steps for the most part with no hiccups. The guides I followed seemed to do the trick, so I recommend them as well! Just an FYI, I accept no responsibility if you brick your phone by following my steps 😉

Setting up my environment so it was ready to install B2G was pretty simple.  It involved getting adb up and running ( which I had prior experience with from some Android development I’ve done ). You can do this by installing the Android SDK ( once installed you may need to run android-sdk-linux/tools/android and update the SDK ). Once that is done, you will need to add adb ( inside android-sdk-linux/platform-tools/ ) to your path ( done in your ~/.bashrc ).  You can test this easily by going to your terminal and typing adb; you should see adb print its usage/help text.

Now I was ready to back up my phone, which required installing heimdall, a cross-platform tool used to flash firmware onto Samsung Galaxy S devices. After installing heimdall you are going to need to make sure it has access to your phone ( which is done via udev ). I did this by editing /etc/udev/rules.d/android.rules ( if it doesn’t exist, create it ) and adding the following line:

 SUBSYSTEM=="usb", ATTRS{idVendor}=="04e8", MODE="0666"
After doing this I made sure the file was readable, and I was ready to move on to the next step: installing a whole slew of build dependencies.
To install the build dependencies I essentially copied some commands that were listed on the wiki, and it seemed to work out just fine for me.  The commands were as follows:
sudo apt-get build-dep firefox
sudo apt-get install git mercurial libasound2-dev libcurl4-openssl-dev libnotify-dev libxt-dev libiw-dev mesa-common-dev autoconf2.13
sudo apt-get install ia32-libs gcc-multilib g++-multilib bison flex gperf lib32z-dev lib32ncurses5-dev lib32ncursesw5-dev libidl-dev lib32gomp1 autoconf2.13 ccache libx11-dev lib32readline-gplv2-dev
sudo apt-get install default-jdk
Now it’s time to back up our phone and make sure, if shit hits the fan, that we have something to fall back on. Make sure your phone is connected to your computer via USB. To do this we are going to use heimdall ( which we installed earlier ).  First I cd’d into my tmp directory, then ran the following:
tar xvf recovery-clockwork-*tar
Next I ran:
adb reboot download
This puts the device into download mode.  You should know your phone is in download mode if your screen is showing a green Android character and telling you not to disconnect your device because it’s in download mode haha.  Now we are going to flash the phone by running the following:
heimdall flash --kernel zImage
And then boot our phone back into recovery mode:
adb reboot recovery
Follow the on-screen directions to create a backup and so on ( use your volume buttons and home button to navigate ). Once you’ve done this, back up your phone’s filesystem onto your computer by running the following:
mkdir sgs2-android
cd sgs2-android
adb pull /system system
adb pull /vendor vendor
Alright, finally we can clone our git repo! I forked mine from ( which from what I can tell is the main one ) and then cloned it into a local repository called B2G.  Once you’ve done this, run the following commands ( NOT as root; the guide I used strongly advised against it ):
cd B2G ( or wherever you have it as )
make sync ( from what I can tell this updates a ton of submodules in the project; depending on how fast your connection is, it may take a while, as it did for me )
export ANDROIDFS_DIR=sgs2-android ( this should point to the directory you pulled your phone’s filesystem into earlier )
make config-galaxy-s2
Once I ran make config-galaxy-s2 I got some build errors mentioning that it couldn’t find a directory.  That was the extent of my night yesterday, and I am going to have to do some more googling or speak to someone in #b2g.  So far this is the only real problem I have encountered, which is pretty good I guess. Since B2G is still so young it’s expected that I would run into a few issues; who knows, maybe I’ll even get to file a bug or two. Expect another post tonight, hopefully with a running version of B2G!

OSD700 0.8

This release I finally got time to sit down and focus on the first bug I was assigned this semester, 686370 ( Implement video playback statistics proposed in WHATWG ).  The playback statistics were already started earlier, and a few portions have already been implemented ( mozParsedFrames, mozDecodedFrames, mozPresentedFrames, mozPaintedFrames, and mozPaintDelay ). My job for this ticket is to finish off the remaining statistics that need to be implemented ( scroll down to the bottom ). Mathew Schranz also expressed interest in helping with this, so I’m sure you’ll catch him blogging about his experiences as well.

To get things started I was fortunate enough to have some class time allocated to go over how we would begin working on this bug.  Dave went over what would need to be done in order to get this implemented and how we would go from the decoder level all the way up to the DOM.  We began by taking a look at the already implemented portions of the spec and traced the route they took from an idl file all the way down into the decoder. This was invaluable to me, as I got input from the class on what they would do, thoughts from others who have worked with idl files before, and an understanding of the scope of what I would be working on.  After we traced a path through the code it was time to begin coding.

I went home that night and basically didn’t know which statistics I wanted to do first, so I just chose the one at the top of the list, which was bytesReceived. bytesReceived is described as “The raw bytes received from the network for decode. Together with the downloadTime, this can be used to calculate the effective bandwidth. Includes container data.”, which basically means we need a way to measure all of the data we are downloading for a given media element. I took a similar approach to what was done before in the video idl file and applied it to the media element idl file.  I began forging my own path down to the decoder and began seeing some results. Once I got into the decoder I found a portion of code that looked like what I needed to monitor, and got access to the data I needed.  I battled with making this correct for the next few nights, and it actually turned out that I wasn’t even in the right code; I’m confident now that the data I need is hidden within the media cache somewhere. I took the weekend off to work on my car and think about it, and it dawned on me that what I had actually found and created was bytesDecoded.  I renamed things accordingly, tested it again, and it seemed like exactly what I needed.  1 down, 5 to go :).

Next I decided to work on dropped frames, as I had scrambled my brain a bit trying to comprehend some of the media cache code. droppedFrames is defined as “The number of frames of video that have been dropped due to performance reasons. This does not include (for example) frames dropped due to seeking.”. This means we need a way to measure any frames that are not displayed, which will give developers a way to measure the speed of the decoding pipeline ( at least a portion of it; more accurate results would be achieved if used in conjunction with other media statistics ). Since this particular statistic references frames, I knew it would relate to video only, and we could ignore the audio portion of the media element.  This enabled me to do some cargo culting and copy what I could from the earlier implementation of the video statistics.  Once I reached the decoder I did some searching, read through the code a bit, and found a spot in nsBuiltinStateMachine.cpp where I added a small piece of code to keep track of all the frames that were skipped and report them to a setter I created in nsMediaDecoder for mozDroppedFrames. The setter simply increments a value stored in nsMediaDecoder, and I also have a getter that can reach in and get the value of droppedFrames.  It looked like a pretty solid first attempt, so I rolled with it, and all seemed to be working well.  The problem I am having is figuring out whether I did it right, because I seem to be getting the same results for both high-def and low-def videos ( 1080 vs 480 ).  That doesn’t seem like it should be the case, so I should probably look into it further.

The next steps for the statistics are to get the remaining few implemented and get this up for review as soon as possible. I’m sure that I’ve done a few things the wrong way, or it isn’t optimized enough, so I’m looking forward to getting some feedback.  Until then, expect a few more blog posts about this and expect these stats to be landing in one of the upcoming Firefox Nightlies :).

1982 El Camino restoration – Days 3 & 4

Since my last post ( where we decided to paint portions of the car the wrong color ) we decided it would be best to just bite the bullet and paint the rest of the car the same color as our screw up.  The new orange isn’t that bad; it’s pretty bright, but that’s no big deal.  So day 3 began with us sanding off the old clear coat and paint and getting the car ready for primer again.  This meant re-taping the whole car ( all of the chrome, windows, bumper, etc ) and sanding down the old paint a bit so the primer would stick.  This required a lot more work than it sounds like, as the taping alone took many hours.  Sanding off the old paint, even though there wasn’t much of it, was a huge pain in the ass.

The old paint was difficult because we layered on so much clear coat, in hopes that if the paint did turn out a bit shitty we could sand any imperfections out.  This turned around to bite us in the ass, as sanding clear coat off sucks the big one.  But we bit the bullet and spent almost an entire day painting and taping; it was hard work, but the car should be better because of it.  That ended our weekend a week ago, and we didn’t end up doing any work during the week as we all have fulltime jobs and are pretty busy.

Yesterday we were able to reconvene to work on the rest of the sanding that needed to be done and prep the car for primer.  We went over the body once more with sandpaper and finished the hood that we never got to the previous weekend. The hood in particular was a bitch to sand down, as we had dumped the remaining clear coat on there. Once all of the sanding and taping was done we blew off all of the dust the car was covered in and wiped the car down with a small amount of paint thinner to get any of the remaining paint to come off.  Now we were ready for some primer!

The primer we used this time around was pretty old and stank like hell.  It ended up going on pretty thick, so it looks like we’re going to have a lot of sanding to do again today to get the car ready for paint.  The primer began clogging the gun a bit, so the paint went on a bit cottage-cheesy ( some chunks came out and made small bumps in the paintjob ). Luckily this was pretty contained and only ended up on the hood, so we should be able to concentrate our sanding efforts there and it should turn out pretty well ( crossing my fingers on that one ).

In addition to the sanding and taping, I also got a bit of a surprise when I got to the shop yesterday.  My brother found me a set of headers for my car and had just finished installing them as I rolled up.  I was stoked to hear how they sounded, so he fired her up.  In the video you will see my brother have to open the door and pop his head out midway through because it was so loud and the exhaust fumes were coming through the floor ( there are massive holes in my floor at the moment ahaha ). It sounds amazing, and I can’t wait to put an exhaust on it ( I decided on purple hornies ). I also took some pictures of the fucked up floor and the underside of the car so you can see what we are doing after the body work, which is replacing the floor.

Today we’re painting the rest of the car, so hopefully it turns out! More pictures and another post to come later.

Implementing video playback statistics in Firefox – part II

Over the past few days I have begun working on implementing video playback statistics in Firefox.  I ended my last post with some build failures I couldn’t figure out at the time.  The first part of the statistics that I began implementing was mozBytesReceived ( scroll down ), which is a readonly property on a media element that tells the user how many bytes have been downloaded so far. I continued working on this for the last two nights and finally got it to build again.  The problem I’m now encountering is that I have not implemented the feature correctly, and I ran into a few cases where my initial idea was just wrong.

The build error I was getting last night was:

Undefined symbols for architecture x86_64

It also mentioned that a function was not properly defined, so I got Jon to take a look at it with me. I found the problem pretty quickly, and it turned out the compiler was telling the truth ( who would have thought ): I was indeed missing a function declaration.  In some of my trial and error coding I had tried to implement my bytesDownloaded() function in both nsBuiltinDecoder.cpp and nsMediaDecoder.cpp, which was obviously wrong. I did a bit more searching and came across a piece of code referring to media statistics, so I took a more in-depth look there.

I ended up finding a statistics struct that defined some properties similar to what I was working with ( mPlaybackRate, mDownloadRate, mTotalBytes, etc ), so I figured this would be a good place to begin working.  After looking through the nsBuiltinDecoder.cpp file I found a GetStatistics() function that defined values for the statistics struct.  From a high level the function does the following:

  1. Makes sure we are on the main thread using an assertion
  2. We create an instance of the Statistics struct
  3. We check to see if we have a resource yet; if we do, get the values for the various struct properties, otherwise provide some sane defaults

What I did next was define my own mozBytesReceived property in the struct and set it up so that if there was no resource we would return 0 ( as no bytes have been downloaded yet ), and if there was a resource I would call the GetBytes() function I made previously on mPlaybackStatistics. After doing this and building again ( in addition to removing the duplicated code I mentioned earlier ) I finally got it building! The next thing I did was test that my implementation was correct and that I could access the property on the video element.

The first thing I did was find myself a video page and create a little test function in the console. My test function looked like the following:

setInterval(function(){ console.log( document.getElementsByTagName( "video" )[0].mozBytesDownloaded ); }, 100);

It set an interval that fired a function logging the value of mozBytesDownloaded every 100ms. This seemed like a good idea to me, so I ran it, and it gave me lots of numbers!!!!

At this point I thought I had it figured out; numbers were flowing through the console, so I thought I was done! I should have known better: it’s never that easy, and nothing ever works on the first try. I let the video finish and all seemed well at first, until I decided to click back to the start of the video. To my surprise, the amount of bytes “downloaded” continued to rise. Wait, what? Why is it doing that? After some searching I found the culprit was the GetBytes function I wrote yesterday:

PRUint64 GetBytes() {
  return mAccumulatedBytes;
}
The problem was that mAccumulatedBytes wasn’t doing what I thought it was. It wasn’t returning the total amount of bytes that were downloaded, but rather the total amount of bytes that were decoded, which wasn’t exactly accurate.  Back to the drawing board I suppose.

At this point I am trying to see if there is a better place for me to hook into in the decoder somewhere.  I looked for a while last night but didn’t find anything that overly caught my eye.  I figure tonight I’ll head over to the #media channel in IRC and see if someone there can point me in the right direction.

Implementing video playback statistics in Firefox

Early this year I began my second course in Open Source development taught by David Humphrey.  I have been working on various Firefox bugs since the beginning of the semester and have landed 2 tickets so far, with a 3rd on the way.  The 3rd ticket that I am working on is almost done and just requires a few tests to be written then it should be good to go.  While I muster up some motivation to write the tests I figured I may as well begin working on the biggest ticket I took on this semester, which is implementing the rest of the video playback statistics in Firefox.  I was initially going to work on it alone, but Mathew Schranz expressed interest in working on it as well, so why not, the more the merrier!

We began by dividing up the remaining features that were still needed which included the following:

  • bytesReceived – the number of raw bytes received from the network for decode
  • downloadTime – the time since first HTTP request is sent until now or the download stops/finishes/terminates (whichever is earlier)
  • networkWaitTime – the total duration of time when a playback is stalled, waiting to receive more data from network
  • videoBytesDecoded – the number of bytes of video data that have been decoded
  • audioBytesDecoded – the number of bytes of audio data that have been decoded
  • droppedFrames – the number of frames of video that have been dropped due to performance reasons
  • playbackJitter – it is useful for applications to be able to obtain an overall metric for perceived playback quality and smoothness

I decided that I would take bytesReceived, downloadTime, networkWaitTime, and droppedFrames, while Mathew would take on videoBytesDecoded, audioBytesDecoded, and playbackJitter.  I decided to simply start at the top of the list and work my way down, so bytesReceived was first. Since Humph went over this in class last week I had a good idea of where to start, so I picked up where we left off. Using MXR I looked up where the other parts of the playback statistics had been implemented and worked out from there.  My guess was that I wouldn’t need to do too much heavy lifting, as much of the work was probably already done for me ( I mean, they must keep track of what they are downloading somehow ), so I began digging through various mediaElement, decoder, and mediaStream files. I eventually ended up in the pieces of code for each of the various video encoding formats and began sifting through what was there. Eventually I found a function that seemed like something I wanted:

void nsBuiltinDecoder::NotifyBytesConsumed(PRInt64 aBytes)

After looking at the function a bit closer I noticed that it called another function, AddBytes, on the mPlaybackStatistics object, which seemed like something related to what I was doing ( not to mention it was referencing playback statistics! ). Looking closer at AddBytes, I noticed it was adding the number of bytes passed into it to a local variable in the decoder called mAccumulatedBytes. At this point I was pretty confident that what we wanted for bytesDownloaded was simply all of the bytes we have downloaded so far, which I’m pretty confident is mAccumulatedBytes.  Now I was ready to start coding!

I started by defining the attribute, mozBytesDownloaded, inside of the idl file located at dom/interfaces/html/nsIDOMHTMLMediaElement.idl inside mozilla-central. I made it a readonly attribute that was an unsigned long ( as i’m pretty sure that there will be no negative values here, so why not ).  Next I needed to find a way to reach into the decoder and access the value of mAccumulatedBytes, so I took a look at the route that was currently being taken by similar functions to get to the decoder.  I decided to look at how Chris Pearce originally did this for the already existing portions of the video playback statistics ( mozParsedFrames, mozDecodedFrames, mozPresentedFrames, mozPaintedFrames, mozPaintDelay ). After doing another search on MXR I found that this was being done inside nsHTMLVideoElement.cpp .  Seeing as the bytesDownloaded attribute applied to both audio AND video, I couldn’t do this in here but told me it needed to be done in the nsHTMLMediaElement.cpp file instead.

Inside the nsHTMLMediaElement file I saw getters and setters similar to those in the videoElement file, such as GetMozSampleRate, GetMozFrameBufferLength, and many more. I figured a good home for my new function, GetMozBytesDownloaded, would be alongside these guys. I did a bit of cargo culting and stole what logic I could from GetMozFrameBufferLength, as it was also accessing the decoder, so it seemed ok to me.  What I needed to do from here was find my way into the decoder, and it looked like GetMozFrameBufferLength was doing so as well ( it was calling GetFrameBufferLength() ), so I decided to look at where I could write my own function to access the data I needed. That method was declared in nsMediaDecoder.h, so I figured it was a good place to put mine as well.  After poking around a bit and checking out what else was in there, I found the function that brought me down this path, NotifyBytesConsumed! I was feeling even more confident in my function placement and created my own, getBytesDownloaded. I did some more cargo culting here and stole what I could from NotifyBytesConsumed.  I figured I didn’t need the check NotifyBytesConsumed had for mIgnoreProgressData, as it looked like it pertained to whether the video had seeked or moved the play position; we only care about how many bytes were downloaded. Instead of calling AddBytes on mPlaybackStatistics I figured I would yet again write my own function, getBytes. Just like before, I looked up how AddBytes was implemented and followed suit where I could ( which was done in MediaResource ).  All my function did was ensure that the video had started ( if it hadn’t, return early ) and, if it had, return the number of bytes we have downloaded. Alright great, now let’s compile this and see what happens. Anddddd build failures 😦

I managed to fix a few failures but I am currently stuck on one that looks like the following:

Undefined symbols for architecture x86_64:
“nsMediaDecoder::bytesDownloaded()”, referenced from:
nsHTMLMediaElement::GetMozBytesDownloaded(unsigned int*)in nsHTMLMediaElement.o
ld: symbol(s) not found for architecture x86_64

It’s gotten pretty late now, so I am going to tackle this in the morning and hopefully I figure it out then.