
In my previous post I outlined the issues with using the GoogleTV for playback and I promised to outline my new client.

The Hardware
Since a list makes this easier, I’ll present the hardware that way:

Not mentioned above is the requirement of an HDMI receiver between the TV and NUC. The NUC can be configured to use analog audio output or to pass audio directly to the TV over HDMI, but a receiver provides the best audio experience.

When installing the WiFi+BT card, note that the antenna connectors are covered with protective pieces of plastic. Do not try to pull these off. Instead, remove the tape on the wires; the coverings will then slide easily down the wires, exposing the contacts. I also found it much easier to connect the antennas before installing the card.

Software
I should mention up front that I had issues booting some of the media from the USB ports on the back; it booted easily from the ports on the front. I also had considerable difficulty getting into the BIOS with my USB keyboard. On a cold boot, it would never pick up F2 being pressed, only after rebooting from an OS (which is a pain if you misconfigure the BIOS so it can no longer boot into the OS, as I did once). I found that placing a powered USB hub between the computer and the keyboard solved this issue.

The BIOS doesn’t need much in terms of settings, but I found that mine was several months out of date. I updated the BIOS to the latest version, then configured the minimum fan speed to 20%. At this setting, the fans will not spin up to audible levels most of the time. This does not affect the fan speed when the device determines it needs a higher speed, just the minimum level.

I started off with a version of OpenELEC (OE) that contained Plex. I liked the novelty of booting without any kind of SATA drive and keeping everything in RAM. I eventually decided that while OE has its uses, its limitations became problematic. In particular, the Bluetooth adapter would disappear and never come back without pulling the power from the device. I elected to go with a full Xubuntu install (after ordering the mSATA drive).

I followed an excellent guide for the installation procedure found in the Plex forums. I deviated in the IR installation though. I did not install lirc when I installed ir-keytable. This also means I did not need to do the Configure and Disable LIRC section. I did follow the Optional Permanent VSYNC section.

Configuring IR is slightly different than described in the guide because the remotes are different. Run sudo ir-keytable -t and start pressing buttons on your remote; you will see the scancodes as you do. Use those codes for the buttons you want in the Configure IR-Keytable section. The keyboard shortcuts page may be of use here.
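As a sketch of what that ends up looking like (the scancodes, protocol line, and file name below are placeholders; use whatever ir-keytable -t actually reports for your remote):

```shell
# Probe for scancodes: press each button on the remote and note the value printed
sudo ir-keytable -t

# Write a keymap file mapping your scancodes to key names
# (these scancodes are made up for illustration)
cat <<'EOF' | sudo tee /etc/rc_keymaps/my_remote
# table my_remote, type: RC6
0x800f0416 KEY_PLAY
0x800f0419 KEY_STOP
0x800f0422 KEY_OK
EOF

# Clear the existing table and load the new one
sudo ir-keytable -c -w /etc/rc_keymaps/my_remote
```

The -c/-w step is what the guide's Configure IR-Keytable section automates at boot; this just shows the pieces individually.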

I would highly recommend searching for Plex Home Theater in the menu in the upper left, right-clicking it, and adding it to the desktop. This makes launching it from a limited remote much easier.

Lastly, as mentioned in a subsequent post in the above thread, you need to disable xfsettingsd; otherwise, when you turn the TV off and back on, the display will never come back. This is simply:

sudo chmod -x /usr/bin/xfsettingsd
killall xfsettingsd

Gotchas
Aside from those above, there were a few gotchas I discovered.

  • If you use the WiFi heavily, the Bluetooth range will be dramatically reduced. This appears to be a hardware issue, since the same antennas are used for both. I tend to only see this when playing HD content. Using an IR remote reduces the need for Bluetooth.
  • You should configure Plex to be FullScreen in System -> Advanced if not already. This will enable some other settings, such as framerate switching.
  • If you enable framerate switching (which I would generally encourage) and you want to play something with HD audio, you may lose all audio, as I did. About 80% of the time, if I play something in 24p with TrueHD or DTS-HD (I pass these through), the framerate switch occurs and there is no audio. Furthermore, the audio never returns until I reboot or hibernate the device. I am working with one of the devs to track this one down; it seems to be a race condition between the NUC and my receiver. Setting PHT to play a trailer before the movie is a decent workaround.
  • VAAPI seems to have an issue with certain MPEG2 video. In particular, when I play an episode of The Simpsons I ripped from DVD, playback becomes blocky and full of green squares somewhere around 2-15 seconds in. A subsequent Intel driver update seemed to resolve this, but didn’t fix the blocky playback I saw in VC-1 content. Disabling VAAPI seems to be the best solution, as I have only one file that gives the CPU decoder any trouble.

Customizations
The last piece I would like to mention is the PlexAEON skin. I’ve grown to really like this skin, and it is pretty easy to install:

cd ~/.plexht/addons
git clone https://github.com/maverick214/skin.PlexAeonPHT.git

After that, restart Plex, then simply change the skin in the settings. I’ve found that on occasion a Movies or TV Shows section may not display anything after entering it. Every time I’ve seen this, hitting ESC causes it to display. Not sure what the deal is, but I consider it minor.

And that’s it. Hope someone out there finds this useful.

Since my last post on the topic, my client and server software have changed. In the interest of full disclosure, I should mention that I now work part time for Plex, though all of my decisions outlined in this post were made before that time.

Necessity for the change
I started to get frustrated with some of the limitations of the GoogleTV:

  • The platform seemed to have become stagnant (and the pending AndroidTV hadn’t been announced yet). It became clear that some of the limitations were never going to be resolved.
  • The device is supposed to passthrough DTS, but it will occasionally fail for a second during playback. It does this on both optical and HDMI. I tended to resolve this by transcoding the DTS to AC3 for the more problematic movies.
  • The device is supposed to play VC-1, but it would stutter during playback if the content was in an MKV file; it did not if the content was in an MPEG-TS. This problem does not exist during disc playback. I resorted to transcoding VC-1 content to AVC.
  • The device is supposed to passthrough HD audio such as DTS-HD and TrueHD. It does this during disc playback, but not from MKV files or MPEG-TS.

Clearly the best solution is a computer, since it has no problem with any of these.

Changing software
I started examining the possibility of porting my client over to a computer. I used this as an opportunity to learn some JavaFX as a possibility for the UI. I also ran across VLCKit, a nice little project for integrating playback into applications. Then I realized that someone had likely solved this problem before.

I ran across XBMC and Plex, and it became clear that my best option was one of the two. The primary difference between the two seemed to be that XBMC is designed primarily for local playback while Plex is designed primarily for client-server playback. In addition, Plex supported the GoogleTV and Roku I already owned as clients. Furthermore, if a media file is beyond the capabilities of the client, the Plex server will transcode it and stream the result to the client. So I elected to try out Plex.

When I made the change, my dad did as well. He noticed that some of his DVDs that went through HandBrake had severe artifacts during playback. I discovered it was an issue with the transcoder, which uses FFmpeg as its base. I submitted a patch to FFmpeg and asked Plex to merge the change. Afterwards they contacted me about working part time, to which I agreed.

After the data import, I was up and running pretty quickly. Since I have access to the source in most cases, I can fix any minor issues I run across. Overall, it is much more advanced than what I was doing before. I have since upgraded my sound system and TV.

All that remained was a better client, which I’ll describe next time.

Using Homebrew

I’ve been using the Mac long enough that I’ve gone through several of the package management systems for installing additional open source tools. I started off with Fink, which I really liked since it was based on dpkg. Then it became clear that the community had switched to MacPorts (formerly DarwinPorts). I was a bit disappointed with this because the package management wasn’t as good as it was with Fink, but it kept pace with newer OS releases better than Fink did. Now the community has shifted again, this time to Homebrew. Homebrew seems to have learned a lot of lessons from its predecessors. Most notably, many formulas install from prebuilt binaries, and it uses git for the formula list rather than rsync for a port list. Also, anyone wanting to contribute a formula forks the repo on GitHub, commits their formula, and opens a pull request. Given how much this simplifies things for developers, hopefully Homebrew will last longer than the others.
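That contribution flow looks roughly like the following sketch; the project name and URL are made up, and the exact in-repo path for formulas has shifted over Homebrew's history:

```shell
# Generate a formula skeleton from a source tarball (hypothetical project)
brew create https://example.org/foo-1.0.tar.gz

# The formula lands inside Homebrew's own git checkout; commit it on a branch
cd "$(brew --repository)"
git checkout -b foo
git add Library/Formula/foo.rb
git commit -m "foo 1.0 (new formula)"

# Then push the branch to your GitHub fork and open a pull request
```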

There is one thing that was severely annoying me and prompted me to write this post. I had installed the bash formula, which upgrades bash to 4.3, and also installed bash-completion. With the newer bash, I discovered that tab completion was not working correctly. For example, if I typed cd Libr<TAB>ap<TAB>, it would complete to Library/Application\ S but would be entirely incapable of completing anything after the space. Even if I finished the directory name manually, it would never complete anything beyond that point. This behavior seemed mostly limited to cd, but it was still annoying. Anyway, there is a solution:

brew tap homebrew/versions
brew uninstall bash-completion
brew install bash-completion2

Basically, bash-completion2 is for bash 4, whereas bash-completion is for bash 3. Be sure to follow the instructions printed at the end of the install; otherwise it won’t work at all.
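Those post-install instructions amount to sourcing the new completion script from your ~/.bash_profile. It should look something like this, though the exact path comes from the formula's caveats (this assumes a default /usr/local Homebrew prefix), so check brew info bash-completion2:

```shell
# Append to ~/.bash_profile: load bash-completion v2 if it is installed
cat >> ~/.bash_profile <<'EOF'
[ -r /usr/local/share/bash-completion/bash_completion ] && \
  . /usr/local/share/bash-completion/bash_completion
EOF
```

Open a new shell afterwards so the profile is re-read.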

P.S. I had previously run the following to gain case-insensitive tab completion in bash:

echo "set completion-ignore-case on" >> ~/.inputrc

So, for those developers living under a rock for the past two weeks, Apple introduced their new programming language, Swift. They stated that the language has been in development for four years, so it is safe to assume that its definition is fairly stable. Since I wrote several posts on what Objective-C could learn from Java, such as this most recent one, along with what it has learned, I should at least look at Swift. I have not actually programmed anything in Swift yet, but I have read through its documentation. If I got anything wrong in this post, call me on it.

First, the good changes.

Improvements

  • Objects not pointers: In dealing with Objects instead of pointers, programmers should be less likely to produce memory access errors. This is overall a much safer language construct.
  • Optional: Swift’s extensive use of optional types means that not only is delegate code simpler, it is also safer. Furthermore, this construct is extended to weak references, making them safer as well.
  • Protocol/Class Namespace: While I never complained about it, Obj-C’s protocols and classes occupied separate namespaces, which meant the syntax for addressing them was different. In Swift, the namespaces are the same, and a common syntax is used to reference a class or protocol. This does mean you can’t have a class and a protocol with the same name, but I feel that is a small price to pay for the simplicity of referring simply to a type rather than to a class or protocol.
  • Stronger Types: In spite of its automatic typing, the types in Swift are enforced more strongly than in Obj-C. This is deceptive because the use of var would seem to indicate a weakly typed language, when it is simply inferring the type from the usage. Of course, one can still be explicit about a variable’s type.
  • Let: When I read what let did, it simply struck me as it is behaving in the same manner as final in Java. It goes a bit further in that a dictionary or array that is declared in a let statement is also immutable. The same can be said for structs. This is a nice improvement.
  • Single Source File: I mentioned this one in what Obj-C can learn. I’m glad to see that Swift learned it.
  • Generics: I haven’t looked at the full extent of their power in Swift, but I love having generics in Java. At a first glance, Swift seems to be just as powerful.
  • Inner Classes: For those who’ve never used them, this is a powerful language feature. I use these all the time in Java.
  • Override: Those familiar with Java know the annotation @Override which indicates the intent to override a super-class’s method. If the super-class’s method is not present, this annotation turns into an error, but the annotation is not required. Swift goes a step further by requiring override to override a super-class’s method. This is a great improvement.
  • Closures: While blocks were technically a type of closure, Swift brings more power to them. Good addition.

Deficiencies

  • Private Methods/Variables: This one I do not understand. Obj-C had private methods and variables but Swift seems to have nothing of the sort. If it were hard to enforce at run-time, I could understand enforcing at compile-time for now, but why is it completely missing? When constructing a class, there seems to be no way to indicate which functions/variables other classes can touch, and which they cannot. The best means seems to be to use protocols instead of the concrete classes. For a library author, this is a complete nightmare. This is odd considering how Apple feels about calling private APIs in their libraries. I hope this is merely a temporary oversight and it is coming soon as this is a deal-breaker for many. There is hope that this is indeed the case.

Still MIA

  • Abstract Classes: Combined with abstract methods, these are still missing. See my previous post for more details. Maybe when it gains access controls we will get this, but I’m not holding my breath.
  • Namespaces: While it is possible to fake some namespacing in Swift, it is nowhere near what’s truly needed. Again, see my previous post for more details.
  • Exceptions: Apple seems to be extremely strongly against checked exceptions, and Swift has made this even worse. The language seems to be completely devoid of try-catch, as well as finally and throw. This is problematic since it is supposed to be used alongside Obj-C code, which can throw. So if Swift calls a method that throws, it is wholly incapable of catching the exception or even cleaning up in a finally block. Once again, the broken record says “See my previous post for more details.”

Conclusion
Swift is definitely a strong improvement over Obj-C. Unfortunately the lack of private eliminates it as a viable replacement in several situations. If Apple fixes this, the cases where one needs to use Obj-C are nearly eliminated. Perhaps we will get namespaces some day, but I would not expect abstract classes nor checked exceptions. Overall, good improvement Apple. Keep it up.

Disabling Nvidia

I have a MacBook Pro made in 2010, one of the models that received faulty Nvidia chips. After this was discovered, Apple decided to extend the warranty on the chips to three years. Instead of proactively replacing the faulty chips, they required that a machine exhibit the problem before they would consider replacement.

So, like clockwork, my computer’s Nvidia chip failed after the three years, resulting in kernel panics in the GPU driver about once a week. Searching for this yields numerous similar reports, all stemming from the graphics card asserting its manufacturing flaw. And since my computer is now more than three years old, Apple will not fix it without payment of several hundred dollars.

So, do I have to contend with a machine that kernel panics every week or so? Certainly not. Even Windows wouldn’t blue-screen that often a decade ago, and it’s far better now than it was then. There’s another solution: download and run gfxCardStatus (http://gfx.io/) and switch it to the integrated graphics card only. This has to be redone on every login, but that’s a small price to pay.

I’ve been running this machine like this for nearly a month now with no kernel panic yet. I did have to reboot because authd went crazy and stopped displaying all authorization dialogs, but I doubt that’s due to the machine being locked to the Intel graphics card; it’s more likely a bug in Mavericks.

Looking to the future, whenever I get around to replacing this machine, it is extremely tempting to make sure I never buy one with an Nvidia chip again. Since Intel’s graphics have improved so much of late, that is now a viable option.

Anyone else out there with similar experiences?
