Date: Tuesday, 16 Sep 2014 18:49

tl;dr: “shipping is a feature”; getting the URL feature well-defined should not block HTML5 given the nature of the HTML5 reference to the URL spec.

This is a subject desperately in need of an elevator pitch.  From my perspective, here are the top three things that need to be understood:

1) From an HTML5 specification point of view, there is no technical difference between any recent snapshot of the WHATWG specification and anything that the WebApps Working group publishes in the upcoming weeks.

2) The URL spec (from either source, per above it doesn’t matter) is as backwards compatible with RFC 3986 + RFC 3987 as HTML5 is with HTML4, which is to say that it is not.  There are things specified by the prior versions of the specs that were never implemented, are broken, or don’t reflect current reality as implemented by contemporary web browsers.

3) Some (Roy Fielding in particular) would prefer a more layered approach where an error-correcting parsing specification is layered over a data format, much in the way that HTML5 is layered over DOM4.

Analysis of points 1, 2, 3 above.

1) What this means is that any choice between WHATWG and W3C specs is non-technical.  Furthermore, any choice to wait until either of those reaches an arbitrary maturity level is also non-technical.  It doesn’t make any sense to bring any of these discussions back to the HTML WG as these decisions will ultimately be made by W3C Management based on input from the AC.

2) In any case where the URL spec (either one, it matters not) differs from the relevant RFCs, from an HTML point of view the URL specification is the correct one.  This may mean that tools other than browsers may parse URIs differently than web browsers do (an illustration follows below).  While clearly unfortunate, this likely will take years, and possibly a decade or more, to resolve.

3) If somebody were willing to do the work that Roy proposes, it could be evaluated; but to date there are quite a few parties that have good ideas in this space but haven’t delivered on them.
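
To make point 2 concrete, here is one small illustration of the sort of divergence in question: browsers and WHATWG-style parsers normalize backslashes in http URLs to forward slashes, while an RFC 3986 based parser, such as the one in Ruby’s standard library, rejects them.

require 'uri'

# A browser address bar treats http:\\example.com\foo as http://example.com/foo;
# an RFC 3986 based parser typically rejects the backslashes outright.
URI.parse('http:\\\\example.com\\foo')   # typically raises URI::InvalidURIError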

Background data:

RFC 3986 provides for the ability to register new URI schemes; the WHATWG/W3C URL specification does not.  URIs that depend on schemes not defined by the URI specification would therefore not be compatible.  Anne has indicated a willingness to incorporate specifications that others may develop for additional schemes; however, he has also indicated that his personal interest lies in documenting what web browsers support.

Meanwhile, this is a concrete counterexample to the notion of the URL specification being a strict superset of RFC 3986 + RFC 3987.  Producers of URLs that want to be conservative in what they send (in the Postel sense) would be best served to restrict themselves to the as yet undefined intersection between these sets of specifications.

Recommendations:

While I am optimistic that at some point in the future the W3C will feel comfortable referencing stable and consensus-driven specifications produced by the WHATWG, it is likely that some changes will be required to one or both organizations for this to occur.  Meanwhile, I encourage the W3C to continue on the path of standardizing a snapshot version of the WHATWG URL specification, and HTML5 to reference the W3C version of the specification.

Furthermore, there has been talk of holding HTML5 until the W3C URL specification reaches Candidate Recommendation status.  I see no basis for this in the requirements for Normative References.  HTML5’s dependence on the URL specification is weak, an analysis of the open bugs has been made, and the determination is that those changes would not affect HTML5.  Furthermore, the value of a “CR” phase for a document which is meant to capture and catch up to implementations is questionable.  Finally, waiting any small number of months won’t address the gap between URLs as implemented by web browsers and URIs as specified and used by formats such as RDF.

Should a more suitable (example: architecturally layered) specification become available in the HTML 5.1 time-frame, the HTML WG should evaluate its suitability.

New Toy
Date: Monday, 16 Jun 2014 14:57

New laptop for work: MBP 15.4/2.6/16GB/1TB Flash.  First time I ever went the Apple route.  I did so as I figured with those specs, I could run multiple operating systems simultaneously.  So far, so good.  I’m using VirtualBox to do so.

Notes:

First, Mac OS X 10.9.  My biggest problem with previous versions of this operating system is that they always appeared to me to be fairly hostile to installing open source scripting languages and tools.  For example, each time I updated my Rails book, I would update the instructions on how to install the necessary software.  This now appears to be a thing of the past.  In fact, the only problem I’ve encountered so far is with mod_suexec.  That problem looks easy to address, and if it isn’t addressed by the team managing the brew recipe, I’ll simply compile the suexec bin myself.

Overall, much improved.  This is also my first experience with Apple’s trackpad; and I must say I’m a fan.

Next up, Ubuntu 14.04.  Installation was straightforward.  One only needs to be mindful to install dkms.  Enabling 3D acceleration is also worthwhile, but doesn’t quite get you to native graphics speeds on lesser hardware.  The end result is fully functional, though it is worthwhile to do most web browsing on the host operating system.

Then Windows 8.1.  This was by far the easiest as Microsoft provides time bombed VMs which you can easily import and use for up to 90 days.  When the 90 days are up, you can import again and start over.  I’ve now done this with both Ubuntu and Mac hosts.

Finally, Red Hat Enterprise Linux 6.5.  There were a few more steps to get this running, and even after doing so the result wasn’t fully functional in that it would not use the full display even after installing guest additions.  The solution ended up being to delete (or simply move elsewhere) the following files in the /etc/X11 directory: xorg.conf xorg.conf.d xorg.conf-vm.  I use this VM to access the IBM VPN and to run Lotus Notes.
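
For next time, here is a minimal sketch of scripting that cleanup from inside the guest (stock file locations assumed; the files are moved aside rather than deleted so they can be restored):

require 'fileutils'

# Minimal sketch, assuming the stock /etc/X11 layout; run as root inside the guest.
Dir.chdir('/etc/X11') do
  %w(xorg.conf xorg.conf.d xorg.conf-vm).each do |entry|
    FileUtils.mv(entry, "#{entry}.bak") if File.exist?(entry)
  end
end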

Date: Saturday, 07 Jun 2014 01:58

Today, I got a pull request from Ryan Grove to make nokogumbo work on Ruby 2.1 and add Travis support.  Very cool.  I was surprised how easy it was to set up.

A few hours later I got ruby2js to work on Ruby 2.0 and 2.1 and added Travis support.  Wunderbar worked right out of the box.

Date: Monday, 12 May 2014 23:12

Joe Gregorio: But something else has happened over the past ten years; browsers got better. Their support for standards improved, and now there are evergreen browsers: automatically updating browsers, each version more capable and standards compliant than the last. With newer standards like HTML Imports, Object.observe, Promises, and HTML Templates I think it’s time to rethink the model of JS frameworks. There’s no need to invent yet another way to do something, just use HTML+CSS+JS.

I’m curious as to where Joe believes that these features came from.  For example, promises were first proposed in the 1970s, made their way into a number of frameworks, were extracted into a common implementation and then standardized.

The true story is that Joe’s “gradient” picture is incomplete:

There’s actually a gradient of code that starts with a simple snippet of code, such as a Gist, and that moves to larger and larger collections of code, moving up to libraries, and finally frameworks:

  gist -> library -> framework

A more complete picture:

  gist -> library -> framework -> standard

And even that isn’t complete.  Standards are backported using polyfills, and frameworks are updated to use feature detection to make use of standard implementations as they become available.

I’ll also mention a few libraries/frameworks I’m fond of, and how they fit:

  • Underscore.js.  This library implements a number of methods that really should be a part of the language.  And in a few cases, are (scan that page for the word native).  I’ve been a member of ECMA TC39 off and on for a decade and a half, and based on what I have seen, JavaScript will catch up with Underscore in 30 to 50 years.
  • jQuery.  Their official slogan is “write less, do more”.  While that’s true, “make DOM suck less” is equally true.  Like with Underscore.js, it predated features like querySelector.  One last comment before I move on: “abstract away the platform” is not true for jQuery (nor for any of the libraries/frameworks I’m mentioning here).  The key abstraction jQuery provides is a collection of DOM nodes.  You can determine the number of elements in the collection by using the length property.  You can access individual DOM nodes using indexes: [0], [1], [42], etc.
  • Bootstrap.  While this project contains JavaScript, its true focus is on providing higher-level CSS constructs than the browsers currently provide.  Things like modal dialogs, dropdown menus, tabs, etc.  It is worth noting that they do this with “just” HTML+CSS+JS.  Sure, you can reinvent these concepts for yourself, but why?
  • Angular.js.  Joe mentioned that he hasn’t needed data binding yet.  I’ve written a fair number of small web applications.  Some have grown to become big and unwieldy.  I’ve taken a few of these and started to separate out the client side model, view, and controller, and in the process found data binding to be quite handy.  Now I can write larger web applications, and go back and add features months later without being afraid that I am going to break anything.

In each of these cases, I’m confident that the best ideas of these libraries and frameworks will make their way into the web platform.  Meanwhile:

The future is already here — it’s just not very evenly distributed.

Date: Monday, 07 Apr 2014 17:14

Slides for my ApacheCon talk.  Right/left goes to the next/previous section, up/down for navigating within a section.

The demo is unfortunately only available to ASF committers (for privacy reasons, as it exposes email addresses).

Date: Friday, 14 Mar 2014 12:49

Tim Bray: If hating this is wrong, I don’t want to be right.

Perhaps you would like this better?  :-)

module Angular::X

  controller :LoginController do
    @credentials = {username: '', password: ''}

    def login(credentials)
      AuthService.login(credentials).then {
        broadcast! AUTH_EVENTS.loginSuccess
      }.catch {
        broadcast! AUTH_EVENTS.loginFailure
      }
    end
  end

end

Try it here.

Things to note:

  • All dependency injection is taken care of for you
  • $scope and $rootScope are inferred based on context
  • Input uses Ruby syntax vs JS Syntax
  • Generated code is clean, indented, and idiomatic

More examples, from deployed code: roster, agenda.

This will be covered by my ApacheCon talk.

Date: Thursday, 13 Mar 2014 23:48

Backdrop:

  • Google Fiber announces it is considering new cities, including Raleigh.
  • RST announces gigabit service for Raleigh, starting as early as May.
  • My current service is “Standard Cable” (70+ channels, no premium ones) and “Standard Internet” (nominally 15 Mbps down, 1 Mbps up).  At the end of the month, I will have had basic cable with Time Warner at the same location for 22 contiguous years, and standard Internet for more than half of that.

With that context, today I got notification in the mail that my rates are set to go up by 60% as my “Promotional” rates (Seriously?  A twenty-two-year-long promotion?) will be expiring.  After spoofing my User Agent (the chat function doesn’t recognize my browser/operating system combination), I verified with “Veronica” that this is indeed the plan.  I was then provided a transcript and directed to an online survey, which promptly logged me off once I had completed it, without submitting my feedback.

I plan to follow up with @TWC_Help.

Date: Sunday, 26 Jan 2014 17:55

I got a suggestion to look into React.js, a JavaScript library which is focused on the problem space that Angular.js’s directives address.

One of the ways React.js facilitates the creation of web components is via JSX which mixes “XML” with JavaScript.  The “XML” is “desugared” into React.DOM calls.

Based on this idea, I created a Wunderbar jquery filter to “desugar” Wunderbar calls into jQuery calls.  The tests show some of the conversions.  I also updated my Bootstrap modal dialog directive to make use of this: before => after.

Date: Thursday, 23 Jan 2014 14:52

When compared to Ruby, JavaScript doesn’t have as much functional support built in.  Underscore.js fills that gap for many.  Underscore.js, in turn, was inspired by Ruby’s Enumerable module.  An underscore filter (tests) completes the mapping.

In many cases, the resulting JavaScript is formed by applying a number of filter rules.  For example, starting with:

a.flatten!()

This is first expanded to use Array.prototype.splice to ensure the update happens in place:

a.splice(0, a.length, *a.flatten())

Next, the method call is rewritten to use the underscore flatten function:

a.splice(0, a.length, *_.flatten(a))

Finally, as JavaScript doesn’t have a splat operator, the call is rewritten using Function.prototype.apply:

a.splice.apply(a, [0, a.length].concat(_.flatten(a)))

Tim Bray will be pleased to hear that Ruby2js currently maps a.sort() to

_.sortBy(a, _.identity)
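
These conversions can be exercised directly; a rough sketch follows (the exact invocation may differ between Ruby2JS versions):

require 'ruby2js'
require 'ruby2js/filter/underscore'

# Rough sketch; the exact invocation may differ between Ruby2JS versions.
puts Ruby2JS.convert('a.sort()', filters: [Ruby2JS::Filter::Underscore])
# expected, per the mapping above: _.sortBy(a, _.identity)
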
Date: Saturday, 18 Jan 2014 15:58

Ruby2JS now maps Ruby attributes to JavaScript properties:

Input Ruby:

class Person
  attr_accessor :first_name, :last_name

  def initialize(first_name, last_name)
    @first_name = first_name
    @last_name = last_name
  end

  def full_name
    "#{@first_name} #{@last_name}"
  end
end

Output JavaScript:

function Person(first_name, last_name) {
  this._first_name = first_name;
  this._last_name = last_name
};

Person.prototype = {
  get first_name() {
    return this._first_name
  },

  set first_name(first_name) {
    this._first_name = first_name
  },

  get last_name() {
    return this._last_name
  },

  set last_name(last_name) {
    this._last_name = last_name
  },

  get full_name() {
    return this._first_name + " " + this._last_name
  }
}
Date: Monday, 13 Jan 2014 17:46

Based on a suggestion by Tim Bray, I converted my board agenda Angular.js application to use html5 mode.  The process was straightforward:

1) add the following to your application configuration:

$locationProvider.html5Mode(true).hashPrefix('!')

2) Add a <base> element to my generated HTML, indicating which part of my path was “owned” by the server (a sketch follows below).

3) Convert my relative links.  Based on how my application was structured:

  • #/comments became comments
  • #/ became ./
  • js/app.js became ../js/app.js

I’ve not yet tested it with Internet Explorer <= 9, but the Angular.js docs indicate that it should work there too.
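
For step 2, here is a minimal Wunderbar-style sketch of emitting that element; the href shown is an assumption, not the application’s actual path:

require 'wunderbar'

_html do
  _head do
    # hypothetical prefix; use whatever portion of the path the server "owns"
    _base href: '/board/agenda/'
  end
end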

Date: Friday, 10 Jan 2014 07:45

Tim Bray: We’re at an inflection point in the practice of constructing software. Our tools are good, our server developers are happy, but when it comes to building client-side software, we really don’t know where we’re going or how to get there.

While I agree with much of this post, I really don’t think the conclusion is as bad as Tim portrays things. I agree that there are good server side frameworks, and doing things like MVC is the way to go.

I just happen to believe that this is true on the client too – including MVC. Not perfect, perhaps, but more than workable. And full disclosure, I’m firmly on the HTML5-rocks side of the fence.

For starters, while JavaScript is a perfectly satisfactory language for many, it does seem to have accumulated some weird quirks. None of that, however, makes JavaScript any less desirable as a compilation target.

While I agree that jQuery reduces the pain of accessing the DOM, there is a future in sight where user authored jQuery will largely be a thing of the past. While I wouldn’t have believed it, I’ve seen it for myself, and I’ll describe it more below.

But first, let’s talk about MVC.

Client Side MVC

On the server, the model is data. On the client, the model can be data too – either from Web Storage or from the server. But it can also be accelerometers, cameras, and contacts. If you need some of these, take a peek at Apache Cordova.

For views, there are HTML fragments, mustache and mustache-inspired syntaxes for templating, and yes, CSS. While Sass and its ilk reduce the pain here, a good alternative is to pick up a library like Bootstrap. You might need to tweak it a little, but there is a lot there and the people who wrote it probably thought about a lot of problems that you may not have thought deeply about. For starters, their markup is responsive, which means that it automatically adjusts based on device characteristics.

Finally, we come to the controller. And by association, routing. There are good frameworks for this, too, on the client. Angular.js and Ember are two exemplars.

For the remainder of this, I’m going to focus on Angular.js. As near as I can tell, a very similar story could be told about Ember. I just happen to be writing an application using Angular.js, and can point to it.

Let me start by describing the application. I’m writing it partly to scratch an itch, and partly to learn a new framework. Before I started, I had never used Angular.js before. I will say that the learning curve for Angular.js (at least for me) is an “S” curve… easy to get started, then it gets harder to advance, then it gets easier again. I seem to have made it past the curve. Personally, I would put a lot of the blame on the documentation which, to my tastes, is a bit too academic; terms like transclude abound. `nuff said.

At the ASF we have a board with 9 Directors, and a couple of hundred officers, give or take. Every month, an agenda is created that 9 Directors review and 50 plus officers contribute to. The artifact is a single file, stored in subversion. Essentially, a superset of the files you can find posted here. The relevant additions include comments and pre-approvals.

As you might imagine, with this many people updating a single file, many right up against the deadline at the start of the meeting, we have conflicts. To cope, I wrote a tool that allowed me to review individual reports one at a time, collect up comments and pre-approvals, and apply them automatically all at once. In the “production” version of this application, the logic primarily resides on the server, and moving to the next report requires a round trip.

I’m currently rewriting it to move nearly all the logic to the client. You can see the work in progress here.

Guided walkthrough of the code

The application starts by downloading the agenda serialized in a pre-processed JSON format, as well as all of the “pending” operations, also serialized in JSON. Both persist on the server as flat files. Once downloaded, traversing to the next report is as easy and as seamless as using most of the HTML5 slide scripts (my current favorite of which is reveal.js, but I digress).

As on the server, processing a page transition involves a router, and you can see my routing as a case statement at the top of app.js. The syntax is Ruby, with a dash of DSL, and sprinkling of JavaScript semantics, but I’ll come back to that.

I also currently include all of the controller logic in the same file. If you disapprove, blame me, not Angular.js, as you can break this out however you like. As it stands, I have a few large controllers, and several small controllers. As to the former, I intend to go back and refactor soonish.

This brings me to my primary beef with the Angular.js documentation. It would have been helpful to be made aware sooner that while a typical server application would have a single controller serve a dozen or even dozens of pages, a typical Angular.js page will have several, or perhaps even dozens of, controllers active at any one time. But now you know this too, so you have a head start over where I was but a few short weeks ago.

Back to the application.

A controller in Angular.js is associated with a DOM node, and therefore all of its children. A controller associated with one of those child nodes inherits instance variables and methods from all controllers associated higher in the tree. A common pattern, therefore, would be to have a controller that controls only one button. Such a controller would have access to everything in the enclosing form.

Let’s walk through an example. One of the pages in my application shows comments. On that page is a button that toggles whether or not to show comments that you have seen before. First, here’s the HTML markup for the button itself:

<button 
  class="btn btn-primary"
  ng-show="seen_comments" 
  ng-controller="ToggleComments"
  ng-click="click()"
  >{{ label }} seen</button>

The class attribute references classes defined by Bootstrap.

The ng-show attribute indicates that this button is only to be shown if the value of seen_comments is true. That value is set by a controller.

As you might have figured out, the ng-controller attribute indicates which controller is to handle this DOM Node.

Similarly, ng-click indicates which method on that controller to invoke when the button is clicked.

Finally, the button contains a reference to a label variable that toggles between the values of hide and show. With that, let’s move on to the controller:

controller :ToggleComments do
  @label = 'show'

  def click
    broadcast! :toggleComments, (@label == 'show')
    @label = (@label == 'show' ? 'hide' : 'show')
  end
end

This code starts out by setting the instance variable @label to show.

A single instance method named click is defined which, when run, broadcasts to every active controller a message containing a true or false value, and then proceeds to toggle the value of the label.

The important thing to be aware of by this point is that even if there were no other code in place, what you have seen is enough to toggle the text on the button.

Before proceeding, here is the corresponding JavaScript, exactly as it is produced, for those who are interested in such things:

AsfBoardAgenda.controller("ToggleComments", function($scope, $rootScope) {
  $scope.label = "show";

  $scope.click = function() {
    $rootScope.$broadcast("toggleComments", $scope.label == "show");
    $scope.label = ($scope.label == "show" ? "hide" : "show")
  }
});

The entire source for the comments view is found in partials/comments._html. This contains not only the button you have seen already, but also another button. Above these is text that shows up if there are no comments. And above that, a loop that shows comments from selected agenda items. Let’s dive into that selection:

ng_if: 'item | show : {seen: pending.seen, toggle: toggle}'

This code takes the item and passes it through a filter called show, passing that filter a hash containing two values obtained from the relevant controller. Here’s the definition of the filter itself:

filter :show do |item, args|
  return false unless item.comments
  return true if args.toggle
  return args.seen[item.attach] != item.comments
end

Pretty straightforward stuff:

  • No comment? Return false (i.e., don’t show).
  • Toggle on? Return true (i.e., do show).
  • Otherwise return a value based on whether the comment is the same as the one previously seen.

Now, let’s look at how values defined in the controller are set:

on :toggleComments do |event, state|
  @toggle = state
end

show = filter(:show)
watch 'agenda.update + pending.update' do
  $rootScope.unseen_comments =
    @agenda.any? { |item| return show(item, seen: @pending.seen) }
  $rootScope.seen_comments = !Object.keys(@pending.seen).empty?
end

The on statement responds to the broadcast that was shown above, and sets an instance variable based on what was passed. The watch statement watches for changes in the value of an expression, and when it changes it will recompute unseen_comments and seen_comments using the exact same filter used in the view. If you look closely, this code makes use of Object.keys, which is part of the JavaScript object model, in the midst of a Ruby expression. There are tradeoffs involved here, but the key point is that seamless access to the full JavaScript programming model is available.

At this point, we’ve explored views and controllers (and routing and filters). Now let’s briefly touch on models.

A model in Angular.js is a simple class, and often a singleton in that all methods are class methods and all variables are class variables. The Agenda class, for example, defines a self.refresh method that does a $http.get, and calls Agenda.put with the result it receives. The point is that such classes can contain arbitrary logic.
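
As a rough sketch only (the URL and the put/get helpers are assumptions, not the actual class), such a model can look like this:

# Rough sketch; the URL and helper methods are assumptions, not the real Agenda class.
# $http is the Angular service; injection is handled by the mapping filters.
class Agenda
  @items = []

  def self.refresh
    $http.get('json/agenda').success do |result|
      Agenda.put result
    end
  end

  def self.put(list)
    @items = list
  end

  def self.get
    @items
  end
end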

Two more stops on this brief tour. First the entire controller for the other button on this page:

controller :MarkSeen do
  @disabled = false
  def click
    @disabled = true

    # gather up the comments
    seen = {}
    Agenda.get().forEach do |item|
      seen[item.attach] = item.comments if item.comments
    end

    data = { seen: seen, agenda: Data.get('agenda') }

    $http.post('json/markseen', data).success { |response|
      Pending.put response
    }.error { |data|
      $log.error data.exception + "\n" + data.backtrace.join("\n")
      alert data.exception 
    }.finally {
      @disabled = false
    }
  end
end

This code initially sets the button to not disabled (inexplicably, the Angular.js core team refuses to define an ng_enabled attribute).

When the button is clicked, the button itself is first disabled. The seen comments are then gathered up from the client model. The name of the agenda file is added, and the result is serialized and sent to the server via a $http.post.

If a successful response is received, the pending values are updated with the values from the server. If an error is received, it is logged and an alert is shown. Either way, the button is re-enabled.

Note the complete lack of direct reference to jQuery in any part of this scenario. Angular.js will work well with jQuery if present, so that’s not an issue, but the point is that the framework will take care of the DOM manipulation so that you don’t have to.

The final stop on this brief, but whirlwind, tour is the server side of this operation, namely markseen._json:

pending = Pending.get

pending['agenda'] = @agenda
pending['seen'] = @seen

Pending.put(pending)

_! pending

This logic fetches the Pending model, updates two entries in the hash based on data sent by the client, puts the model back, and then returns the updated model (which generally contains other values which weren’t updated by this specific operation) back to the client. For completeness, pending.rb contains the logic for serializing and deserializing the server model (in this case, using YAML).
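
For those curious what that persistence can look like, here is a rough sketch (the file name and structure are assumptions, not the actual pending.rb):

require 'yaml'

# Rough sketch; the file name and structure are assumptions, not the actual pending.rb.
class Pending
  FILE = 'pending.yml'

  def self.get
    File.exist?(FILE) ? YAML.load_file(FILE) : {}
  end

  def self.put(hash)
    File.write(FILE, YAML.dump(hash))
  end
end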

Recap

We have a model, view, and controller on the client, seamlessly interacting with the model, view, and controller on the server.  Everything (except for a small stylesheet) is defined using Ruby syntax, and is converted to HTML or JavaScript, or directly executed, as appropriate.  While I chose Ruby, other choices could obviously be made.  The Angular.js framework can also be used directly (and browsing the generated JavaScript would help show you how to do this), at a cost of some additional learning curve (things like dependency injection, which are taken care of by my mapping filters for angular.js).

The point here being that there are good frameworks out there that do client side MVC. These frameworks (quoting directly from Tim’s original post):

embody[…] a lot of history and hard-won lessons. Crucially, for most of the things you’d want to put in a UI, there’s usually a single canonical solid well-debugged way to do it, which is the top result for the appropriate question on both Google and StackOverflow.

Finally, if you are interested in the Ruby code, you are encouraged to look into wunderbar, its associated tutorial, and ruby2js.

Date: Saturday, 07 Dec 2013 02:10

I’ve begun work on a Wunderbar tutorial.

Feedback welcome.

Date: Sunday, 01 Dec 2013 15:26
It does indeed turn out that language macros can reduce Angular.js boilerplate configuration to a minimum.  In the process I’ve spun off ruby2js as a standalone supporting library.
Date: Monday, 11 Nov 2013 20:16

I’m looking into what it would take to make it easier to produce Angular.JS client applications using a server coded in Ruby.  The approach I’m taking is to convert idiomatic Ruby into idiomatic AngularJS JavaScript.

Demo.  Corresponds roughly to tutorial step 4.  Example output.  Specs.

Mavericks
Date: Monday, 04 Nov 2013 16:33

Did a clean install of Mavericks on my test mac-mini.  Things to be aware of for next time:

# install the command line developer tools
xcode-select --install

# some build scripts expect an OS X 10.9 toolchain at this path; point it at the default toolchain
sudo ln -s /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/ /Applications/Xcode.app/Contents/Developer/Toolchains/OSX10.9.xctoolchain

# make the MySQL client libraries visible to software that looks in /usr/local/lib
sudo mkdir -p /usr/local/lib; sudo ln -s /usr/local/mysql/lib/libmysql* /usr/local/lib
Date: Tuesday, 22 Oct 2013 13:20
Mathias Bynens: Whenever you’re working on a piece of JavaScript code that deals with strings or regular expressions in some way, just add a unit test that contains a pile of poo (💩) in a string, and see if anything breaks. It’s a quick, fun, and easy way to see if your code supports astral symbols. Once you’ve found a Unicode-related bug in your code, all you need to do is apply the techniques discussed in this post to fix it.
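
The same habit transposes readily to Ruby; here is a quick minitest sketch of the idea (the truncate helper is invented purely for illustration, and the original advice is of course aimed at JavaScript, where astral symbols are much easier to break):

require 'minitest/autorun'

# The truncate helper is invented for illustration; the point is simply to keep
# an astral symbol in the fixture and see if anything breaks.
class AstralSymbolTest < Minitest::Test
  def truncate(string, length)
    string[0, length]
  end

  def test_truncation_preserves_astral_symbols
    assert_equal 'I 💩', truncate('I 💩 Unicode', 3)
  end
end
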
Date: Friday, 04 Oct 2013 12:20
Leonard Richardson: Hey, folks, I got some pretty exciting news. Now that RESTful Web APIs has come out, there’s really no reason to buy 2007’s RESTful Web Services. So Sam Ruby and I and O’Reilly have gotten together and started giving the old book away. You can get a PDF from the RESTful Web APIs website or from my now-ancient RESTful Web Services site. The license is BY-NC-ND.
Date: Sunday, 22 Sep 2013 14:02

Opal is a Ruby to JavaScript compiler.  The team working on it is not only bringing access to JavaScript libraries (like jQuery) but also replicating Ruby library interfaces.  Walking through a simple example...

This script defines two types of responses, HTML and JSON.

The HTML response defines a simple table and a script.  The script is defined in Ruby, but is converted to JavaScript before being sent to the browser.  The script itself issues three HTTP requests and updates individual cells in the table when it gets responses.

Those requests produce JSON replies, depending on the individual field requested.  Both the client and server scripts in this example involve DOM traversal.  One uses jQuery-style methods (find); the other uses Nokogiri (at and search).
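
To make the server-side style concrete, a small example of the at and search calls in question (the markup is invented, not taken from the script):

require 'nokogiri'

# Invented markup, purely to show the at/search style mentioned above.
doc = Nokogiri::HTML('<table><tr><td class="uptime">3 days</td></tr></table>')
doc.at('td.uptime').text    # => "3 days"
doc.search('td').length     # => 1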

An alternative to opal-jquery is opal-browser.  The latter provides a more Markaby/Nokogiri style interface to the DOM.

To run:

gem install wunderbar nokogumbo opal opal-jquery sourcify
ruby watch.rb --port=3030

If your web server is set up to handle CGI, you can drop this script directly into your document directory and run it.  If you do so, the requests will all be handled in parallel.

Date: Friday, 06 Sep 2013 00:45

I finally debugged why my cable service was so poor.  Long story short, an inexplicable 7 dB drop in the incoming line, a bad arrangement of splitters, and another unexplained 7 dB drop someplace in the house.

Now for the long story:

My troubles started when Time Warner Cable required me to install mini cable boxes to see the full set of channels that I had purchased.  I went from a slightly grainy picture to a clear picture on some channels and intermittent digital encoding artifacts on others.  In most cases, slightly grainy was a marked improvement over digital encoding artifacts.  In fact, for some channels the result was essentially unwatchable - particularly channel 3 (CBS) and channel 4 (PBS).

Two days ago, one television refused to show anything (what was shown was “Searching for Channels”).  I started debugging by bypassing the box, and that worked, indicating that the cable wasn’t broken.  I then swapped equipment with another room, and the problem stayed with the television and not the box.

Remembering that pressing “info” on the remote control would put the box in a debug mode, I found that the working television showed -19.44 dB, and the failing television showed -20 dB.

Putting a signal booster on the working television addressed the digital artifact problem.  Putting the same signal booster on the failing television didn’t help.

Working assumptions at this point: with non-digital signals, picture quality degrades linearly with signal strength.  With digital signals, viewability is more of a binary quality, and at -20dB the box simply refuses to show anything.

And there is a point after which there isn’t enough signal to be boosted.

Tracing back the line, it comes in from the street to a box, in that box there is both a 2 way splitter and a 4 way splitter, then it goes under the house and is split one final time before going to the two televisions in question.  I suspect that the final splitter was added by the builder and not by the cable company.  Similarly, I suspect that the additional 2 way splitter was added when we added a detached garage with a room on the second floor.

I then tested the signal strength at the box (before any splitters), and I found +5.6dB.

Based on this video, Time Warner should be providing me 10 to 15 dB.  So the signal strength is about a quarter of what I should be getting.  And I should be striving to get between 0 and 5 dB to each television.

The first splitter cut that in half, and the second splitter cut that by a factor of 4.  The third splitter cut that by a factor of 2.  And signal loss along the line should be on the order of another factor of 2.

That sounds like a lot, but in dB terms that’s about 18 dB of loss.  Starting with 5.6 and subtracting 18 leaves -12.4.  I am getting 7 less than that.
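
As a rough back-of-the-envelope check of that figure (the per-splitter numbers below are typical insertion losses, assumed rather than measured):

# Typical insertion losses (assumed, not measured): ~3.5 dB per 2-way splitter,
# ~7 dB for a 4-way splitter, plus a few dB for the cable runs themselves.
losses = {
  'first 2-way splitter' => 3.5,
  '4-way splitter'       => 7.0,
  'final 2-way splitter' => 3.5,
  'cable runs'           => 4.0
}
total = losses.values.inject(:+)  # => 18.0 dB
puts 5.6 - total                  # => -12.4 dB at the television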

Looking back at my splitters, the first splitter fed half of the strength to one line.  Tracing down that line, it went to my cable modem.  While that’s clearly dear to me, I suspect that this ordering was done when I had problems with my cable modem dropping signal.  I have since replaced the modem.

Reordering the splitters so that 3 lines only go through the four-way splitter and two (analog-only) signals go through both means that 3 televisions get twice the signal they were getting before, and the cable modem gets half.

Furthermore, replacing the final splitter with a signal booster means that the two televisions that were having problems now have positive signal strength.

So far, no problem with Internet, and the immediate problem with my TV service has been addressed.

Then again, I just got a notice today that four more channels will require a cable box, which leads to the following question:

If Time Warner Cable is moving towards Digital Only service, shouldn’t they be providing enough signal strength to drive all of the devices in the house?
