wp_is_mobile is a trap waiting to bite you, just avoid it

What can be wrong with a function like wp_is_mobile that just checks the user agent to determine whether the user is on a mobile device? Even if the function works as promised, the whole idea of using the user agent to detect the type of device on the server side is wrong.

Using that function (or any server-side detection really, but I focus on WordPress here) violates the core principle of responsive design: that you serve the same HTML to all users.

In practice you will run into trouble once you want to cache your HTML: you will start to sometimes get the mobile version of the site on desktop and vice versa. The "nice" thing here is that by that time the original developer has moved on, and the site owner will have to recruit someone new to fix the resulting mess. Pros just don't do that to a client.

What is the alternative? Detect whatever needs to be detected using JavaScript on the client side and set a class on the body element. What about people that turn off JS? I say fuck the luddites, let them have the desktop version on their mobile. OK, strike that: make your CSS as mobile friendly as possible, just don't worry about the UX of the luddites.
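To make this concrete, here is a minimal sketch of the client-side approach, written as a snippet you could drop into a theme's functions.php (the 800px breakpoint and the is-mobile class name are arbitrary placeholders, not anything WordPress defines):

```php
<?php
// A minimal sketch: print a tiny inline script in the footer that tags the
// <body> element with a class when the viewport looks like a phone.
// The breakpoint and the class name are arbitrary examples.
add_action( 'wp_footer', function () {
	?>
	<script>
	if ( window.matchMedia && window.matchMedia( '(max-width: 800px)' ).matches ) {
		document.body.className += ' is-mobile';
	}
	</script>
	<?php
} );
```

The same HTML is served to everyone, the CSS just targets body.is-mobile where it has to, and cached pages stay correct no matter which device requests them.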

Using Jetpack reduces raw site performance by up to 20%

It is obvious that the Jetpack plugin carries bloated code, since no one is likely to use all of its modules, but the interesting question is what the actual impact of that bloat on the site's performance is.

According to tests done by part of the Jetpack team itself, having Jetpack active, without it even doing anything, delays the time to first byte (TTFB) by about 70 milliseconds, taking it to about 470ms instead of about 400ms.

The point here is not the numbers themselves, as it is not very clear what the test setup was, but the fact that the slowdown is actually going to be noticeable and actually requires more server resources.

I assume the problem is the time it takes for the PHP interpreter to read and parse the Jetpack source files. People who host on a VPS can and should optimize that side of things: select a host with faster disk access (SSD), get better control over the file caching the OS does in memory, and cache the interpreted code with an opcode cache. People on shared hosting are just out of luck.
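If you want to verify whether an opcode cache is actually picking up that work on a given host, a quick sketch like this (assuming PHP 5.5+ with the OPcache extension available) will tell you:

```php
<?php
// A minimal sketch: report whether OPcache is active and how much it is caching.
$status = function_exists( 'opcache_get_status' ) ? opcache_get_status( false ) : false;

if ( $status && ! empty( $status['opcache_enabled'] ) ) {
	printf(
		"OPcache enabled: %d cached scripts, %.1f MB of memory used\n",
		$status['opcache_statistics']['num_cached_scripts'],
		$status['memory_usage']['used_memory'] / 1048576
	);
} else {
	echo "No OPcache: every request re-reads and re-compiles the PHP source files\n";
}
```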

And the usual caveats regarding any performance-related discussion apply: if most of your pages are served from a cache, then the performance degradation in generating a single page is probably much less important to you.

There is not much point in setting a minimal file size to deflate

Nginx has a directive called gzip_min_length which you can use to instruct it not to bother trying to compress files under a certain size. I spent a few hours searching for an Apache equivalent setting just to realize that:

  1. Gzipping and deflating, while based on the same compression technology, differ in the generated output, especially in overhead. By the names of the settings it seems that deflate is the preferred compression on Apache, which also has a gzip module, while Nginx can only gzip.
  2. For the cost of an extra 5 bytes, deflate will send a file it fails to compress as-is. For small JS and CSS files, especially after minification, the likelihood of getting a smaller file by compressing it is small, but those few bytes of overhead mean you will not end up wasting bandwidth instead of saving it unless you are really unlucky (since in the end data is sent in packets of 1KB+ in size, a few extra bytes rarely cost another packet). You still waste some CPU cycles on even trying to compress, but since we are talking about small files it should not be too bad. It would have been nice to have a way to signal Apache not to bother (hopefully the compression code does that on its own, but I don't see any documentation for it).
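For reference, the Nginx side of this looks roughly like the snippet below (the 1 KB threshold and the listed MIME types are just illustrative values, not recommendations):

```nginx
# A minimal sketch of the relevant settings in the http block of nginx.conf.
gzip            on;
gzip_min_length 1024;                             # skip responses smaller than ~1 KB
gzip_types      text/css application/javascript;  # text/html is always compressed
```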

301 redirections should be handled in the application, not .htaccess

I see many tips and questions about how to redirect a URL with .htaccess rules. On the face of it, that makes total sense: why should you waste the time to bootstrap your website code (which might include DB access and whatnot) just to send a redirect response, when the web server can do it much faster for you?

There are several reasons not to do it in .htaccess (a sketch of the application-side alternative follows the list):

  1. Unless you are redirecting most of the site, the rate of hits on a 301 should be low, but the lines containing those rules in the .htaccess file still need to be read and parsed for every URL of the site, even the ones that serve JavaScript and CSS, if you write the rules in the naive way. In contrast, your application can check whether a redirect is needed only after checking all other possibilities. Each check is slower, but the accumulated CPU time spent on it will be lower. This of course depends on your rules, on how fast your application determines that there is no match for a URL, and on how likely a URL is to require a redirect.
  2. Statistics gathering. If you do it in .htaccess, the only data you have for analyzing the redirects is the access logs, which rotate and are a bitch to parse and collect into some better storage system. In the application you can simply write the data to a DB or send an event to Google Analytics.
  3. Your site should be managed from one console, and redirects are closer to application-level configuration than to web server configuration. It is very annoying to write a new post and give it a nice URL, just to discover that for some reason it is always redirected somewhere else, without understanding why, because your administration software does not know about the .htaccess rule and you probably forgot it is there (or maybe someone else put it there).
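Here is a minimal sketch of what the application-side approach can look like in WordPress (the old-to-new path map is purely hypothetical; in a real site it would live in an option or a small table):

```php
<?php
// A minimal sketch: handle 301s inside WordPress instead of .htaccess.
add_action( 'template_redirect', function () {
	if ( ! is_404() ) {
		return; // only look for a redirect after every other match has failed
	}

	// Hypothetical map of retired paths to their new homes.
	$redirects = array(
		'/old-about-page/' => '/about/',
		'/2009/archive/'   => '/blog/',
	);

	$path = wp_parse_url( $_SERVER['REQUEST_URI'], PHP_URL_PATH );

	if ( isset( $redirects[ $path ] ) ) {
		// This is also the natural place to log the hit or fire an analytics event.
		wp_redirect( home_url( $redirects[ $path ] ), 301 );
		exit;
	}
} );
```

Because the callback only acts on requests that would otherwise be a 404, the redirect check costs almost nothing on normal page requests, which is exactly the trade-off described in the first point above.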

Will the use of HipHop VM (HHVM) help make your WordPress site faster? Unlikely

It has been a while since I last heard about Facebook's HipHop PHP optimizer project. The first time I heard of it, it was a compiler from PHP to C++, an approach I had already run into with another interpreted language (Tcl/Tk), and it is mainly beneficial for projects where, once the interpreted code (i.e. the PHP code) is stable and shipped, there is no need to modify it. In other words, you lose the ability to modify your code on a whim, which is exactly the reason why most sites today use interpreted languages.

I was actually surprised to learn that the main reason Facebook was unhappy with the compiler was that deploying the compiled code was resource intensive, and since Facebook pushes a new update about once a day, they started to look into alternatives to compiling their code into machine code.

The approach they are trying now is to write their own PHP interpreter (and a web server dedicated to running it) which uses JIT (just-in-time) technology to compile PHP code into native code and execute it. As JIT proved to be a very efficient technology when applied to optimizing JavaScript, which like PHP is an interpreted language, I find it easy to believe that it executes PHP code faster than the conventional interpreter.

But if it is faster, how come it will not make your site faster? To understand this you need to keep in mind Facebook's scale and the way Facebook works.

Facebook had, at some point, about 180k servers, so a 1% optimization would let them shed 1800 servers along with the cost of their electricity and maintenance. My estimate, based on pricing by web hosting companies, is that this might amount to saving about $100k each month. So Facebook is most likely doing this optimization to reduce cost, not to improve site speed. For lesser sites, a 1% optimization will not be enough to avoid upgrading a hosting plan, and even if there were a cost benefit, for most sites the savings are unlikely to be worth the time that would have to be invested in switching to HHVM and testing the site on it, especially since it is not a fully mature product yet (just because it works for Facebook doesn't mean it works everywhere).

The other thing to take into account is that, by its nature, Facebook can do only very limited caching, as essentially all of its visitors are logged-in users. They can still keep information in memory, in a similar way to how object caching in WordPress works, but they still need PHP logic to bring it all together. WordPress sites, on the other hand, can use full-page caching plugins like W3TC, which produce HTML pages whose serving bypasses the PHP interpreter entirely, so improvements in PHP interpretation matter very little to those sites.

It is not that HHVM is totally useless outside of Facebook, just that its impact will be much bigger on larger and more complex sites than most WordPress sites tend to be. The nice thing about it is that it is open source, and therefore the PHP JIT techniques from HHVM can be adopted into the core PHP interpreter.

The importance of priority when using the WordPress authenticate filter

I have wasted two days wondering what had gone wrong with my plugin, which does a small extra authentication step, because I didn't feel like diving deep into core code to figure it out. Once I did, I got the answer really fast: the authenticate filter has some unexpected weirdness that is unlike almost all other WordPress filters.

It is supposed to return a valid user, but the initial value passed into it from the wp_authenticate function is NULL, and not, as you might expect, a valid user or an error. The actual user validation is done by a core filter with a priority of 20, and there is another core filter with priority 99 that denies login to users that were marked as spammers.

Bottom line: if you want to implement a different username/password authentication scheme, you need to hook your function at a priority lower than 20. If you just want to enhance the core authentication, use a priority of 21-98, and if you prefer to let WordPress reject network spammers before your function is called, use a priority of 100 or above.
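As a minimal sketch, a plugin that only wants to add an extra check on top of the core authentication (the user meta key here is a made-up example) would hook in after both core filters:

```php
<?php
// A minimal sketch: extra validation after core has checked the credentials
// (priority 20) and rejected network spammers (priority 99).
add_filter( 'authenticate', function ( $user, $username, $password ) {
	// If an earlier callback returned NULL or a WP_Error, pass it through untouched.
	if ( ! $user instanceof WP_User ) {
		return $user;
	}

	// Hypothetical extra rule: block accounts flagged with a custom user meta key.
	if ( get_user_meta( $user->ID, 'my_plugin_locked', true ) ) {
		return new WP_Error( 'my_plugin_locked', __( 'This account is temporarily locked.' ) );
	}

	return $user;
}, 100, 3 );
```

A replacement username/password scheme would use the same signature but a priority below 20, returning a WP_User on success so that the core callbacks never need to run.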

The idiotic change in the fancybox license emphasizes why developers should leave licensing to lawyers

fancybox is a jQuery-based lightbox alternative. Version 1.0 was distributed under the very permissive MIT license, but for version 2.0 the developers apparently decided to try to monetize their success and changed the license to Creative Commons Attribution-NonCommercial 3.0, which basically doesn't allow usage for commercial purposes.

I am all for people getting paid for their work, especially when it is so successful, but was the license change the smart thing to do? I think not:

  • While the WordPress world shows that you can make tons of money from offering GPL software, with several theme and plugin developers making a nice amount of money from their work, it is strange to see someone trying to go against the tide.
  • "Noncommercial" is a meaningless term, as almost no one puts the effort into making a nice site without expecting to monetize it in some way. It might be direct, as with a shop site or running ads, or less direct, as with a site built for reputation. This is basically a problem with most CC licenses, as they are not intended to be used for code, and it is something a lawyer's advice might have prevented.
  • How are they going to discover that anyone has broken the license terms? And even if they do, they are unlikely to have the money to sue people all over the world.
  • What incentive is there not to pirate the code? Pirating is very easy and they don't offer any additional service like support, therefore only people who would have been willing to "donate" in the first place will be willing to pay for the license. They might even have been willing to donate more than the requested price.
  • It is easy to circumvent the license by placing the JS file on a different, truly noncommercial domain and using it from the main domain.

We can't know how many users this change has cost the developers, but by the look of the site I assume the monetization scheme didn't work too well for them. Maybe it is time to change the license to something less restrictive.

Every user that has loaded any page of your site is your user

I find that I am annoyed with the way WordPress classifies users: there are administrators, editors, authors, contributors and subscribers. This classification is based entirely on what the user can access in the WordPress admin, but most people that use your site don't have an account and therefore are not classified at all, which is a big mental mistake.

Users without an account can be:

  • casual reader – accesses your site at random intervals
  • follower – reads every new post or checks your site every week
  • commenter – leaves a comment
  • RSS subscriber – follows updates via RSS
  • email notification subscriber
  • newsletter subscriber
  • discussion follower – follows comment updates via RSS or email

And maybe there are more types. This kind of profiling of your users should help you monetize your site while keeping all your users as happy as possible.

For example, some sites don't show ads to logged-in users, treating them more as partners than as a source of income, but maybe it would be wise to treat commenters the same way?
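A minimal sketch of that idea (the helper name is made up, and the commenter detection relies on the cookie WordPress sets when a visitor leaves a comment and chooses to remember their details):

```php
<?php
// A minimal sketch: treat logged-in users and returning commenters as partners
// and skip the ad slots for them. A theme would call this before printing an ad.
function my_site_should_show_ads() {
	if ( is_user_logged_in() ) {
		return false;
	}

	// WordPress stores a returning commenter's name in 'comment_author_' . COOKIEHASH.
	if ( isset( $_COOKIE[ 'comment_author_' . COOKIEHASH ] ) ) {
		return false;
	}

	return true;
}
```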

wordpress.com requires its registered users to log in to be able to comment

When I try to comment on sites hosted on wordpress.com and use my main email address, I get a notice that says something like "The email being used is already registered with us, please login to your account".

I guess the idea is to prevent people from impersonating other commenters, but the implementation is awkward, as it assumes everyone is an impersonator until proven innocent and adds yet another step to sending a comment for anyone not currently logged in to wordpress.com. I wonder how many people just abort the comment at that stage; I know I have done it at least once.
It is also strange that you have to identify yourself against wordpress.com when there are other identity providers like Google, Facebook and Twitter which can also be used to verify the email address.

And all of this is because the idea behind the Gravatar service, which is now fully integrated into wordpress.com, is naive: you should not identify people by something as public as their email address, period.

What could they have done better? This should have been an opt-in kind of service. I don't think the chance of anybody trying to impersonate me is higher than zero, and I am willing to take the risk in order to have an easier life. In addition, the best way to verify an email address is to actually send an email to it and ask for an action to be taken. Maybe something like "we detected that you are commenting on xxxxx, if it isn't you, you can remove the comment by clicking the link yyyyy". Sure, there is a risk of spamming the email address that way, but it might be effective enough to reduce impersonation attempts to zero.

A great day for the users of the web: Yahoo and Google have smacked the adware maker Babylon

It was reported that Google will not renew its agreement with Babylon, a report that sent Babylon's stock on the Israeli stock exchange into a 70% nosedive. This came about a week after Yahoo sent Babylon a message that it is extremely unsatisfied with the way Babylon's products behave.

Not sure what Babylon is? Babylon used to be the developer of translation software which you actually had to pay for in order to use. But at some point the people of Babylon discovered that the dark side has much better cookies and more money to offer than the honest translation software business, and they started to use their familiar and mostly loved brand name to hijack browsers during their software's install, switching the search engine settings so that searches go through Babylon's search engine, which is just a proxy to the Google or Yahoo search engines.
They made money out of this scheme because Google and Yahoo pay for each referral to their engines.

Since Babylon made money from each search going through them, they made every effort to prevent the user from changing the search engine settings even after Babylon was installed. Hijacking browser settings is very annoying by itself, but making it so hard to uninstall made Babylon more of a virus developer than a legit software company.

Nothing here is new, and one has to wonder why it took Google so much time to do something about it.

Will Conduit be the next one to be smacked?