Google Blogoscoped Forum

Google Strict vs Google Deprecated

Seth Finkelstein [PersonRank 10]

Thursday, August 10, 2006
17 years ago · 13,153 views

I think you're underestimating the importance of backwards-compatibility. Google's home page has to be *bulletproof* – it should work on as many browsers as possible. Using a .PNG instead of a .GIF is then ruled out by such a requirement.

I suspect CSS may cause some older browsers to crash.

Philipp Lenssen [PersonRank 10]

17 years ago #

> I suspect CSS may cause some older browsers to crash.

No... older browsers completely ignore it. So the worst that can happen – and this is true even for such things as the deprecated font tag which doesn't work in very old versions of Netscape – is that some layout preferences are not shown.

The GIF vs. PNG thing was just an aside... you are right, PNG degrades *much* worse than GIF.

TOMHTML [PersonRank 10]

17 years ago #

They use gzip, so there's no real difference.

/pd [PersonRank 10]

17 years ago #

When we talk of older browsers and backwards compatibility, let's be realistic about that: how many users are actually using IE5 or Netscape 1.0?

Tiago Serafim [PersonRank 4]

17 years ago #

I think that having more external files (like you did with the .css and the .js) may cause a huge increase in the number of requests (and connections, depending on whether the connection type is "keep-alive" or "close") on Google's servers.

NateDawg [PersonRank 10]

17 years ago #

Off subject, but in line with what /pd said:

When ESPN was debating whether or not to support older browsers, they pulled up site stats and found this:

<quote>
The only substantial groups among the non-compliant browsers were IE 4.x at 1.32% and Netscape 4.x at 1.17%.
</quote>

ESPN.com's conversion to CSS is a great example of how a few bits add up. Tony's suggestion could save Google millions of dollars with the amount of traffic they receive.

Link:
http://www.mikeindustries.com/blog/archive/2003/06/espn-interview

pri [PersonRank 5]

17 years ago #

[put at-character here]NateDawg – ESPN and Google are much different sites in terms of users. ESPN, I'm guessing, is more or less for the US population. Google, on the other hand, is used by almost ALL computers in the *world*. It might be true that 1.32% of people in the US use IE 4.x, but in other countries in Asia, Africa and probably Europe, that percentage is much higher. So cutting off those old browsers might not affect ESPN drastically, but it would affect Google a whole lot more. Like Seth said up top, it needs to be bulletproof.

A suggestion would be, though, that for a certain user (or country or other demographic), Google could send a different type of page (one with CSS/JS/XHTML) and link them accordingly. It's highly reasonable to assume that most people are using the same browser and version across most computers. That way they could decrease the bits and bytes, but I don't know how much it would help them.

/l [PersonRank 1]

17 years ago #

Nate, 1.32% is a lot of traffic for Google...

Ryan Bergeman [PersonRank 0]

17 years ago #

The "strict" page doesn't even validate:

http://validator.w3.org/check?uri=http%3A%2F%2Fblog.outer-court.com%2Fgoogle-strict%2F

Philipp Lenssen [PersonRank 10]

17 years ago #

Sorry, that was a PHP error, not an HTML error. I validated the HTML offline – it was valid – but turned it into a PHP file on my server. It's fixed now.

PHP bug:
header('Content-Type: text/html; charset=CHARSET=UTF-8');

PHP fix:
header('Content-Type: text/html; charset=UTF-8');

Tony Ruscoe [PersonRank 10]

17 years ago #

Philipp – The action's broken on your search box in that Google Strict page:

i.e. this: http://blog.outer-court.com/search?q=google

should be: http://blog.outer-court.com/search.php?q=google

Abigail Moongazer [PersonRank 0]

17 years ago #

It's a little sad to hear some of the excuses for not moving to standards still being touted as facts in 2006. It's also sad to hear regional areas being trotted out as a reason not to use standards.

The fact of the matter is that standards degrade VERY nicely and load faster (if Google had a design that was even a tiny bit more complex this would be more significant, but the argument that they've shafted themselves with a minimalistic design which doesn't let them show off their own innovations is a different argument altogether), and this is *more* important to areas with dial-up access and old systems than getting identical-looking websites. My basis for this is friends and family who work in various regions all over the world with crazy things like dial-up access on party lines. They don't have time for things to load without losing their connections when people force visual compliance for non-critical elements of information-based pages.

Jonathan Harris [PersonRank 0]

17 years ago #

*sigh* The fact is, apart from this page, really no one CARES!
Why should they? Take the top 10 sites – Amazon, Yahoo, Google, MSN, YouTube, delicious, MySpace. Do they validate? Of course not – the web is about functionality and information, not the anorak who fritters his life away worrying about whether a machine is going to validate a web page properly....

Philipp Lenssen [PersonRank 10]

17 years ago #

> The action's broken on your search box in that Google Strict page:

Yeah, I left all of the links from the original Google page, so most are broken... but you're right, why not let this one point to the Google Blogoscoped search... done!

Philipp Lenssen [PersonRank 10]

17 years ago #

> not the anorak who fritters his life away worrying
> about whether a machine is going to validate a web-page properly....

The point is not to validate for validity's sake – though that also has some pros* – the point is that Google says they do worry, but they worry for the wrong reasons (file size). CSS in general makes your developer life easier, not harder. Yes, there is a dramatic bonus when it comes to cross-media support, but that has little pragmatic effect in 2006 (because of broken browser implementations on e.g. mobile platforms).

*The bonus of validation for the sake of validating (i.e. even getting rid of errors that don't cause problems with browsers) is that you have an easier life debugging, as you will see *important* errors (those that do cause problems with browsers). Disclosure: my blog doesn't validate whenever I embed Google Videos, Yahoo Search Builder and such. :)

Ludwik Trammer [PersonRank 10]

17 years ago #

I think this is important, because Google should be an example for webmasters. Philipp should ask Matt for a comment about this.

Deryck Hodge [PersonRank 0]

17 years ago #

Google always cites snappy page loads, rather than page size. While page size affects load times, it doesn't affect render time.

I wonder what the difference in speed would be for the browser to render as strict versus old-school markup. Is there any difference? Is there a sane way to test that.... ?

Tony Ruscoe [PersonRank 10]

17 years ago #

I'm not sure I understand the difference between render time and load time. Do you mean *download* time?

<< Is there a sane way to test that.... ? >>

You could set a client-side (i.e. JavaScript) timer to start at the very top of the page and then alert the time "onload". That would theoretically show you the time it takes the page to load/render in the browser (as opposed to the download time).
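Something like this should do the trick – just a quick sketch of a test page (untested), along the lines of the .php test pages above:

<?php
// test page: start a timer at the very top, report when "onload" fires
header('Content-Type: text/html; charset=UTF-8');
?>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
  "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
<title>Render time test</title>
<script type="text/javascript">
// take the start time as early as possible
var startTime = new Date().getTime();
window.onload = function () {
  // time from the top of the page until the browser fired "onload"
  alert((new Date().getTime() - startTime) + ' ms until onload');
};
</script>
</head>
<body>
<!-- the markup you want to time goes here -->
</body>
</html>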

Colin Colehour [PersonRank 10]

17 years ago #

I believe they use deprecated tags because the Google homepage will then work in just about any browser. Being standards compliant would be nice, but realistically, how many browsers are 100% standards compliant? I think that can be answered with "None". All of the newest browsers implement the standards in their own little way, and not all of it is the same across each browser.

Last time I checked, there was, what, like one browser that passed the Acid2 test; it might be up to two now.

Also, if your site had 60% of all internet searches and you could save gigs of bandwidth by leaving out quotes and whatnot, I would do it with my site too.

Ludwik Trammer [PersonRank 10]

17 years ago #

> I wonder what the difference in speed would be for the browser to render as
> strict versus old-school markup. Is there any difference? Is there a sane way to
> test that.... ?

There is a HUGE difference in rendering time between HTML and XHTML (with the proper MIME type). In the case of HTML, the browser has to be compatible with all the improper crap from the last 15 years and has to guess which HTML mistakes the webmaster will make. XHTML is just strict, simple XML, so browsers render it faster and use less CPU.

Philipp Lenssen [PersonRank 10]

17 years ago #

Well, when the browser encounters Google.com it switches to quirks mode, which reproduces old rendering bugs. If it encounters a doctyped page, it will render the page in the modern compliant mode. You can see this by right-clicking any page and checking the properties.

By the way, this should be proof Google doesn't really care about *very old* browsers (Netscape 2 on Windows 98, pointed to the current Google homepage, returns several errors and page layout problems – as does almost any other page, valid or not, of course):

http://blogoscoped.com/files/netscape-2-google-com.png

Ludwik Trammer [PersonRank 10]

17 years ago #

> how many browsers are 100% standards compliant ?
> I think that can be answered with "None".

Every modern browser except Internet Explorer is almost standards compliant. IE isn't, but this doesn't mean that your pages shouldn't be standards compliant. It only means that you shouldn't use those particular standards that IE doesn't understand (in fact you can, for other browsers' sake, as long as it degrades well).

Internet Explorer follows the basic standards, and you can, in a proper standards-compliant way, do almost everything that you can do in a deprecated way. It's only the advanced standards that IE doesn't know how to use.

pauldwaite [PersonRank 1]

17 years ago #

> "CSS can be used so it degrades nicely, meaning that say Netscape 3 users might get a gray background instead of a white one."

If maximum brand consistency is your main driver, then a gray background for Netscape 3 users isn't degrading nicely. As mentioned above, you never really know who's using what old browser to view your site, and there's no guarantee that people won't start using new browsers that don't support CSS, e.g. lots of web-enabled phones.

I'm not saying maximum brand consistency *should* be your main driver, or that it *is* Google's, but bear in mind that the benefits of conceptually clean HTML + CSS don't come without drawbacks. The fact that they often outweigh the drawbacks doesn't remove the potential for cases where the approach is unsuitable.

Jason [PersonRank 0]

17 years ago #

Hmmm, anyone else note the errors on this site? :-P But again, who really cares?

http://validator.w3.org/check?uri=http%3A%2F%2Fblog.outer-court.com%2F&charset=%28detect+automatically%29&doctype=Inline

Ludwik Trammer [PersonRank 10]

17 years ago #

I've just tested both versions in Mosaic, the first popular graphical web browser ever. Neither looks very good, but both are fully functional. I had to save the XHTML version to my local disk, because Philipp's site doesn't work in Mosaic at all. Mosaic doesn't support the "Host" HTTP header, so it always displays the default page for a given IP, in this case http://212.227.103.36

Nobody uses browsers that don't support CSS, because they are ancient and no modern webpage works in them. Of course I'm not talking about modern non-graphical browsers, like text, audio or mobile browsers – modern web standards are designed to work with them.

Joe Clark [PersonRank 1]

17 years ago #

This isn't a question of 100% browser support of standards, which, with the many permutations involved, is functionally impossible. There is nothing esoteric about the HTML used on the Google homepage.

Also, if you wanted a better comparison, use HTML with all optional elements, like html and head, removed, and unnecessary closing tags omitted. (Don’t use a DOCTYPE, obviously.) Then do a comparison with *that*, which people could also try on Mosaic and Netscape 2 and suchlike.

Shii [PersonRank 1]

17 years ago #

Remove all the retarded /s and make it HTML4, and you save even more space. Because XHTML is the stupidest thing the W3C has ever done.

knalli [PersonRank 1]

17 years ago #

Ehm... first of all: delicious tries to be valid Strict – and I know there was a valid version... actually, the current page is not so good.

And then... it is not only the "XHTML thing"... if a page has a semantic structure, it can be usable in non-CSS viewers or browsers (and that means not only old browsers, but also screen readers, e.g. for blind users).

XHTML: It is not a stupid thing – it is logical and based on XML as a meta-language.
HTML 4 Strict is okay, and it is only a short step to XHTML 1.0 – but in fact, all the other HTML variants are crap...

I think a rewrite with HTML 4 Strict and CSS and so on would have the same result...

Jared [PersonRank 0]

17 years ago #

XHTML 1.0 Strict is supposed to be served with an application/xhtml+xml MIME type. If you're not going to do it properly, you should use 1.0 Transitional. Either way the browser is going to treat it like tag soup, just as if you were using HTML 4.01.

Ludwik Trammer [PersonRank 10]

17 years ago #

> either way the browser is going to treat it like
> tag soup just as if you were using HTML 4.01.

That's not true. Firefox and Opera treat pages served as "application/xhtml+xml" as XML and behave in a completely different way. Internet Explorer doesn't display such a page at all. But it's not true that you have to use this MIME type – it's just a suggestion.

Charles Verge [PersonRank 1]

17 years ago #

Updating the HTML as suggested provides little benefit to the users and causes more traffic for the Google servers. If you are going to have two files, you will need to add at least .5 k for the extra header / HTTP exchange. That means your CSS + JS file really takes up 3.35 k, which is 10% more than the stated 3.08 k. Not to mention the extra RAM required for the increase in connections.

With that in mind I agree with Tiago Serafim.

Ludwik Trammer [PersonRank 10]

17 years ago #

> That means your css + js file really takes up 3.35.

That's true, but most browsers keep those files in cache, and some of them use techniques that fetch all the files over one connection, like pipelining.

Philipp Lenssen [PersonRank 10]

17 years ago #

> Hmmm, anyone else note the errors on this site?

I already mentioned above that my blog doesn't validate whenever I include Google Video or Yahoo Search Builder code. This is just showing that Google doesn't care much about validating. Try a page without Google code in it* and you'll see it validates.
http://validator.w3.org/check?uri=http%3A%2F%2Fblog.outer-court.com%2Farchive%2F2006-08-10-n50.html

Philipp Lenssen [PersonRank 10]

17 years ago #

Jared
> XHTML 1.0 Strict is supposed to be served with
> a application/xhtml+xml mime type, if you're
> not going to do it properly you should use 1.0
> transitional

The W3C says you are *allowed* to serve XHTML with the text/html MIME type:

"XHTML Documents which follow the guidelines set forth in Appendix C, "HTML Compatibility Guidelines" may be labeled with the Internet Media Type "text/html" [RFC2854], as they are compatible with most HTML browsers. Those documents, and any other document conforming to this specification, may also be labeled with the Internet Media Type "application/xhtml+xml" as defined in [RFC3236]."
http://www.w3.org/TR/xhtml1/

But this is an endless discussion, so I'll leave it at that :)

Charles Verge
> Updating html as suggested provides little benefit to
> the users and causes more traffic for the google servers.

No – the CSS is cached much more heavily on clients than the HTML is, so overall traffic may decrease.

Matt Nordhoff [PersonRank 1]

17 years ago #

You may be allowed to send it as text/html, but it ends up being treated as tag soup, AFAIK.

FWIW, OptiPNG would probably get the size of that PNG down even more: http://optipng.sourceforge.net/

(Oh no! Acronyms!)

Ludwik Trammer [PersonRank 10]

17 years ago #

> You may be allowed to send it as text/html, but it ends
> up being treated as tag soup, AFAIK.

It wouldn't be treated as XML, but it would be rendered in strict mode (as opposed to quirks mode), so it would be better than normal tag soup.

And you can use MIME type negotiation, so browsers that can handle application/xhtml+xml get it, and Internet Explorer gets "text/html". It's the best solution.
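In PHP that boils down to a few lines – roughly something like this (just a sketch, not tested):

<?php
// serve the strict page as application/xhtml+xml to browsers that
// say they accept it, and as text/html to everything else (i.e. IE)
$accept = isset($_SERVER['HTTP_ACCEPT']) ? $_SERVER['HTTP_ACCEPT'] : '';
if (strpos($accept, 'application/xhtml+xml') !== false) {
    header('Content-Type: application/xhtml+xml; charset=UTF-8');
} else {
    header('Content-Type: text/html; charset=UTF-8');
}
// ... output the XHTML 1.0 Strict page here ...
?>

A real implementation would also look at the q-values in the Accept header, but a simple substring check covers the common cases.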

Andrew Hitchcock [PersonRank 10]

17 years ago #

I was thinking about this the other day. I think the argument of not validating to save space is invalid now that Google has Google Video and Gmail... which I imagine are huge bandwidth users. A single attachment in Gmail can cancel out hundreds of users' worth of bandwidth from the front page. Also, I regularly download tens of megs of videos from Google Video.

kjpweb [PersonRank 0]

17 years ago #

Question: Does it work?
A: Hell yeah!

Q: So why would any of these standards matter at all?
A: Ah....

Sometimes it seems that people have standards just to have them. In this case there is NO justifiable need to apply any changes, since the page works.

Although I am ready to withdraw this comment – if you are able to show just one (1) practical benefit an average user would have from Google adhering to standards. (And no – the knowledge of Google adhering to standards does not count as a benefit.)

gumbo [PersonRank 0]

17 years ago #

sheltered.

Most who have commented couldn't even imagine the amount of traffic Google will have in a second, let alone a day or month. Its user base is so large and vast that everyone has to be catered for, and I believe that even in the oldest browsers, you could test it and it would functionally work.

Let's be clear: removing quotes and white space (even if using a compression technology and/or a CDN) will save a vast amount of money – especially as a CDN isn't cheap – which can be spent on newer, more expensive ventures – video etc.

Speed of delivery isn't the issue. Page load being sub-second makes little difference to 99.99% of users. Delivery speed can be increased in a number of ways as well as by reducing page size. A CDN will do this if done correctly.

Response filters will be run to remove all white space, so if a space is required, it will have to be specified with a &nbsp;

Kids, get a grip, get out of your small-website thinking and scale what your site does in a year to what Google does per second. Then try and imagine how you could cut costs. Also, new technologies aren't always the answer; think about the size of the market you will lose because you think someone should come out of the dark ages and use Firefox. Maybe that one user who can't view that PNG would have made you a millionaire....

good night
x

Ludwik Trammer [PersonRank 10]

17 years ago #

> Kids, get a grip, get out of your small websites thinking
> and scale what your site does a year to what google does
> per second. Then try and imagine how you could cut costs

And? The standards-compliant CSS version is smaller, and its .css and .js files are kept in the browser's cache, so it uses a lot less bandwidth.

> think about the size of market you will loose becuase
> you think someone should come out of the dark ages
> and use firefox. Maybe that 1 user who can't view that png

What are you talking about? PNGs work in IE (as long as you don't use alpha channels), and W3C standards are designed to work in as many programs and devices as possible. That was the main goal.

SirNuke [PersonRank 1]

17 years ago #

Standards help guarantee that a web page can be rendered identically in all browsers, as well as in future browsers. As the years go by and new browsers get released, it will become increasingly hard for programmers to write rendering engines that render older, non-standard code correctly. This is largely a result of the fact that standards abuse evolves. If Google were to change its pages to be standards compliant, it could pretty much guarantee that all future browsers can render Google's pages perfectly (or at least, if they don't render Google's pages correctly, it certainly isn't Google's fault). Now realistically, no web browser developer is going to release their browser without getting major sites such as Google to work correctly. However, Google has the money and the talent to make their pages standards correct, so why take that chance?

For a real-life example of standards abuse leading to disaster, I simply point to the DirectX API. Many games abuse the API (often taking advantage of unpublished quirks), leading to many older games not working in newer versions of DirectX. Because so many Xbox games abused the DirectX API (and the Xbox API in general), Microsoft has had a lot of trouble getting Xbox games to work correctly on the Xbox360.

As for my opinion on the matter: Google doesn't use web standards because the original website was written by Page and Brin, who didn't (at the time) know HTML very well. Judging by the fact that Google's main page has (in all honesty) barely changed over the years, I think it is a logical extension that the code style hasn't changed, since it, like the page design, works well enough.

Matt Nordhoff [PersonRank 1]

17 years ago #

(In reply to http://blogoscoped.com/forum/61418.html#id61650 [edit: corrected link – Nate]):

Ludwik: Ah, okay. That sounds good. Still seems a bit wrong to me to send XHTML as text/html, even if it doesn't do any actual harm.

Ludwik Trammer [PersonRank 10]

17 years ago #

> Still seems a bit wrong to me to send XHTML as text/html
> even if it doesn't do any actual harm.

I don't think anyone says that text/html is not allowed with XHTML 1.0. There is a huge controversy in the case of XHTML 1.1. A friend of mine who is a member of the W3C assured me that it's 100% allowed; some other people say it's not. But you can always use the non-controversial version 1.0. And remember – it's Internet Explorer that is broken, and you can't do anything about it. So it's better to use MIME type negotiation, so the page looks the same everywhere, but IE users get the best IE can render – a normal page with CSS – and modern browsers get a faster page that is rendered as XML. In this case you use the most modern technology possible, and there are no downsides.

Mathew Patterson [PersonRank 0]

17 years ago #

I heard a podcast by Nate Koechley from Yahoo – he said all their research had shown that external CSS and JavaScript was actually cached a whole lot less than they expected – so they now often put scripts and CSS in the page to save on connections.

I think it was from his talk at SXSW.

Philipp Lenssen [PersonRank 10]

17 years ago #

> In this case you use the most modern technology
> possible, and there are no downsides.

Plus, as soon as you decide the old broken IE's distribution is getting to be very small, you can switch to real XML very easily... by just setting a new content type for HTML files.

Philipp Lenssen [PersonRank 10]

17 years ago #

> I heard a podcast by Nate Koechley from Yahoo
> he said all their research had shown that external css
> and javascript was actually cached a whole lot less
> than they expected – so they now often use scripts and
> css in the page to save on connections.

Even if that's the case, you can still create valid XHTML by including the CSS and JS in the HTML head...

happyfunball [PersonRank 0]

17 years ago #

Google uses gzip compression on its pages, making its page much smaller than your 'strict' version.

Also, saying CSS and JS files are cached ignores the fact that HTML files are cached too. So Google's page wins over the strict version there as well. Its page is cached, and cached as compressed binary to boot.

It's better to have a slightly larger page than a link to another file (such as CSS, an image, or JS). Each request has an overhead cost in TIME. Each file request can get hung up and sometimes take a second or two.

TCP packet sizes are important here, as well as Ethernet packet sizes. If your PC's packet size is 1500 (most likely), then sending a 1400-byte file takes just as long as a 900-byte one. So your "optimization" once again is useless.

It seems to me that you should be LEARNING from Google, not trying to teach them something.

Ludwik Trammer [PersonRank 10]

17 years ago #

> Google uses gzip compression on its pages,
> making its page much smaller than your 'strict' version.

You can still gzip the strict version, so that totally doesn't matter.
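In PHP, for example, that's a one-liner at the top of the script – just a sketch, assuming the zlib extension is available:

<?php
// compress the output with gzip; ob_gzhandler checks the browser's
// Accept-Encoding header and falls back to plain output if needed
ob_start('ob_gzhandler');
header('Content-Type: text/html; charset=UTF-8');
// ... output the XHTML 1.0 Strict page here ...
?>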

> Also, saying css and js files are cached ignores
> that hmtl files are cached too.

There is a HUGE difference. Browsers tend to download a new version of HTML pages every time, but .css and .js files and images are kept in cache. Those files are always reused within the same session.

> Its better to have a slightly larger page,
> than a link to another file

More and more browsers ask for many files over one connection ("pipelining"). And you ignore the fact that you can include the CSS and JS in the head of strict HTML and the code will still be smaller...

Ludwik Trammer [PersonRank 10]

17 years ago #

> Plus, as soon as you decide the old broken IE's distribution
> is getting to be very small, you can switch to real XML very
> easily... by just setting a new content type for HTML files.

You don't have to. There is MIME type negotiation, which does this automatically. Browsers declare whether they can handle the "application/xhtml+xml" MIME type, so those that can get it, and the others don't. When there is an Internet Explorer that can, your server will send it the correct MIME type automatically.

Look here – http://slo.bednarska.edu.pl/~ludwik/naglow.php – those are the headers sent by your browser. The key to this is the "Accepted" header. In browsers that accept "application/xhtml+xml" you can see it in this header; in others you can't.

Ajay D'Souza [PersonRank 0]

17 years ago #

You have brought up a very good point. This isn't true only of Google but of most of the big companies.

Their code seldom validates even as Transitional. Asking for Strict is like asking for too much then!

Well, it's probably a don't-care attitude that says "we are big, who is going to stop us".

Philipp Lenssen [PersonRank 10]

17 years ago #

> Its better to have a slightly larger page, than a link to another file

I'm repeating myself: validation is independent of whether or not you decide to outsource the CSS. So this point is moot. Still, I don't believe Google cares all *that much* about bandwidth counted in bytes; they linked to Google Video from their homepage just now... the problem of bytes on the homepage must be totally nonexistent compared to that.

> It seems to me that you should be LEARNING
> from Google, not trying to teach them something.

If you believe Google is almighty and flawless, congrats to their PR department ;)

Matthew K Poer [PersonRank 0]

17 years ago #

The non-standard version is only 150kb, which does add up, I suppose, with Google's traffic.

Why not just replace the graphic with colorized text?

Michael S [PersonRank 1]

17 years ago #

Personally, I have never seen any benefit from all of these (relatively) new "standards" that have cropped up. Generally, adding extra text and increasing complexity to comply with an artificial standard makes no sense... especially when the older, simpler way still works.

CSS was a step forward, there I will agree, but adding a DOCTYPE is a huge waste of space. And the obsession with putting everything in XHTML is simply silly.

When browsers are released that no longer can render the quirks, I suppose I will be more concerned then, but until then, I like to follow the KISS principle. Keep It Simple Stupid.

Simple, Small and effective > Standards, every time.

  

Jason [PersonRank 0]

17 years ago #

> to comply with an artificial standard.
> Simple, Small and effective > Standards, every time.

I have to agree. I think we (web developers) waste an inordinate proportion of our development time trying to conform to the "artificial standards" imposed by the W3C....

The *more* standards-compliant browsers (Firefox, Opera, etc.) STILL make up too little of the market to develop primarily for them. Depending on your data source, the current IE market share still stands anywhere between 50% and 90% of all internet traffic. And the percentages for portable browsers (phones, etc.) are still being quoted in the sub-2% range, making them mostly irrelevant.

Even the "standards" that are reasonably supported across the majority of browsers (like CSS 1 and JavaScript) suffer issues like the Firefox/IE box model differences, strange implementations of JavaScript and different DOM objects.

Almost all of these "standards" are being implemented after the fact... By this I mean that browser developers are inventing features, and then the W3C is telling them and other developers how these new features should be implemented.

For example: Microsoft developed XMLHttpRequest in 2000 based on code and concepts that they had developed in 1997. In 2002 the W3C created a standard for this control and dictated how it should be implemented by all browsers (let's remember that the W3C has never released a browser). The W3C implementation is different from the Microsoft implementation, meaning that the code for this control is now different on all conforming browsers and Microsoft will now have to re-write the control in IE (which they are doing for IE 7).

Whilst I don't especially care for Microsoft, the principle here is that a so-called governing body (the W3C) is dictating standards for technologies being developed by other companies, and we're all pooh-poohing those companies for not complying with standards that were implemented after the fact.

In the longer term, all we will gain from standards compliance is a lack of diversity in our browsers and a slow-down of new technologies becoming available for the web, as the companies pioneering the new technologies become more hindered by the standards and by those of us calling for compliance.

Google is one of these pioneering companies; they are working on the implementation of numerous emerging technologies (voice communications, Ajax applications, desktop search engines, etc.) and these technologies are all offered as *free to the user* services.

If Google finds that their overall user base will benefit (meaning that Google will also benefit) from ignoring the standards, then I strongly believe that they should.

Matt Nordhoff [PersonRank 1]

17 years ago #

[put at-character here] Ludwik http://blogoscoped.com/forum/61418.html#id61990 :

I know it's allowed, and I'm not saying that it shouldn't be allowed, but it's still wrong, IMO. It's saying that XHTML is HTML.

[put at-character here] Ludwik http://blogoscoped.com/forum/61418.html#id62292 :

"Accept". not "Accepted". :P

[put at-character here] Jason http://blogoscoped.com/forum/61418.html#id63133 :

The W3C is at least partially responsible for the Amaya browser: http://www.w3.org/Amaya/

Justin Moore [PersonRank 0]

17 years ago #

The bandwidth savings would be tremendous for folks using even moderately modern browsers. By referencing external files for the CSS and JavaScript, the browsers would cache those files, and they would then not need to be loaded from the server on every page load. This was one of the main reasons that Slashdot moved to CSS...

Whiskey [PersonRank 1]

17 years ago #

Ok, maybe they are doing it because of people who use non-PC browsers? There's the @max appliance, which surfs the web – you can google stuff like this – but if I try your new version it breaks. Why? Because the browser is old; it does not know what to make of JavaScript or CSS, nor even PNGs...

I know that you are right in what you are saying; all I'm saying is that maybe that's the reason... once all the appliances get thrown out, or updated, then maybe they will start thinking about that... (then again maybe not, they are THE Google, right? A verb now, as far as I know, lol).

Philipp Lenssen [PersonRank 10]

17 years ago #

Whiskey, CSS doesn't break old browsers, which simply ignore it. They will just get the raw content of the page, which is good enough in most contexts (e.g. mobile browsers). And JavaScript is independent of Strict vs. Deprecated – Google uses JavaScript as well. The PNG, on the other hand, was just an unrelated issue – you might as well use a GIF on a Strict page.

I think it's more realistic to say Google *doesn't care* about validation in many services, rather than to look for artificial reasons why they planned this deprecated HTML to work better. There are even services *breaking* the rendering of developer pages due to quirks mode rendering provoked by sloppy HTML (e.g. the Google Personalized Homepage API).
