Monday, January 31, 2005

I mentioned a few weeks ago that I wanted to buy a quiet computer. I did end up ordering a computer just like Nelson’s from endpcnoise.com. The store has good (but few) ratings at resellerratings.com. It’s taking a lot longer than I expected to get the computer though. I ordered it on January 12. It’s now January 31, and as far as I can tell, the computer hasn’t shipped yet (although they have charged my credit card already!). I’m glad I didn’t pay extra for faster shipping. It would have been silly to pay $150 to have the computer in 24 days instead of 26. The order status on their online page has been showing “Pending” since I ordered. I really miss the feedback that I get from Amazon: an estimate of when it will ship, whether it has shipped, and what the shipping tracking code is. That’s one of the reasons I shop at Amazon but not at most other online stores: I know what to expect.

Monday, January 24, 2005

In some of the shows I see about time travel (and there aren’t that many), the claim is made that if you travel faster than light, you will go back in time.

I don’t see why this would happen.

If you read the laws of special relativity, you will find the formulas relating time and space. One of them (time dilation) tells you the rate at which you travel through time is sqrt(1 - (v/c)²), where v is the rate at which you travel through space. The quicker you go through space, the slower you go through time. But what happens when v > c? You get the square root of a negative number. You do not get a negative number; you get an imaginary number.
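
To see what happens to that factor, here’s a quick Python sketch (my own, using cmath so that the square root of a negative number comes back as an imaginary number instead of an error):

```python
# Rate of travel through time as a function of speed through space.
# cmath.sqrt returns an imaginary result for v > c instead of failing.
import cmath

C = 299_792_458  # speed of light in m/s

def time_rate(v):
    """sqrt(1 - (v/c)^2): how fast you move through time at speed v."""
    return cmath.sqrt(1 - (v / C) ** 2)

print(time_rate(0))        # (1+0j): at rest, full speed through time
print(time_rate(0.8 * C))  # ≈ 0.6: time runs slower
print(time_rate(2 * C))    # imaginary, not negative
```

Imaginary, whatever that means physically, is not the same as “backwards.”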

Alternatively, you can think of it as the Pythagorean theorem, where one leg of the triangle is speed through space, one leg is speed through time, and the hypotenuse is the speed of light. (Is this Minkowski space?) In this model, it makes no sense to go faster than light. You can’t have a triangle where one leg is longer than the hypotenuse.

So it’s not at all obvious to me how going faster than light makes you go backwards in time.

Another thing that bothers me is the talk of wormholes. Oh sure, if you could bend the entire universe in half, you might be able to go back in time. Bending the universe seems like a harder problem than time travel.

I do think backwards time travel could be possible. However, I think the only reasonable way to make it would be to travel from one universe to another. You have to believe in the many-worlds theory first. I don’t believe backwards time travel into the same universe can work.

Monday, January 24, 2005

Google and a bunch of other folk are now using nofollow to mark links that are not endorsed by the author of the page. I just learned that Wikipedia is using this for their external links. (I’m not convinced this is a good thing, but given that Wikipedia can be edited by anyone, it’s reasonable.) How did I learn this? By using Firefox/Mozilla custom style sheets.

Edit chrome/userContent.css (userChrome.css styles the browser itself; userContent.css styles web pages) and add this:

a[rel="nofollow"]:before {
  content: "[nofollow] ";
  color: #393;
  font-size: 8pt;
}

Now all nofollow links are marked with [nofollow]. To choose a nifty symbol instead of a word, pick a unicode symbol and use the hex code prefixed by \00 in the content: line. For example, to choose ☠, use content: "\002620";.
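
One quick way to find the hex code for a symbol (a Python one-liner; paste in whatever symbol you like):

```python
# Four-digit hex code for a character; prefix it with \00 in CSS.
print(format(ord("☠"), "04x"))  # prints 2620, so the CSS escape is \002620
```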

Monday, January 24, 2005

In shopping for a new gaming computer, I wanted to do some research on video cards. The main vendors for consumer (gaming) cards are ATI and NVIDIA. What do these companies offer? Why do they offer these products?

ATI has the Radeon line; NVIDIA has the GeForce line.

ATI has several generations, named Radeon, Radeon 7xxx, Radeon 8xxx, Radeon 9xxx, Radeon Xxxx (example: X800). ATI is at the end of the line with “X”; they’ll have to do something else soon. NVIDIA has several generations, named GeForce 256, GeForce 2, GeForce 3, GeForce 4, GeForce 5xxx, GeForce 6xxx. Notice that they switched to something closer to ATI’s naming scheme. I wonder if their next generation will be the GeForce 7xxx, which will overlap with ATI’s numbers, so maybe they too will change naming schemes.

Within each generation they have several lines (low end, high end, etc.). ATI’s Radeon 9xxx comes in 9000, 9100, 9200, 9250, 9500, 9550, 9600, 9700, 9800. NVIDIA’s GeForce 5xxx comes in 5100, 5200, 5300, 5500, 5600, 5700, 5750, 5800, 5900, 5950 (I guess they ran out of numbers). The lines differ in functionality (vertex and pixel shader models, DirectX support, OpenGL support, antialiasing, shadow and lighting effects).

Within each of these they have variants. The Radeon 9600 comes as 9600, 9600 PRO, 9600 XT, and 9600 SE; the Radeon 9800 comes as 9800, 9800 PRO, and 9800 XT. The GeForce 5600 comes as 5600, 5600 Ultra, 5600XT, and 5600SE; the GeForce 5700 comes as 5700, 5700 Ultra, 5700LE, and 5700VT; the GeForce 5900 comes as 5900, 5900 Ultra, 5900XT, and 5900ZT. Note that they aren’t really consistent in the variant names. NVIDIA doesn’t even use the same variants in each of their products. And whereas ATI’s 9600 XT is better than the 9600, NVIDIA’s 5900 XT is worse than the 5900. These variants differ in GPU speed, memory bandwidth, number of pipelines, and amount of memory. They also have “mobile” variants that consume less power and are therefore useful for laptops.

It also turns out that the same game will look different on different video cards. The rendering algorithms behave slightly differently due to differing spatial and temporal anti-aliasing techniques, anisotropic texture mapping, “shader” models, and support for various features in DirectX and OpenGL. (From what I read, NVIDIA is better at OpenGL and ATI is better at DirectX.) So you can’t just compare performance and price.

So there are lots of video cards to choose from, and each one of them available from multiple vendors (ASUS, Leadtek, ELSA, etc.). As far as I can tell, the vendors decide on things like fans, heat sinks, video in/out, DVI vs. VGA, etc. So you really have hundreds to choose from, and no really good way to evaluate them all. I can read Tom’s Hardware or ExtremeTech, but they tend to be a bit too much for me.

So in the end, I read about several product lines (GeForce 5700, GeForce 5900, Radeon 9800, Radeon X800) and a little bit about the variants and ended up choosing one. But I wasn’t happy about it.

Why do the video card manufacturers do this? I think confusion helps their profits. If I read reviews by searching for nvidia geforce 5900 review, and if I find products by using a comparison shopping engine, what will end up happening is that I will decide on a line based on reviews and then choose the cheapest variant in that line.

In practice this means that video card manufacturers should try to maximize the difference between the “plain” label and the cheapest variant. It would be better to make the plain GeForce 5900 a high end card so that it gets good reviews, but make some very cheap variants available. In fact, the GeForce 5900 comes in one higher variant (Ultra) and two lower variants (XT and ZT). The Radeon 9600, on the other hand, comes in two higher variants (PRO and XT) and one lower variant (SE). So maybe my theory doesn’t hold up.

The difference between variants is so large that in many cases, the low end variant of a higher line is cheaper than a high end variant of a lower line. (Check out pricewatch.) When I last looked at this stuff a few years ago, the GeForce 4 MX was cheaper than the GeForce 3. I had thought I was getting a really good deal, and then I learned more about the “MX” variant—it was somewhat less powerful than the regular GeForce 4.

There will be some enthusiasts who pick the “best” card (whatever that means), but most of us will end up confused. Actually, most of us will use on-board video and won’t have to worry about this. I think that’s an important part of the economics here: the manufacturers want to confuse the people in the middle who are willing to spend more money but do not know what they are doing.

This is all a form of price discrimination. The video card manufacturers make lots of variants to increase the cost of evaluating the cards, so that only enthusiasts find the best deals. They make their profit from the rest of us. I’ve read that price discrimination can increase overall economic efficiency, but that doesn’t help me feel any better about all the time I spent trying to pick a video card.

Monday, January 17, 2005

Jeremy Zawodny’s post about his Dyson reminded me that I too want to say: I love my Dyson vacuum cleaner. I got the red DC07 model (base model + floor cleaner but no animal attachment), in part because I love the color red. (I know, that’s a silly reason.)

Consumer Reports says that it’s not as good as some other (less expensive) vacuum cleaners. However, there’s an emotional attachment that many geeks have to the Dyson. I think it’s partly the Dyson story, partly the attachment system (which is annoying but cool), and partly the ease of maintenance. Or maybe it’s because the vacuumed dirt is visible so you can tell that it’s working. I can’t tell, really.

In any case, I like it.


Sunday, January 16, 2005

Quantum Cryptography involves using quantum mechanics to detect eavesdroppers. This technology will soon be available in commercial products. Something I haven’t figured out is how well practice will match theory.

In theory, you send each bit once and you can detect eavesdropping because the bit will be destroyed when the interceptor reads it. In practice, non-quantum networks have to handle bit errors, and I would expect quantum networks to have to do the same. They might use error correcting codes, retransmits, or other approaches, but they have to handle the loss of some number of bits. Given that, can’t I, as an attacker, steal enough bits so that both I and the intended recipient can reconstruct the original message? As I steal bits, it will merely look like the network is having some trouble, but the messages will still get through.

How many bits can I steal?

If I steal every bit, half the bits will be corrupted. If the recipient can deal with 50% bit loss, then both of us can read the message. But if you only need 50% of the bits, then I don’t have to steal every bit. So let’s suppose X% of the bits are required to reconstruct the original message. Half of these will be corrupted as I (the attacker) read them, so the original recipient will get (100 - X/2)% of the bits. If that’s greater than X, then I can eavesdrop without destroying the original message. Solving 100 - X/2 ≥ X gives X ≤ 66.7%, which the recipient sees as 33% packet loss.
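
Here’s that arithmetic as a quick sketch (my own back-of-the-envelope code, not anything from a real quantum network):

```python
# If a fraction x of the bits is enough to reconstruct the message,
# the attacker reads x of the bits and destroys half of what he reads,
# so the recipient still receives 1 - x/2 of the bits.

def recipient_fraction(x):
    """Fraction of bits the recipient gets when the attacker reads x."""
    return 1 - x / 2

# Eavesdropping goes unnoticed as long as 1 - x/2 >= x, i.e. x <= 2/3.
x = 2 / 3
print(recipient_fraction(x))      # ≈ 0.667: still enough to reconstruct
print(1 - recipient_fraction(x))  # ≈ 0.333: looks like 33% packet loss
```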

I don’t know how the recipient is going to distinguish between a clever eavesdropper and the network problems that happen all the time. Given that the hardware or software normally hides those errors from you transparently, you will never see anything that looks like an eavesdropper. If the network plus error correcting algorithms can handle the loss of 1/3rd of the bits, then the eavesdropper can read the entire message by reading 2/3rds of the bits, and 2/3rds of the bits will still reach the recipient, so both will see the message. Quantum cryptography vendors will need to make sure their systems don’t tolerate too many bit errors.


Sunday, January 16, 2005

This article on astrobio.net comes in a variety of formats. Look at the Display Options line.

  1. Email
  2. Print
  3. Fax
  4. PDF
  5. Word
  6. Excel
  7. PDB (Palm?)
  8. XML (RSS)
  9. Wireless
  10. Windows executable (EXE)
  11. Spanish
  12. Postscript
  13. MP3
  14. Mobile Catalog
  15. Larger Font
  16. Smaller Font

That’s sort of neat, but it seems like overkill.

Sunday, January 09, 2005

For my last computer, I upgraded from a Pentium 1 MMX (200MHz) to an Athlon XP 1600 (1300MHz). It was a whole lot faster, but it was also much hotter and noisier. My old computer had one fan (in the power supply); the new one has five fans. It’s so hot I named it heater. It’s so loud that I keep it off most of the time.

Nelson recently bought a quiet PC, and that inspired me to start thinking about upgrading. For my next PC, I’d like to get something quieter, and if possible, cooler. I found a good article explaining CPU heat, comparing Pentium and Athlon systems. Summary: the Pentium wattage specs are too low and Athlon wattage specs are too high, so even though both are around 90 watts, the Athlon runs cooler.

From what I remember from electronics class, power usage is proportional to the square of voltage. But CPU speed is (I think) only proportional to voltage. So for every additional 10% in speed, you need 21% more power. Someone tries out underclocking to get the reverse effect: a small decrease in voltage (and thus speed) leads to a large decrease in power consumption. Neat!
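
Here’s the arithmetic (my own sketch, assuming power really does go with the square of speed):

```python
# If power ~ voltage^2 and speed ~ voltage, then power ~ speed^2.
def relative_power(speed_ratio):
    """Power needed, relative to baseline, for a given speed ratio."""
    return speed_ratio ** 2

print(relative_power(1.10))  # ≈ 1.21: 10% more speed costs 21% more power
print(relative_power(0.90))  # ≈ 0.81: underclock by 10%, save 19% power
```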

Things that seem to matter in keeping the system quiet: how much heat is produced, how noisy the fans are, how noisy the case is (vibrations, rattling, etc.), and how noisy the hard drive is.

Things that seem to matter in keeping the system cool: CPU wattage, GPU wattage, hard drive, and power supply efficiency. If you can get a power supply that’s 80% efficient instead of 60% efficient, that halves the heat wasted by the power supply. Have you noticed that many AC to DC adapters for consumer devices like telephones are warm to the touch? That's the wasted heat. Many cheap devices have power supplies that are only 30% efficient. Ick.
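
To put numbers on that (a quick sketch; 100 watts drawn from the wall is just my example figure):

```python
# Heat wasted by a power supply, per 100W drawn from the wall.
def wasted_heat(drawn_watts, efficiency_pct):
    """Watts turned into heat rather than delivered to the computer."""
    return drawn_watts * (100 - efficiency_pct) / 100

print(wasted_heat(100, 60))  # 40.0 watts of heat
print(wasted_heat(100, 80))  # 20.0 watts: half as much
print(wasted_heat(100, 30))  # 70.0 watts: the cheap-adapter case
```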

I’ll probably buy a system from endpcnoise.com instead of building one myself, as I have in the past. There are just too many things I don’t understand about air flow & turbulence, quiet hard drives, heat sinks, thermal compound, ball bearings, vibration, heat pipes, etc., and I’d rather pay a little more to have an expert choose parts and assemble something quiet.