It does indeed seem that way, though I don't know why you would sell a really good processor at the cheap processor's price. Seems more logical to just charge 50 more bucks for the chip itself.
Basically, the difference between a cheap chip and an expensive one often comes down to quality control. Most of the chips that come off an assembly line are flawed in some way, so the manufacturers turn off the parts that don't work properly and sell them as cheaper models, despite being based on the same design as the higher-end ones.
Geez, so much happens when I'm not on the internet for like a day. I still have to read up on this myself, but suffice to say DI is on point. There is chip harvesting/binning that goes on, where the cheaper chips are physically the same as the more expensive ones but are run at a lower speed and/or have parts disabled (or sometimes even destroyed). If the disabled bits in these unlockable chips were actually faulty, I don't know how this scheme would work, but a lot of the time they aren't faulty at all. In fact, there's a notable market on the AMD front for core unlocking, where triple cores and sometimes even dual cores can be turned into quad cores. That isn't sanctioned by AMD and voids your warranty, but then, it's free if you have a motherboard that can do it. It just sometimes doesn't work if you have a chip with actual faults, so it's a gamble.
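To picture the binning logic, here's a little toy sketch I cooked up (all the numbers and tier names are invented, this isn't any real product line): one design comes off the line, and post-fab testing sorts each die into a price tier.

```cpp
#include <cstdio>

// Toy model of binning, with invented numbers: every die is the same
// design, and testing after fabrication decides which SKU it ships as.
struct Die { int working_cores; int stable_mhz; };

const char* bin(const Die& d) {
    if (d.working_cores >= 4 && d.stable_mhz >= 3000) return "flagship quad";
    if (d.working_cores >= 4) return "budget quad (clocked down)";
    if (d.working_cores >= 3) return "triple core (one core fused off)";
    return "scrap";
}

int main() {
    Die samples[] = { {4, 3200}, {4, 2600}, {3, 3100}, {2, 2500} };
    for (const Die& d : samples)
        printf("%d cores @ %d MHz -> %s\n", d.working_cores, d.stable_mhz, bin(d));
}
```

Same silicon going in every time; the test results decide the price tag coming out.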
EDIT: read about it. An extra 1MB of cache and Hyper-Threading don't sound like much of a bargain for $50, to be honest.
Oh it'll probably work pretty reliably. It looks like this will apply to specific models, so they will probably make sure these can function fully when unlocked. It'd be pretty silly if they didn't.
Naturally people will complain about paying extra to allow something they bought to function at its full potential. Articles like this one are loaded with commentary.
So it is then. They are selling fully functional equipment whose full potential is held hostage unless you pay an additional fee to unlock it.
It makes economic sense though; it is presumably cheaper to have one giant line knock out the exact same processor and then modify the end result than to run multiple assembly lines.
Especially when the material costs are in the fractions-of-cents range.
I understand the supply chain side of it, but it doesn't change the impression on the consumer side: they have hardware that is potentially more powerful, but they can't use it until they pay an additional fee. It's like buying an iPhone that only has half of its memory available for storage; if you want the rest, you need to pay extra to unlock it. From the consumer's side, those two scenarios are indistinguishable.
Just be glad it isn't a monthly fee to keep the thing running.
I'm doing it again. Thinking about tech things. Got nVidia on the mind. This is a company that makes graphics chips. In the past they were called "graphzilla" as a play on one of Intel's nicknames, "chipzilla". That's a fact that no doubt grates terribly on nVidia, who has long toiled under Intel's great shadow.
So, anyway, nVidia is hosting a "GPU Technology Conference" this week. A nice follow-up to Intel's big show, which finished up just a short while ago. It's got me thinking about nVidia, and not just graphics cards. They do more than graphics, or maybe I should say "did".
See, not so very long ago they also made chipsets. These are the chips on the motherboard that act as the central hub, connecting everything up to the processor and making it all work. If the processor is the brain, the chipset is the nervous/circulatory system(s), if you can follow that analogy. Well, anyway, nVidia used to make these. They made some for AMD and they made some for Intel.
Well, in order to do this for Intel, they needed a license to produce a chipset that uses Intel's proprietary processor bus (AMD has an open bus 'cause they are all nice and shit). They got such a license from Intel for the Pentium 4s and the Core 2s, and even the lowly Atom processors, but lo and behold, Intel decided that the "Core i7/i5/i3" processors were going to use a different bus. Another new proprietary bus. Naturally they decided that nVidia's bus license did not apply to this new bus.
To be clear, nVidia got that license through a cross-licensing agreement that gave Intel access to some nVidia tech, and now Intel has decided they don't like that bus anymore and will stop using it. Yet, legally, Intel still gets the nVidia goodies. Awesome deal, huh? Even more curious is that Intel hasn't given out bus licenses to anyone at all this time around.
Years ago there were several companies making chipsets for Intel processors; now Intel is the sole provider. People always used to favor Intel's own chipsets with their processors anyway, but now it's not even a choice, ya know? I dunno, this kinda irks me a bit, but I'll live. They've shifted half the functionality of the chipset directly into the processor now anyway. The northbridge portion is basically gone, swallowed up by the processors of today and tomorrow. AMD's on the same kick with the help of their ATi acquisition.
The end result of all this is that nVidia's whole chipset division is gone. POOF! No more. Now they are back to graphics. Just graphics. In a world where the competition offers graphics, chipsets, processors, and even more! To make matters worse, they were super late with their latest generation of graphics chips, and they aren't as good as they ought to have been. Meanwhile, AMD has the old ATi graphics division working better and tighter than ever before. nVidia, who only has graphics, isn't even dominant at it right now, to be honest.
"Graphzilla" has seen better days.
Aaaaand, I could go on from here but I think I'll give it a rest here for tonight. Don't worry, I'm sure I'll dump another textwall on you all later continuing this just in case anyone actually cares to read it all.
So I may as well continue this whole nVidia kick I had going, huh? Well, certainly.
I guess I'll start by saying I lied a little at the end of the last one. nVidia has more than graphics. See, they saw the need to diversify a while back. If you've looked into nVidia marketing in the past few years, you may have heard the term "CUDA". This is sort of a programming language/platform for their graphics cards. Y'see, they saw an opportunity to get into the High Performance Computing (HPC) market, or in layman's terms, supercomputing. They want to put graphics cards into supercomputers and have them crunch serious numbers.
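For the curious, here's roughly what CUDA code looks like. This is just a minimal toy example of my own (adding two big arrays), not anything official from nVidia, but it shows the core idea: you write one tiny function, and the GPU runs it across thousands of threads at once.

```cuda
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// Each GPU thread computes one element; thousands of threads run at once.
__global__ void add(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;                        // a million elements
    size_t bytes = n * sizeof(float);
    float *ha = (float*)malloc(bytes), *hb = (float*)malloc(bytes), *hc = (float*)malloc(bytes);
    for (int i = 0; i < n; i++) { ha[i] = 1.0f; hb[i] = 2.0f; }

    float *da, *db, *dc;                          // copies in GPU memory
    cudaMalloc(&da, bytes); cudaMalloc(&db, bytes); cudaMalloc(&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    add<<<(n + 255) / 256, 256>>>(da, db, dc, n); // ~4096 blocks of 256 threads
    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);

    printf("c[0] = %.1f\n", hc[0]);               // prints 3.0
    cudaFree(da); cudaFree(db); cudaFree(dc);
    free(ha); free(hb); free(hc);
}
```

Adding arrays is trivial, but swap in serious physics or chemistry math and you can see why the supercomputer guys are interested.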
In order to do this, they've had to make changes in how they make their graphics cards. They have special bits in there for the supercomputer guys. This contributed to the lateness and power-hungriness of their flagship graphics cards. They have the GTX 460 model now too, which uses a smaller chip that has some of that supercomputing stuff removed and is more focused on graphics. The result is a significantly more competitive model, as reviews suggest, even if it's more in the midrange. It kinda shows the cost of this compute focus though. So... if you're a fan of nVidia, I'm sorry, but they may not be quite so competitive with flagship graphics in the future. At least their midrange may continue to be a good deal.
A precedent has been set, and nVidia is working to cash in on some serious groundwork they've laid to hit a market where billions of dollars are at stake. Tons of potential for huge success for nVidia here. So... naturally SOMEONE has to rain on their parade. Once again Intel comes into play, with a little thing called "Knights Ferry". This is a future product from Intel aimed right at this same HPC market that nVidia has been preparing to claim for several years now.
If you must know, Intel not so long ago was aiming to produce a high-performance graphics card product of its own. They wanted graphics that don't suck. This project was titled "Larrabee" and was something they talked up a whole bunch. It was going to be based on x86 and thus be like a big CPU with a ton of cores, and it would be totally programmable, so it could basically run software rendering engines superfast instead of having to use DirectX or OpenGL to do 3D stuff, though it could do those too. It was going to change the way we did game graphics.
Naturally, Larrabee didn't pan out. It got delayed, and then outright canceled. Simply put, it was really power hungry and simply too slow to compete against nVidia and ATi, and Intel just didn't want to embarrass themselves after talking so big. However, the Larrabee project got merged with another project and resulted in this Knights Ferry product, which itself is only a harbinger of bigger products to come. See, one thing about the old Larrabee disaster was that it would have actually been pretty darn potent for these supercomputers. It could fill a similar role to these graphics cards from nVidia. For some uses it'd be even better.
So it seems nVidia's still butting heads with Intel here, but at least it's a fair and competitive field this time, even if it's Intel's turf. nVidia managed to change the rules of the game though, so they have good odds. And Intel's already in hot water with the refs for cheating, what with all those monopoly convictions in other countries and being under scrutiny in the US.
And well, I think I've covered this whole boat. Perhaps I'll talk about mobile stuffs next time. For now I need sleep an' I'm not sure how much sense I'm making any more.
You're making fine sense; I only spotted one typo. 'Course, I don't pay attention to typos so long as the premise is understandable, but still.
Whatever. My mind wandered off. I'll probably get back to it soon enough. For now:
Sooooo, apparently they are making fun of their own name now?
Marketing for big PC hardware companies does get pretty strange sometimes. I mean there was that whole "get a geek" thing from AMD as well.
They like to try and use sex appeal to sell their products sometimes. I find the result is pretty much always corny. I've often wondered if they actually know how ironic it is and overplay it on purpose to make it funny.
This is neat. I think I'm gonna let a different guy talk about stuff for a bit. LCD panels, specifically. He tears apart a common TN+film LCD with LED backlighting.
I don't chatter enough about tech stuff, but I think I'm gonna bring up something more to do with business politics and ethics and such. Something others here might have thoughts about?
Sony. I...have concerns about Sony as a company. One of my favored tech news sites periodically posts polls. One such poll featured the question "Which is the most evil company?" or some such variation. The list of options was limited to Apple, Google, and Microsoft. I'm pretty sure Apple "won" that one, but I don't really feel like going into that one. I bring this up because not a small number of people posted comments complaining about the lack of Sony being on the list. Fine. Whatever. I don't feel like trying to argue that one way or the other. That was months ago.
So. Since then, the PS3 had its support for Linux stripped away. As you can imagine, this was not a popular update. Soon after, the PS3 got hacked, and methods of taking control of the hardware went public. Naturally, this power could be abused to pirate games. Naturally, Sony was not happy with this. Not one bit. They decided to sue one of the main guys responsible. The lawsuit was settled out of court, perhaps sadly; it could have set a precedent, if such a thing was not already set. See, it's officially legal to jailbreak an iPhone, and to me this is the same thing. Sony would have, or at least SHOULD have, lost in an ideal and consistent world, but the guy wasn't a dick about it and didn't push them to play it all the way out.
Still, perhaps there was some response to all this. See, there was this thing that some of you might have heard about, wherein the PlayStation Network got hacked something fierce and was offline? Yeah. Hacked, data stolen; hacked again with MORE data stolen; hacked yet again in Greece recently; and then AGAIN, with apparently the exact same attack as in Greece, only in Japan.
So yeah. Sony has been getting hacked so bad it hurts. I find this reprehensible. These hackers are fucking with people's data. This isn't just sticking it to Sony. This is doing real harm to ordinary people who just happen to be Sony customers. It's terrorism. Cyberterrorism? They are trying to scare people away from Sony.
There are some rather deep thoughts to be thinking about this. I've seen a lot of jokes, a lot of focus on Sony being stupid and the like. I see articles like this and I'm just not sure what to think.
Sony has a lot of silly "standards" out there, like HDMI and Blu-ray. It might actually be legitimately spendy if that ship sank.
I believe that was a hack on Taiwan similar to the Greece/Japan hacks as well.
I'm not sure any of this would actually "take down" Sony, but it might cause them to reconsider ever doing anything in gaming again. Also, on the hackers' side of things: if this is meant to tell Sony not to freak out about people jailbreaking their products, you have failed, miserably. If anything, this will tell Sony they were entirely correct to not let anyone jailbreak/hack/do anything with anything, as it compromises their networks.
On the plus side, these attacks will pretty much make Sony the most secure company ever.
That said, this bullshit needs to stop; I am NOT spending 300 bucks on a new console because some stupid idiots can't talk their problems out.
You can't talk your problems out with hackers. I'm not sure if and when this will stop and I'm only speculating as to why, so I could be wrong.
You are probably right about Sony not dying though. That's a worst-case scenario. But this will almost certainly have a strong negative impact on Sony. Getting out of gaming isn't a clear fix, nor a good one for consumers, though. The hacks aren't focused on the gaming-centric branches of Sony, so there's no clear indication that gaming specifically is the issue.
I do not believe this will push Sony to be super secure though. At least not the most secure. The likes of Microsoft and Google are no strangers to security threats. The bar for being the best is pretty high.
What defines an evil company, though? I think a lot of the anger at some of those companies is really just annoyance at a company's strong influence over certain areas of modern culture. Apple is evil because it makes hipsters spend twice as much money on computers just because they have an Apple logo, Microsoft is evil because it makes everyone use Windows and it's teh suxorz, and Sony is evil because their customer data protection sucks.
I don't really buy it.
It's one thing to agree that in Sony's case it's a dick move to sue people for jailbreaking the PS3 and opening the door for pirates. However, it's another thing to argue that their actions are independent from those of other businesses, both in their industry and in similar industries. This is how businesses are supposed to operate, sadly. They exist to make money. If there is a threat to making money, they perform a cost/benefit analysis. If the threat of jailbreaking and freeware projected only minimal losses, Sony wouldn't do anything, since the legal fees would be greater. However, handling the legal fees is the cheaper option according to their analysts, so they sue. It sucks, but most other companies would behave the same way. It's not the company; it's the entire system that's broken.
I meant the hackers should talk their problems out to Sony; but I guess you can't talk out problems with corporations either.
Well, I've been meaning to talk about a couple things, like the Intel/Apple Thunderbolt thing, or SSDs and the general future of such technologies, but I got distracted by a curious thing today.
So...Google apparently is buying Motorola. I'm not sure what to make of that one. One step closer to taking over the world I suppose?
More tech things occurred to me today that would be good stuff to spread.
There was an article on a favored tech site of mine considering potential inaccuracies of the old frames-per-second measurement that is such a standard with games. In it, the author chose to examine individual frame render times, looking into performance within a single second.
He goes through some number shuffling and eventually comes to some interesting results when working with multi-GPU solutions. A thing people refer to as "microstuttering" comes to light.
That's not really the specific focus of the article though. There are other things to notice and consider. I find it a good read. I think it adds more weight to my overall dislike of multi GPU setups though.
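The core trick, as I understand it, is simple enough to sketch with made-up numbers: look at per-frame render times instead of one FPS average, and the stutter the average hides pops right out. This is my own hypothetical illustration, not the article's data.

```cpp
#include <cstdio>
#include <vector>
#include <algorithm>

int main() {
    // Hypothetical frame times (ms) mimicking the microstutter pattern:
    // frames alternate fast/slow even though the per-second average looks fine.
    std::vector<double> frame_ms;
    for (int i = 0; i < 60; i++)
        frame_ms.push_back(i % 2 ? 25.0 : 8.3);

    double total = 0;
    for (double t : frame_ms) total += t;
    double avg_fps = 1000.0 * frame_ms.size() / total;

    std::vector<double> sorted(frame_ms);
    std::sort(sorted.begin(), sorted.end());
    double p99 = sorted[(size_t)(0.99 * (sorted.size() - 1))]; // 99th-percentile frame time

    printf("average: %.0f FPS -- the benchmark number looks great\n", avg_fps);
    printf("99th percentile: %.1f ms -- half the frames feel like %.0f FPS\n",
           p99, 1000.0 / p99);
}
```

Same second of gameplay, two very different stories depending on which number you report.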
Some days I wish I had more to talk about in this thread. Sure, AMD launched some new processors. Yeah, there are things about them that I am happy to see were done quite well. At the same time, I had some misgivings about them that came true on a deeper level than I would have liked. Overall, these are not the processors you are looking for. The processor market is being increasingly dominated by Intel, and it concerns me quite a bit.
I guess there is stuff with SSDs. I mean, I've had concerns about those losing longevity as manufacturing advancements are made. If their lifespans decrease with each process shrink in the manufacturing of the flash memory, it doesn't bode well in the long term for SSDs as we know them now. However, there is talk of memristor technology doing well, and it could be put to commercial use in a couple of years. New tech may save the day there and lead to some very interesting devices in the future.
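For scale, here's the usual back-of-the-envelope endurance math. Every number below is an assumption for illustration (cycle ratings in the ballpark of what gets quoted for MLC flash), not the spec of any particular drive.

```cpp
#include <cstdio>

int main() {
    // Assumed numbers: older ~50nm MLC flash was rated around 10,000
    // program/erase cycles, newer ~25nm parts closer to 3,000. Wear
    // leveling spreads writes evenly across the whole drive.
    double capacity_gb         = 120.0;
    double pe_cycles           = 3000.0; // assumed rating for a newer node
    double user_writes_gb_day  = 10.0;   // assumed fairly heavy daily use
    double write_amplification = 2.0;    // assumed controller overhead per user write

    double lifetime_days =
        capacity_gb * pe_cycles / (user_writes_gb_day * write_amplification);
    printf("rough lifetime: %.0f days (~%.0f years)\n",
           lifetime_days, lifetime_days / 365.0);
}
```

Even these numbers leave decades of margin, but each rating drop that comes with a process shrink eats straight into that margin, which is the worry.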
Maybe I could talk about Windows 8. It's quite the curiosity. It could change the game for the tablet market. I can almost guarantee that x86 tablets will become a new big thing because of it. The real questions are whether the bits of integration with Windows Phone will help boost the Windows Phone market, and whether Win8 will see any success in the ARM tablet market now dominated by the iPad and various Android devices. It won't have the traditional desktop and all the legacy programs to help it out on ARM tablets.