In which X'o'Lore talks too much about random tech things

Comments

  • edited April 2012
    Pardon, but it's been quiet on the boards so I'm just gonna write a bunch of stuff...

    Well, at this point Intel has launched Ivy Bridge. It's a little faster than Sandy Bridge at the same clock speeds, which is good because it's pretty much given to us at the same sort of clock speeds. The bigger note is the fair boost to integrated graphics performance, which has gained some good ground on AMD's Llano, though it's still slower. Let's temporarily ignore that AMD will launch Trinity soon to replace Llano and congratulate Intel on showing some semblance of initiative with graphics. Notably, Ivy Bridge also brings lower peak power consumption compared to Sandy Bridge. Idle power is the same at best; hard to improve that when it was already so low, I guess. Motherboards and graphics cards have a much bigger impact on idle power with CPUs like these.

    On the graphics front nVidia is making a big deal about a new product they'll showcase on Saturday at some event. Everyone who knows anything is pretty much assuming it'll be the GeForce GTX 690, their upcoming dual-GPU card. Good for them on that. It's the one graphics segment AMD hasn't launched new products in yet.

    I kinda wish nVidia would be quicker about launching lower-end offerings than the 680, which is all they've got for now. I certainly did not expect nVidia to throw compute performance under the bus in favor of making a card that is faster and more efficient for gaming than AMD's finest, and launching it at a lower price. Such efficiency bodes very well for their 690. Big props to them, especially for getting a memory controller to clock that high. This leaves me curious about the rumored "big Kepler" that probably exists, but I'm guessing it will be limited, at least at first, to the Tesla products for big compute customers. Anyway, I like nVidia's current direction in terms of hardware, and I'm really wondering about products cheaper than $500 and hopefully better availability than "virtually none".

    'Course AMD has had a whole new lineup of cards for a little while. Decent stuff. I was a bit worried about them launching a completely new architecture on a new process, but time has shown they largely pulled it off. The performance of AMD's new top rung is quite good, if not quite incredible, and its power efficiency is maybe not what it could be, but it does idle very nicely at least, even if the load power is a bit higher than I'd like to see. Still, after nVidia's issues with Fermi, I hadn't dared expect too much from them building a more compute-friendly architecture.
    Pricing was definitely high though. Luckily they recently cut some prices, so that's improving. Overall, AMD is doing well with graphics. Their new finest doesn't quite compete with nVidia's masterpiece, but it's a very new architecture that very likely has room for refinement, and more notably it's available. AMD loses the short-term crown for not having the fastest, most efficient chips, but they win the business side for actually having sufficient product out where consumers at many levels can make purchases.
  • edited April 2012
    Yay! More learnings!
  • edited December 2012
    This is kind of neat.

    A combination of price cuts, game deals, and fairly recent driver updates seems to have improved the positioning of current AMD cards, but more interesting is seeing the comparison to all the somewhat older cards. It'd be kinda neat to see some modern IGP comparisons thrown in there, but this is already pretty extensive.
  • edited December 2012
    They have my HD 4890! It's still not a bad card, even after several years. I'm still running most of what I want, it's just stupid loud and a power hog.
  • edited December 2012
    mac. :( this article just makes me sad to be an Apple person, but I maintain that in my lifestyle, the software benefits outweigh the hardware downsides. plus... let's be honest... these spec sheets rate games on my card at 42+ fps... TV is 30, the max we can perceive is about 70... I'll live with being in between.
  • edited February 2013
    I never seem to get around to posting much here...probably because my crazy speculation becomes taxing if I attempt to write out my thoughts. It gets long. Instead I'm just going to leave a video here: one Mr. Newell talking about interesting stuff. This is also pretty long, so settle in if you're gonna watch it.

  • edited February 2013
    I like reading long interesting posts though.
  • edited February 2013
    Well, Intel's Haswell is coming out this summer on its fancy 22nm process. They will follow that with a Broadwell shrink/update on their upcoming 14nm process in 2014 (probably late...at least Q3 if not Q4 if I were a betting man). THAT is supposed to be followed by...Skylake or something? Which is supposed to be a new architecture thingy on 14nm. I wouldn't flinch a bit if that slips as far out as 2016.

    AMD has Piledriver, which comparatively stinks on its 32nm process. They are working toward Steamroller, which is supposed to boost single-thread execution significantly over Piledriver; that's often cited as the biggest weakness of Bulldozer/Piledriver. This will supposedly be on 28nm, and maybe from TSMC (sorry Global Foundries...I guess you suck?).
    Steamroller being pushed back generically to "2014" means it's gonna be late, and 28nm is not very aggressive for that timeframe, but AMD has also talked of some new logic libraries that lay out the transistors much tighter and functionally give them some of the benefits of a smaller process. Also, they have been working on TSMC's 28nm for over a year with the Radeons, so there's a lot of experience and maturity here and things may not be all bad.

    Stuff to think on, but the bigger point of interest in the competitive landscape, in my opinion, is AMD's Kaveri, their future higher-end APU combining Steamroller CPU cores with new GCN graphics cores. An interesting feature on this thing is that it will allow the CPU and GPU to work on data within the same memory space. In peak terms this isn't really a big deal, since their current APUs can already move data between the CPU and GPU with minimal performance hit (it's just shuffling it around slightly within the same chunk of memory), but with Kaveri the programmers won't have to move it at all, if I understand this correctly (there's a rough sketch of the difference at the end of this post). THAT will make GPU compute a good bit simpler. Furthermore, it sets a stage in my mind for the CPU and GPU to share compute resources more. This means that the follow-up to Kaveri, perhaps sometime in 2015, might actually be able to incorporate some automatic GPU acceleration of generic code. If something like that does indeed happen and AMD is still behind on CPU performance, it could make up a lot of ground really fast. Intel will still be sitting on Broadwell, which likely won't be able to do any similar sort of heterogeneous witchery to that degree.

    If this is indeed the path AMD is taking...They may have a chance of doing something awesome in the future. At least if the new CEO doesn't have the company burned to the ground by then...

    I left out...quite a lot of details that passed through my mind in this particular thought train. Quite a lot indeed. But I think I got the highlights?
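
    To put that "not having to move the data" bit into something concrete, here's a rough little sketch in C. To be clear, this is not any real GPU API: scale_on_gpu() is just a plain function standing in for a kernel, and the offload_* helpers are names I made up. It only shows the difference between staging copies into a separate buffer and simply handing the "GPU" the same pointer the CPU is already using.

        /* Conceptual sketch only -- no real GPU API here. scale_on_gpu() is a
         * plain function standing in for a GPU kernel so the data flow of the
         * two programming models is easy to see. */
        #include <stdio.h>
        #include <stdlib.h>
        #include <string.h>

        #define N 1024

        /* Stand-in for a GPU kernel: doubles every element it is handed. */
        static void scale_on_gpu(float *data, size_t n) {
            for (size_t i = 0; i < n; i++)
                data[i] *= 2.0f;
        }

        /* Today's model: copy input into a separate "device" buffer, run the
         * kernel there, then copy the result back. */
        static void offload_with_copies(float *host, size_t n) {
            float *device = malloc(n * sizeof *device); /* pretend device memory */
            memcpy(device, host, n * sizeof *device);   /* host -> device */
            scale_on_gpu(device, n);
            memcpy(host, device, n * sizeof *device);   /* device -> host */
            free(device);
        }

        /* The shared-memory-space idea: the GPU works on the very same pointer
         * the CPU uses, so there are no staging copies at all. */
        static void offload_shared(float *host, size_t n) {
            scale_on_gpu(host, n);
        }

        int main(void) {
            float a[N], b[N];
            for (size_t i = 0; i < N; i++)
                a[i] = b[i] = (float)i;
            offload_with_copies(a, N);
            offload_shared(b, N);
            printf("same result either way: %.1f vs %.1f\n", a[10], b[10]);
            return 0;
        }

    Same answer either way; the difference is how much plumbing the programmer has to write and how much time and bandwidth get burned shuffling the data around.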
  • edited February 2013
    YAY! More long posts!

    I'd respond more often to these. Except I never have anything to say. They are generally fun/interesting to read though.
  • edited February 2013
    Sounds techy and beyond my understanding of computers and engineering. Guess I'll just wait a few years and find out.
  • edited February 2013
    Smaller processes = more processes in the space allotted = faster processor

    One assumes anyway.
  • edited February 2013
    Not really faster, actually. Processors and stuff are made on disks of silicon material (referred to as wafers), and a smaller process basically just etches the processor patterns smaller so they get more of them on a disk (some rough numbers on that at the end of this post). Since everything is tighter together they can usually lower voltages to cut back power consumption a bit too. More speed is more about good design and having more transistors to work with, on top of whatever other advancements they might incorporate into a new process.

    Of course, as time goes on these new processes are getting so small that current has a bad tendency to leak out of the transistors, which counters some of the power benefits. And although they get more processors per wafer, wafer prices tend to be higher on smaller processes, so that's kind of a balance point as well. It's a tricky business and it's not getting any easier.

    But yeah. Suffice to say the things I like to waste time pondering are pretty esoteric. I'm not in the habit of talking about stuff.
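
    Since I brought up getting more chips per disk, here's a quick back-of-the-envelope in C using a common dies-per-wafer approximation. The die sizes are made-up examples and the math ignores defects and yield entirely, so treat the numbers as rough; it's only there to show how shrinking the die buys you more chips per wafer.

        /* Rough dies-per-wafer estimate using the common approximation
         *   DPW ~= pi*(d/2)^2 / S  -  pi*d / sqrt(2*S)
         * where d is the wafer diameter and S is the die area.
         * Ignores yield/defects entirely. */
        #include <math.h>
        #include <stdio.h>

        #define PI 3.14159265358979

        static double dies_per_wafer(double wafer_diameter_mm, double die_area_mm2) {
            double r = wafer_diameter_mm / 2.0;
            return PI * r * r / die_area_mm2
                 - PI * wafer_diameter_mm / sqrt(2.0 * die_area_mm2);
        }

        int main(void) {
            /* Hypothetical example: a 300 mm wafer, comparing a 200 mm^2 die to
             * the same design shrunk to roughly 120 mm^2 on a smaller process. */
            printf("200 mm^2 die: ~%.0f dies per wafer\n", dies_per_wafer(300.0, 200.0));
            printf("120 mm^2 die: ~%.0f dies per wafer\n", dies_per_wafer(300.0, 120.0));
            return 0;
        }

    With those made-up sizes the shrink gets you roughly 70% more chips out of the same wafer, which is the whole economic draw, at least until leakage and wafer cost eat into it like I said above.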
  • edited February 2013
    Right, though I figured I'd keep it simpler. And that saying "Better Processor" didn't really explain much.

    Why I no good at words!?
  • edited February 2013
    I suppose it makes sense to leave this here since I was just sorta talking about it slightly.

    Chipmaking stuffs.
  • edited February 2013
    Not that this is a big deal to anybody else, but...aw crap.
  • edited February 2013
    Like... I guess it makes sense, what with web developers and their habit of being too lazy to build things for each set of software. But... "Let's drop our more competent software and become a clone of Google Chrome" still seems dumb as a concept.
  • edited February 2013
    Well, the writing was on the wall after the switch on mobile, really.

    It's kind of a pain in the arse that in 2013 we still need to deal with shit like proprietary rendering engine features causing hassle, though I can see Opera's logic. Webkit is the biggest rendering engine in terms of use over all platforms, and if any specific rendering optimisation is done, it will be aimed at either Webkit, Gecko or Trident. Opera, while being an impressive browser, is pretty much an also-ran in terms of rendering engine awareness amongst web devs.

    Also, Azrodal, switching to the rendering engine of Chrome doesn't mean Opera will become Chrome with a different logo. It will doubtless still maintain its current features/look and feel, but will just render pages in the same way Chrome does.
  • edited February 2013
    Oh I know. But it's the thought that counts.
  • edited February 2013
    It's mostly just sad. They put a lot of work into Presto and it'd be a great layout engine...if developers actually coded to the standards. Don't blame proprietary engines for hassles. Webkit and Gecko have been made to support awful code due to being open source. Trident is forever blamed for being "bad" even though it does support standards, and everyone just ignores Presto and any other engine. People code for Webkit because it lets them practically get away with metaphorical murder in their code.
  • edited February 2013
    Chrome is lame. Everything I make breaks in it, somehow. Chrome is to me what IE is to everyone else. And then there is dear, sweet Firefox. Though I use Pale Moon, which is an optimized version of Firefox so it doesn't have the sluggishness everyone always seems to complain about.
  • edited February 2013
    Yeah...my last statement was probably wrong in some ways and I can't back it. I woke up with a headache and was not too happy this morning.

    Anyway...I'm not going to disagree about Chrome, but I haven't had issues like that yet. Then again, I'm not exactly entrenched in complex web development at this point, so we'll see. I don't like Chrome for a variety of reasons that have nothing to do with its rendering or compatibility.

    Of course I'm not a big Google fan in general these days so meh. I may have to actually check Palemoon out. FF is my secondary browser as it is and I use it a fair bit.
  • edited February 2013
    I use generic Firefox with add-ons and I too do not have the sluggishness people complain about; I think people might just suck.

    I do have one problem actually: whenever I download something, if it's the first thing I've downloaded during my current instance of Firefox, it stops responding for around 30 seconds. Only the first download though; any and all after that work fine.
  • edited February 2013
    My problem with Firefox is that it stopped telling me when it needs updating or auto-updating. So when I start having problems with it, I check and it's like 4 versions ahead of the one I have. And nothing I do will get it to tell me when it wants to update again. Even the "Restart to update" button in the about section just restarts it with no change at all.
  • edited February 2013
    Huh. Mine usually gives a little popup saying a new update is available, update now or wait till later.

    But I heard Firefox was considering doing like Google Chrome and ingraining itself in your computer's processes so it can automatically update (another thing I don't like about Chrome: it takes many steps to get the thing completely off your computer, and my darn antivirus installed Chrome for me automatically without my permission).
  • edited February 2013
    Yeah, mine used to give me said update box. But it stopped a few months ago or something. I don't know.
  • edited March 2013
    The little fan in my graphics card has been going bad as of late and making a completely inappropriate amount of noise for its given rotational speed, so I done removed it from the heatsink. In its place I've awkwardly strapped a 120mm fan and found that it actually lowered temperatures, and quite significantly at that. Then I realized my graphics drivers were reporting even lower temps, and I've determined that the utility I was using was probably getting the temperature scaling wrong: it was showing temps that were higher than was true, and the error got ever more disproportionate as temps increased.

    So...now I'm shuffling fans to try to find a quieter one to stick to the side of my GPU and considering eliminating my case fan. The things I do when I find time to kill...
  • edited March 2013
    HEY! Firefox is telling me when it needs updating again. Woot.

    On the other hand... the cheap capacitors in my monitor appear to be dying. I wanted a better monitor anyways though, so I guess I'll just buy a new one (or two) when I get some money. Highly doubtful, but are there any monitors on the market that are either super cheap or rather good? And what are the advantages of an OLED over an LED or an LCD?
  • edited March 2013
    Rather obnoxiously, it won't let me view the thread without registering, which for whatever reason costs money...
  • edited March 2013
    Ah, that's annoying; they're going through one of their no-access-for-non-members periods at the moment.