38 Comments

  • JarredWalton - Wednesday, September 11, 2013 - link

    I wonder how much power "everything else" in a laptop consumes. I've always figured the lion's share was going to the LCD (at least on a good design that knows how to power down inactive chips). Assuming we don't significantly reduce display power draw, I think we're getting quite close to the point where making the CPUs more power efficient won't help tremendously -- on laptops at least.

    I ran some numbers recently, and figured that a 13.3" Haswell ULT Ultrabook idles (with the screen at 200 nits) at <5W. Of that 5W, probably only .5W-1.0W is going to the CPU core, 2-3W to the LCD, and the rest to the motherboard and other devices. Reducing the CPU power use to 0W would only cut total power use by another 25% at best.
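    That back-of-the-envelope budget can be written out explicitly. These are the rough midpoints of the estimates above, not measurements:

```python
# Rough idle power budget for a 13.3" Haswell ULT Ultrabook at 200 nits.
# Numbers are illustrative midpoints of the estimates in the comment above.
budget_w = {
    "cpu_core": 0.75,  # midpoint of the 0.5-1.0 W estimate
    "lcd": 2.5,        # midpoint of the 2-3 W estimate
    "rest": 1.75,      # motherboard and other devices
}
total = sum(budget_w.values())  # ~5 W idle

for part, watts in budget_w.items():
    print(f"{part}: {watts:.2f} W ({watts / total:.0%})")

# Even a hypothetical 0 W CPU only removes its share of the total draw:
print(f"max savings from a 0 W CPU: {budget_w['cpu_core'] / total:.0%}")
```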

    Which is another way of saying: this is why Intel is showing power reduction for the SoC/CPU, and why it's using a load scenario. Reducing load power is also great, don't get me wrong, but I think we're close to the point where idle/light usage will get 12-15 hours of battery life and not much more. Now give me "heavy use" battery life of 5+ hours (e.g. while gaming) and that would be awesome. :-)

    Oh, final note: I've got a Sony VAIO Pro 13 in house (finally!). Running the Light battery test right now. I started it last night, and 12 hours later (with the extra sheet battery) I've still got over 6 hours to go. Whew!
  • yelped - Wednesday, September 11, 2013 - link

    Interesting.
  • The Von Matrices - Wednesday, September 11, 2013 - link

    Is the Broadwell SOC still dual-core only or will a quad core version be offered?
  • Kevin G - Wednesday, September 11, 2013 - link

    Should this be testable? You'd have to crack open a laptop and disconnect the internal cables to the LCD but that's doable on most. Then attach an external monitor and use that for testing while measuring the power consumption of the laptop chassis.

    As for other system components, most can be removed if they're not ultrabooks. Hard drives and optical drives can be removed or powered externally. Chances are that the internal fan could be powered externally as well. It is really dependent on how much you want to hack the device.
  • Darkstone - Wednesday, September 11, 2013 - link

    Connecting an external display and measuring the power draw won't help. I have seen numerous examples where attaching an external display actually dropped the power consumption. The display was feeding power to the motherboard via the VGA connector. You have to be very careful when doing measurements with multiple devices attached.

    But it is possible to measure the power usage of the display backlight. I did some measurements myself, resulting in this graph:
    http://tweakers.net/ext/f/ZwE3I5daWcnM6jLMXKbXSoSI...
    The X axis is the level of the display brightness, where 0 is off. The Y axis is the discharge rate as reported by the battery, in watts. Remember that the difference between 'power off' and 'lowest brightness setting' includes the power consumption of the graphics card and display logic. The measurements were done using a script that randomly changed the display backlight (but not the 'off' data point) overnight. The battery (97Wh) lasted 9 hours. You can see that even with my moderately specced Latitude, the display backlight contributes about half of the total power consumption, even with the backlight at about the middle setting.

    Specs at time of measurement: Latitude E6520, i5-2520M, LTN156HT01-101 display (15.6", 1080p, matte), Intel X25-M (LPM enabled).
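    A measurement loop like the one Darkstone describes can be sketched in a few lines of Python. The sysfs paths below are Linux conventions and will vary by machine (the `intel_backlight` and `BAT0` node names are assumptions):

```python
# Sketch: randomly set the backlight level, wait, and log the battery
# discharge rate, as in the overnight measurement described above.
import random
import time

BACKLIGHT = "/sys/class/backlight/intel_backlight"  # varies per machine
BATTERY = "/sys/class/power_supply/BAT0"            # varies per machine

def read_int(path):
    """Read a single integer value from a sysfs-style file."""
    with open(path) as f:
        return int(f.read().strip())

def sample(levels=10, interval_s=300):
    """Collect (relative brightness, watts) pairs at random backlight levels."""
    max_b = read_int(f"{BACKLIGHT}/max_brightness")
    samples = []
    for _ in range(levels):
        level = random.randint(1, max_b)  # skip 0 so the panel stays on
        with open(f"{BACKLIGHT}/brightness", "w") as f:
            f.write(str(level))
        time.sleep(interval_s)            # let the discharge reading settle
        power_w = read_int(f"{BATTERY}/power_now") / 1e6  # µW -> W
        samples.append((level / max_b, power_w))
    return samples
```

    Plotting the returned pairs gives a graph like the one linked above; the intercept at the lowest brightness still includes GPU and display-logic power.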
  • deepi - Wednesday, September 11, 2013 - link

    I want to buy an SVP 13, but I'm rather annoyed with Sony that the wifi isn't 802.11ac. I'm scared I'm not going to have enough bandwidth when transferring my 80GB virtual machines around.
  • Pirks - Wednesday, September 11, 2013 - link

    Haha, I use a USB3-to-Gigabit Ethernet dongle for that on my SVP 13 :P Good luck, and don't pay attention to that AC marketing crap; with this dongle you won't need it!
  • deepi - Thursday, September 12, 2013 - link

    Yes, very true, but I was a little concerned that doing Ethernet over USB would slow the system down by virtue of how USB works, i.e. being driven by the processor.

    I'm considering waiting for the 'Broadwell' VAIO Pro equivalent.

    I'm in Europe, so no PCIe SSD, which is another bummer!

    Also, Sony has failed to respond to clarify their battery policy. I'm a little concerned that after a couple of years the battery will only have half the capacity I started with, and yet there's no clear battery replacement program/policy.
  • mkozakewich - Friday, September 13, 2013 - link

    USB 3.0 should be able to transfer up to 3.2 Gbps, so a gigabit ethernet adapter shouldn't push it too hard. I'm pretty sure the performance of USB 3.0 isn't much better than USB 2.0.

    Any OEMs that use good hardware (I'd expect this of Sony) will use fairly good batteries. If you try not to drain it below 20%, don't leave your computer running 24/7 or sitting in hot temperatures, and maybe use a battery slice (which you could drain completely, seeing as it's cheaper and replaceable), the internal battery should last more than a few years.
    (My Surface Pro is showing 0.0% wear after half a year. My crappy ASUS netbook came with 2.2%, and quickly climbed to 20%.)
  • ben.avellone - Friday, September 13, 2013 - link

    USB 3.0 is 5 Gbit/s, though that is the theoretical peak. However, in all my past experience 3.0 has been noticeably faster. I often cap out at around 30MB/s over 2.0 host ports, which is about the practical limit for 2.0. But with 2.0 devices, you'd surely never hit that cap either. With 3.0 devices, you can get better performance out of a 2.0 port than a 2.0 device can!

    tldr;
    USB 3 > 2
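    The numbers being thrown around here can be sanity-checked with quick arithmetic. The overhead factors below are rough estimates, not measured values:

```python
# Back-of-the-envelope USB and Ethernet throughput comparison.
MB = 1e6  # decimal megabytes

usb2_raw = 480e6 / 8 / MB    # 60 MB/s signalling rate for USB 2.0
usb2_real = usb2_raw * 0.55  # ~33 MB/s after protocol overhead (rough estimate)
usb3_raw = 5e9 / 8 / MB      # 625 MB/s signalling rate for USB 3.0
usb3_line = usb3_raw * 0.8   # 8b/10b encoding leaves ~500 MB/s
gige = 1e9 / 8 / MB          # 125 MB/s line rate for Gigabit Ethernet

# A Gigabit Ethernet dongle needs ~125 MB/s: well within USB 3.0's
# budget, but far beyond what USB 2.0 can actually deliver.
print(f"USB 2.0 realistic: ~{usb2_real:.0f} MB/s")
print(f"USB 3.0 after encoding: ~{usb3_line:.0f} MB/s")
print(f"GigE requirement: {gige:.0f} MB/s")
```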
  • toncijukic - Wednesday, November 20, 2013 - link

    @deepi, How do you mean "no PCIe SSD in Europe"?
  • tuxRoller - Wednesday, September 11, 2013 - link

    Congrats. Looks like that device is close to MacBook Air levels.
  • MrSpadge - Wednesday, September 11, 2013 - link

    Still, better power efficiency directly equals better performance in TDP limited use cases, so they must not stop yet. Improving idle power consumption is a different story, of course.
  • Stuka87 - Wednesday, September 11, 2013 - link

    I am thinking once they do hit that "wall" of sorts, they can start raising clock speeds so you end up with better performance without any power increase over the previous gen.

    The last two generations' performance hasn't changed much, just power decreases. So raising clock speeds while maintaining power consumption will be what we see next.
  • EnzoFX - Wednesday, September 11, 2013 - link

    Uhm, I don't think so. People aren't going to be like, OK, now we have enough battery life... we don't need any more...
  • Stuka87 - Wednesday, September 11, 2013 - link

    EnzoFX: So you are saying people would rather have 1% better battery life than 25% more performance?

    The rest of the computer uses more power than the CPU, so the CPU being more efficient will have less and less of an impact on battery life.
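    That shrinking impact is easy to quantify. Treating battery life as inversely proportional to total platform power (the shares below are made-up illustrative numbers):

```python
# How much longer does the battery last when only the CPU's slice of
# platform power gets more efficient? Battery life ~ 1 / total power.
def battery_gain(cpu_share, cpu_power_cut):
    """Fractional battery-life gain when CPU power drops by cpu_power_cut."""
    new_total = (1 - cpu_share) + cpu_share * (1 - cpu_power_cut)
    return 1 / new_total - 1

# CPU at 20% of platform power, made 30% more efficient:
print(f"{battery_gain(0.20, 0.30):.1%} longer battery life")
# The same 30% cut back when the CPU dominated (say 60% of power):
print(f"{battery_gain(0.60, 0.30):.1%} longer battery life")
```

    With the CPU at only a fifth of platform power, a 30% CPU efficiency gain buys roughly 6% more battery life, which is the point being made above.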
  • EnzoFX - Thursday, September 12, 2013 - link

    But then the shift will be about power under load, no? Processors are using less power at idle, but still bordering on what's allowed under load. Performance will go up, as expected I would think, but they're not going to increase it just to do it when they have other things to market. I think it ties in to the plateau of performance. Of course, I could have no idea what I'm talking about lol.
  • iwod - Thursday, September 12, 2013 - link

    My estimate is that at idle, Haswell is more like 0.3-0.5W, while the LCD would be 2.5-3.5W. Then there's another 0.3-0.5W for the SSD, and 1W for memory.

    At least Intel is working on Near-Threshold Voltage, the Hybrid Memory Cube with even lower power, and some other improvements.

    But in terms of CPU power, hell yes, Broadwell is now finally entering ARM's range.
  • ananduser - Thursday, September 12, 2013 - link

    Eagerly awaiting the SVP 13" review.
  • Hector2 - Thursday, September 12, 2013 - link

    "I think we're getting quite close to the point where making the CPUs more power efficient won't help tremendously"
    Certainly in the tablet & laptop markets, ARM's strengths have been lower power & lower cost than Intel --- at the expense of not being software compatible and having lower performance. As Intel matches ARM's strengths while retaining its performance advantage, it starts pulling ahead of ARM.
  • The Von Matrices - Wednesday, September 11, 2013 - link

    The PCH dies on the two packages look remarkably similar. I know it's rumored that about the only thing the 9-series chipsets will have over the 8-series is SATA Express, but are they using the same silicon?
  • Flunk - Wednesday, September 11, 2013 - link

    The PCH hasn't really changed much since Intel integrated the northbridge into the CPU die. The Z68 and Z77, for example, are basically identical, so it wouldn't be too surprising to see the same thing here.
  • DanNeely - Wednesday, September 11, 2013 - link

    Chipset die sizes have been primarily determined by the number of output pins for years. This was the original motivation behind the IGP (the FSB, AGP, RAM, and DMI pinouts needed a larger die than their controller hardware did, leaving "free" space on the die for something else). The southbridge has been even more static, since it's still just a collection of IO devices and doesn't contain any major computation units.

    What's interesting to me is that the Broadwell PCH appears slightly larger. It's possible it's just being held slightly closer to the camera (the pic's too noisy for me to tell); if not, I wonder if SATA Express is the only addition to it. I've been saying for years that if Intel wants to push Thunderbolt below the premium market segment they need to integrate it into their chipsets, and I'm wondering if this might be it.
  • eanazag - Wednesday, September 11, 2013 - link

    This is far more interesting to me than Bay Trail. This is something I might actually buy because the performance is still there. I'm concerned we may not get a performance boost over these two years because the power is being cut so dramatically, but I still think it is what Intel needs to do.
  • Hrel - Wednesday, September 11, 2013 - link

    It's Intel. The step down to 14nm is significant. I wouldn't be surprised at all to see a 30% reduction in power AND a 30% increase in performance. Never underestimate their engineers.
  • DanNeely - Wednesday, September 11, 2013 - link

    Aside from the GPU, which has gotten significantly more die area, Intel's last few die shrinks have offered negligible performance gains.
  • A5 - Wednesday, September 11, 2013 - link

    Intel's been pretty clear that they're worrying primarily about performance per watt more than raw performance since Nehalem, I think. There are still a lot of people there bearing the scars of NetBurst.
  • purerice - Saturday, September 14, 2013 - link

    You are right. I remember my Pentium 4 laptop... It had a jet engine for a cooling fan and got about 90 minutes of battery life without wifi, 70 minutes with. On that note, wifi decreased battery life by 22%.

    If wifi chip makers (and HD, RAM, etc. makers) didn't reduce the power draw of their components as well, we'd see Apple advertise the MacBook Air as "12 hours with wifi off, or now up to 2 1/2 hours with wifi on".
  • andykins - Wednesday, September 11, 2013 - link

    Erm, no. It's a 30% reduction in power for the SAME CPU performance. I'll eat my hat if Broadwell brings a 30% power reduction and a 30% CPU performance increase simultaneously. That's not to poo-poo Broadwell, though; it looks great.
  • MrSpadge - Wednesday, September 11, 2013 - link

    Agreed - this is "just a die shrink", so there's no way to get huge benefits in performance and power draw simultaneously.
  • purerice - Saturday, September 14, 2013 - link

    True, but the choice will be there: same performance at 30% less power, 10% more performance at 10% less power, or 15% more performance at the same power.

    Just making up numbers here: take the 35W TDP Haswell (4765T). The Broadwell equivalent would be 24.5W TDP for 4 cores, or perhaps closer to 35W TDP with 6 cores.
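    Those made-up trade-off points work out like this. The 30% power reduction is Intel's claim for the shrink; the linear per-core scaling is purely an illustration:

```python
# Trade-off points for a die shrink claiming ~30% power reduction
# at equal performance. Starting point: a 35 W TDP Haswell part.
haswell_tdp = 35.0  # e.g. the i7-4765T

# Option 1: same performance, spend the shrink entirely on power.
same_perf = haswell_tdp * (1 - 0.30)  # 24.5 W

# Option 2: spend the headroom on more cores. Assuming (roughly, and
# purely illustratively) linear per-core power scaling:
per_core = same_perf / 4
six_core = per_core * 6  # ~36.8 W, back near the original TDP

print(f"same performance: {same_perf:.1f} W")
print(f"six cores at equal per-core power: ~{six_core:.1f} W")
```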
  • tipoo - Wednesday, September 11, 2013 - link

    I'd like to see what they'll do with the integrated GPU on 14nm. Should be able to fit a lot more EUs in.
  • MrSpadge - Wednesday, September 11, 2013 - link

    There's not enough main memory bandwidth to feed even Haswell's GT3 properly without Crystal Well.
  • tipoo - Thursday, September 12, 2013 - link

    For the eDRAM version, I meant.
  • djds20 - Wednesday, September 11, 2013 - link

    That's what I'm waiting for.... a Surface Pro 3.
  • jwcalla - Wednesday, September 11, 2013 - link

    Seems like we hear this every year... "This is the one that's really going to get us into mobile!" They must know that Apple is just itching to get their own chips into their MacBooks.
  • extide - Thursday, September 12, 2013 - link

    Well, if anything, that might be interesting. Another CPU performance war, between Intel and Apple? Hrm......
  • chenjf - Tuesday, September 17, 2013 - link

    What do you mean, like the MacBook Air? MacBook Airs all come with Intel processors.
    http://www.apple.com/macbook-air/specs.html
