0815
Mar 31, 04:16 PM
Interesting ... I was always told by Android fans that the system is so "open" and not "fragmented" ... hmmm ... looks like Google disagrees and admits it is fragmented and that 'closed' is better :D
aaronsullivan
Apr 11, 11:43 AM
To me this means 4G and Verizon/AT&T hardware convergence. Both, good news.
My biggest concern is the next iOS version. Will it be delayed to coincide with the hardware? With little info, I'd guess/hope no. If it's impressive enough, Apple can fight the competition with a software-enhanced iPhone 4 for a while. Without the big iOS update, it seems a long stretch to 2012.
Either way, I'll personally be sticking with my iPhone 4 'til late June 2012 anyway for contract reasons.
How about this for the iPhone 5
5 4 3 2 1
iPhone 5, 4G (4 cameras), 3D, 2 carriers, 1 easy choice.
Yeah, that's why I'm not in marketing. :o/
Durendal
Apr 5, 07:16 PM
YES!!! (http://www.youtube.com/watch?v=eyqUj3PGHv4)
https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhp3j__l3-UJoB8tAoRAfWJEV1EsQJDA2GFag-LGyfxVs_zESEHxu5hXnqQdNxe37i8i7KnAjM3tPSKmYGzuvPvYVz9DgX5lJ380gf8PiQSCRbe7Zdi26T_QjK9HtTcAdQEeK0FZWrN86nR/s400/wtf.jpg
PhantomPumpkin
Apr 25, 04:29 PM
You aren't being tracked by Apple, you aren't being tracked to the meter. You can opt out, just switch off location services.
And by the way, even if you do switch off location services, your location is still being tracked by the mobile phone companies every time your phone makes a connection with one of their masts, which happens every time you move cells. Oh, and this happens with every phone; otherwise they wouldn't work.
Stop being a paranoid sheep and start reading the facts of this case not the media hype.
Dig deeper Watson. Turning off location services DOES NOT disable this feature. It is still logged, even with location services off. That's the whole issue the smart people have. There's no way to auto-truncate the file, and there's no way to turn it off.
citizenzen
Apr 28, 04:05 PM
If liberals would stop 'crying wolf' ('claiming racism') at every corner, we might actually take them seriously and help out when there's actual evidence.
Likewise, if conservatives would not turn a blind-eye to obviously something that is racially motivated, we might actually take them seriously.
If there's not enough evidence that the birth certificate issue is racially motivated, then I can't imagine what it would require for something to meet standard.
Ktulu
Aug 25, 07:40 PM
My only dealings with Apple Support was a few years ago. On Christmas day the modem on my Pismo went out. I just for a lark called to see if anyone was in and not only was someone there I was taken care of quite nicely. The next day I had a box to send it off and three days later I had it back. Not bad for a notebook that was about two weeks short of the warranty expiring.
I'm not trying to be a wise a@@, but when did Apple make a Pismo? I do remember them, but not as being made by Apple. I am sorry, I don't recall the manufacturer at this time.:confused:
Hellhammer
Apr 8, 09:01 AM
The trouble is... I find the TDP numbers for Sandy Bridge very misleading. For example, the previous 2.66GHz dual-core i7 had a TDP of 35W and the current 2.2GHz quad-core i7 has a TDP of 45W. Theoretically, it should only use 10W more when doing a CPU-intensive task, but according to AnandTech, which measured it, the quad-core Sandy Bridge i7 was using almost 40W more when running Cinebench.
http://www.anandtech.com/show/4205/the-macbook-pro-review-13-and-15-inch-2011-brings-sandy-bridge/14
It just doesn't make any sense. Going by those figures, if the i7 dual core was 35W, the i7 Sandy Bridge quad core would be around 70W.
Not sure how this relates to potential MacBook Air Sandy Bridge processors, but keep in mind.. there must be a reason why Samsung went for the ULV processor in their 13" laptop instead of the LV one.
The CPU isn't the only thing that changed. The AMD 6750M (~30W) has a higher TDP than the NVIDIA GT 330M (~23W). I had to put ~ because their TDPs are not officially stated by AMD or NVIDIA, so the figures are estimates based on previous GPUs and their TDPs. The point is that the AMD 6750M has a higher TDP.
There is also another thing. TDP is not the maximum power draw. Maximum power dissipation is usually 20-30% more than the actual TDP. While MPD is rarely achieved as it requires maximum voltage and temperature, it can (nearly) be achieved with heavy benchmarking applications.
For example, the combined TDP from the quad-core SB and the AMD 6750M is 75W. If we use 20% extra as the MPD, that is 90W, just from the CPU and GPU! Of course those parts are not using 90W in that test because things like the screen, HD, RAM etc. need power too. As the MPD overhead is a percentage, it can explain why the difference is so big in watts.
40W sounds a bit too much to explain with MPD though. IIRC the GT 330M is underclocked but I'm not 100% sure. You have a valid point that the SBs may be using more power than their predecessors. To make this more accurate, we should compare them with C2Ds though ;)
I guess we will have to wait and see, but an ULV in 13" would be more than a disappointment.
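The power-budget arithmetic above can be sketched in a few lines. The TDP figures are the estimates from the discussion (the AMD 6750M number is not officially published), and the 20% MPD overhead is the low end of the 20-30% range mentioned:

```python
# TDP figures from the discussion above: the 45W quad-core Sandy Bridge
# i7 and an estimated ~30W for the AMD 6750M (not officially published).
CPU_TDP_W = 45
GPU_TDP_W = 30

combined_tdp = CPU_TDP_W + GPU_TDP_W  # 75 W total for CPU + GPU

# Maximum power dissipation (MPD) is estimated at 20-30% above TDP;
# use the low end of that range here.
MPD_FACTOR = 1.20
max_dissipation_w = combined_tdp * MPD_FACTOR  # 90 W under worst-case load

print(f"combined TDP: {combined_tdp} W, estimated MPD: {max_dissipation_w:.0f} W")
```

This reproduces the comment's 75W/90W figures, though as noted it still leaves most of the 40W gap measured by AnandTech unexplained.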
heels98
Sep 19, 07:08 AM
Sure, some people will always have a need for the fastest computer in the world. Some will find themselves stressing over the slightest increase in processor performance, screen resolution, graphics memory, whatever. No one here doubts that. But most of those people spend much more time working than reading and posting on internet message boards. Professionals use the tools that get the job done for them. I feel that the main point of using the Mac is lost on most PC users, and especially on those that cry out for the absolute fastest turbo-charged, slick, top-benchmark machines. Maybe our processors are "outdated," but Mac OS X is not, nor is the work that I see coming from Mac professionals inferior to that of those with faster computers. The fact that OS X makes doing our jobs more elegant and faster is far more important than whose processor is the fastest, or, as Freud would put it, whose >>>> is bigger.:o
shawnce
Aug 17, 11:05 AM
When playing a game on a PC, you have DirectX to take full advantage of the hardware, and your processor is usually pegged, consuming any and all cycles it can for the game. On a Mac, multithreading and sharing the processor among apps seem to be the flow of the computing experience. You should really do deeper analysis/research before making generally incorrect statements like the above.
rjohnstone
Apr 25, 03:19 PM
"Federal Marshals need a warrant. . . . . "
Duh, the police always have to jump over a higher bar . . . I, personally, can come into your home, take your bag of cocaine, and go give it to the police, and it will be admissible, even though the cops need a warrant. (I can be sued for breaking and entering, etc., but the drugs are still admissible.)
Actually it would not be admissible.
The police would not be able to verify where it actually came from unless they actually watched you retrieve it.
At that point a good attorney would argue that you were acting as an agent of the police and the subsequent discovery and retrieval of the coke would fall under the same rules for gathering evidence and require a warrant.
The coke evidence would get tossed and you would go to jail for breaking and entering.
The officers you handed the coke to would either be reprimanded or fired.
dmunz
Apr 8, 06:03 AM
I wonder if this has more to do with Reward Zone coupons and 18-month no-interest financing. I always buy at Best Buy for these two reasons. Yes, they are sleazeballs with cable pricing etc., but for the informed consumer, their price/financing deals put them ahead on price-controlled inventory like Apple stuff.
FWIW
DLM
rdowns
Apr 27, 02:46 PM
Really guys? We're going to argue it may be a forgery now. :rolleyes:
playaj82
Aug 8, 07:38 AM
I know a lot of people are excited about Time Machine, but I was kind of worried last night when I showed it to one of my friends.
Unlike Expose, Fast User Switching, iTunes, Dashboard, etc... that have immediate impact and understanding as to why the features are so neat, Time Machine is actually rather complicated.
I explained and showed it to my friend, and she said, "so what, when I delete something it stays on the hard drive anyways"
All of us here obviously understand the significance of this program, but does anybody else think this will be difficult to market to the "average" user.
yac_moda
Jul 20, 03:07 PM
eight cores + Tiger = Octopussy?!?
NOW THAT, would be one CRAZZZZZYYY little baby POOOOOP :eek: :eek: :eek:
Maybe, Mac raised to the power of INFINITY -- FOR ALL YOU INFINITY LOOP LOVERS -- mobius loop that is !?!?!?!?!!?? :p
Of course, Moby would have to do a recording studio promo for that one, or maybe http://www.mobiusmusic.com/.
janstett
Sep 13, 01:11 PM
Sheesh...just when I'm already high up enough on Apple for innovating, they throw even more leaps and bounds in there to put themselves even further ahead. I can't wait 'til my broke @$$ can finally get the money to buy a Mac and chuck all my Windows machines out the door.
I'm sure we'll see similar efforts from other PC manufacturers eventually, but let's see the software use those extra cores in Windows land. Ain't gonna happen...not on the level of what Apple's doing at least.
First, this is INTEL innovating, not Apple.
Second, Apple has been the one lagging behind on multiprocessor support. Pre-OS X, multi-CPU support in Mac OS was a joke of a hack, and apps had to be written with special libraries to take advantage of it.
On Windows, the scheduler automatically handles task scheduling no matter how many processors you have, 1 or 8. Your app doesn't have to "know" it's on a single or multiple processor system or do anything special to take advantage of multiple processors, other than threading -- which you can do on a single processor system anyway. Most applications are lazy and unimaginative, and do everything in a single thread (worse, the same thread that is processing event messages from the GUI, which is why apps lock up -- when they end up in a bad state they stop processing events from the OS and won't paint, resize, etc.). But when you take advantage of multithreading, there are some sand traps but it's a cool way to code and that's how you take advantage of multiple cores without having to know what kind of system you are on. I would assume OSX, being based on BSD, is similar, but I don't know the architecture to the degree I know Windows.
In Windows, you can set process "affinity", locking it down to a fixed processor core, through Task Manager. Don't know if you can do that in OSX...
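The Task Manager affinity trick described above has analogues elsewhere; here is a minimal sketch using Linux's `os.sched_setaffinity` (OS X has no direct public equivalent, so this is illustrative only, not an answer to the poster's question):

```python
import os

pid = 0  # 0 means "the current process"

# Read the set of cores the scheduler may use for this process.
allowed = os.sched_getaffinity(pid)

# Pin the process to a single core, mimicking Task Manager's affinity mask.
one_core = {min(allowed)}
os.sched_setaffinity(pid, one_core)
assert os.sched_getaffinity(pid) == one_core

# Restore the original mask so the scheduler is free to use all cores again.
os.sched_setaffinity(pid, allowed)
```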
gibbz
Apr 27, 08:13 AM
This is a lie
Keeping a database of our general location is logging our location. :mad:
No it isn't. They say they are not logging your location. This is correct. If it were incorrect, they would be keeping a database of your phone's exact GPS location. Instead, as they state, they are keeping a cache of the cell towers and wifi hotspots in order to aid the A-GPS system. So, no, they are not logging your (and by your, I mean an identifiable log) exact locations and beaming it home to watch you like big brother.
As has been stated a million times, there is likely a bug that wasn't culling the cache. It was also a dumb oversight to back up the file, and to do so unencrypted.
The overlord hyperbole is really silly.
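The missing cache-culling step mentioned above could look something like the sketch below. The field names and the 7-day window are invented for illustration; this is not Apple's actual schema or implementation:

```python
import time

# Drop cell-tower cache entries older than 7 days (a hypothetical window).
MAX_AGE_S = 7 * 24 * 3600

def cull(cache, now=None):
    """Return only the cache entries newer than MAX_AGE_S seconds."""
    now = time.time() if now is None else now
    return [entry for entry in cache if now - entry["timestamp"] <= MAX_AGE_S]

cache = [
    {"tower_id": "A", "timestamp": 0},          # ancient entry, should be culled
    {"tower_id": "B", "timestamp": 1_000_000},  # recent relative to `now`
]
fresh = cull(cache, now=1_000_100)
print([e["tower_id"] for e in fresh])  # only "B" survives
```

Run periodically (or on each write), a step like this would have kept the file small, which is the bug the comment refers to.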
inhrntlyunstabl
Apr 27, 09:54 AM
And I'm sure when the next Apple-gate story gets created, the blind fanbois will jump to their defense. :rolleyes:
Hey Birther, guess what else happened today?! :eek:
Too many conspiracy addicts out there. Let it go and live your life.
RebootD
Apr 6, 12:40 PM
Plus, to everyone saying 'digital distribution!': in the US we have data caps, and sending one Blu-ray-sized 2 hr movie (not compressed to hell into a 2-channel-stereo MKV) would eat up 1/4 of my monthly bandwidth per movie.
I agree that digital distribution IS the future, but we are a long way from having 100+ Mbps constant-stream broadband without caps as long as a handful of ISPs have all the control. So for now, Blu-ray is a wonderful alternative.
Let me be clear - FCS needs a robust Blu-ray authoring feature. We don't live in a wireless world where you can transmit video free over the air. We still put disks in a player to watch and also to preserve our video memories.
Not having a good Blu-ray authoring feature is a huge problem for Final Cut Studio. Not only does it impact professional wedding videographers, but also ordinary people who want to put their video on a disk to send to people. I can't just put my video on Netflix to have a friend watch it on his Roku.
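The "1/4 of my monthly bandwidth" figure above checks out as rough arithmetic. The movie size and cap below are assumed values typical of the era, not the commenter's actual plan:

```python
# Back-of-the-envelope check of "one movie = 1/4 of my monthly cap".
MOVIE_GB = 40   # assumed size of a 2 h Blu-ray-quality movie
CAP_GB = 160    # assumed ISP monthly data cap

fraction = MOVIE_GB / CAP_GB
print(f"one movie uses {fraction:.0%} of the cap")  # 25%
```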
dethmaShine
Apr 19, 02:35 PM
Wrong. Just because a company released one phone that has a similar look as the iPhone doesn't mean their current offerings are a progression of that phone. It's a true testament as to who browses this forum if you honestly think that. The F700 didn't run an advanced OS, so it probably ran Symbian or used BREW. That means all Samsung did was create a theme. How does a theme they made 3 years prior to the Galaxy S mean it's a progression on the coding and UI they built? It doesn't. Here's a list of every Samsung phone: http://en.wikipedia.org/wiki/Category:Samsung_mobile_phones Now, pick out one of those and say it inspired all of their new devices 3 years later.
The F700 was an iPhone clone with a keyboard. It's depressing that people are saying that the iPhone copied its own clone.
Just look at his post history and you'll understand that you are arguing in vain.
JoeG4
Nov 29, 12:56 AM
In other news: Universal thinks they're God.
Glideslope
Apr 25, 03:50 PM
I would bet anything that these two "customers" happen to also be lawyers.
They just can't earn clean money; they always have to rip someone off to earn it.
+1 ;)
Renegate
Aug 8, 01:32 AM
I don't know what there is to be underwhelmed about; the rumor has basically been that the main things being covered here would be the Mac Pro (which exceeded my expectations) and the first real glimpse at Leopard (which looks very cool from what I've seen). I didn't find either the Mac Pro or Leopard to be underwhelming, so I don't see anything that would make me feel underwhelmed.
I guess I would be underwhelmed if I had mistaken WWDC for Macworld or something, and expected a bunch of major new product announcements.
And don't forget they said : More things to be announced next week
iGary
Sep 12, 11:02 AM
The folks over at AnandTech have dropped engineering samples of the quad-core Clovertown into a Mac Pro - http://www.anandtech.com/mac/showdoc.aspx?i=2832&p=6
and it worked ... all eight cores were recognised.
The rest of the article was interesting too.
This will probably be the update I purchase next year - if it makes it into the Mac Pro - thanks for the link.
rdowns
Apr 28, 08:04 AM
Step out of your little fairytale world
I loves me some irony.