Bubba Satori
Mar 26, 12:23 PM
Is Apple moving to close the source on more and more of OS X ?
Yes, as more and more of iOS moves into OS X.
Blasterzilla
Apr 27, 08:24 AM
Great, glad Apple did something. Hope we can all move on now to bigger and better things.
Thanks again Apple.
marksman
Mar 31, 04:57 PM
Only if you do not add products like the iPad and the iPod Touch. In other words, if you throw out 50% of the iOS products.
I would add that I never understood the comparison of smartphones running Android to smartphones running iOS.
Neither Google nor Apple sells their phone operating system, and the Android spectrum is made up of 50 handsets from 10 different manufacturers who are in direct competition with each other. They are not one big group working together to take on Apple. It makes absolutely zero sense to make that kind of comparison.
It is just as weird as lopping off iPod and iPad iOS users...
If people want to compare smartphones, then compare actual sales of individual smartphones, each of which only uses one OS. People should not draw meaningless lines in the sand lumping all Android-based handsets together, because they are not together in any sense other than that they run Android. They might as well compare black phones to white phones.
I imagine if you made a chart of the top-selling smartphones of the last 5 years, it would consist of the iPhone 4, the iPhone 3GS, the iPhone 3G and the original iPhone.
Why not group smartphones by what kind of graphics chip they have or what type of memory chip they use? The OS is irrelevant. Nobody in the smartphone business is directly making money off any of these OSes; it is a stupid way to categorize smartphones.
Of course it happens because, if they didn't lump them together, it would look absurd, with Apple totally dominating the smartphone market with its latest phone every year while 100 commodity Android phones all hold tiny market shares just to be replaced by the next one.
How does HTC running Android benefit or relate to a Motorola phone running Android? It does not, at all.
Stella
Aug 7, 04:43 PM
Is Leopard going to take advantage of the 64 bit Dual G5?
What's the point? It's history.
My guess is that it's how Tiger is now.
M-O
Apr 6, 07:01 PM
Apple should forget Intel and put a quad-core A6 chip in the MacBook Air. Re-architect Mac OS to run on ARM (OS Xi) and rule the world.
It may sound crazy now, but you'll see. If anyone knows how to change architectures, it's Apple. We all know they've got OS X running on an iPad already in the labs.
wpotere
Apr 27, 09:34 AM
This is a witch hunt and won't end. The man has been our president for 2+ years now; they need to let it go. Just another reason that Trump is, and looks like, an idiot.
geerlingguy
Aug 16, 11:24 PM
When rendering in FCP, it's all about the CPU.
Fast hard drives contribute to real-time effects, but do NOT contribute to rendering.
RAM helps a little bit.
However, depending on what kind of rendering you're doing, the hard drive can be a limiting factor.
Say you're just rendering ten minutes' worth of a blur effect on video: the CPU says 'gimme all you got' and goes to town on the frames, blurring each one quickly. But the hard drive may have a hard time keeping up with the CPU, because 10 minutes of footage needs to be read, then re-written to the drive. For HD-resolution video, that can be a couple gigs of data. And that data also has to pass through the RAM (which acts like a high-speed buffer).
However, in the case of these benchmarks, one would think the testers would choose some more CPU-intense rendering, which would allow the hard drive to take its time while the CPU is overloaded with work.
But, to anyone configuring a graphics or video workstation: everything (CPU, hard drives, RAM, and even the GPU for some tasks) should be as fast and ample as possible. "A chain is only as good as its weakest link." If you pair up a Quad 3.0 GHz Xeon with a 5400 rpm USB 2.0 drive, you will be disappointed.
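The "couple gigs" claim above is easy to sanity-check with back-of-the-envelope math. This is a rough sketch only: the 25 Mbit/s bitrate and the 30 MB/s drive throughput are hypothetical numbers chosen for illustration, not figures from the post.

```python
def render_drive_time(duration_s, bitrate_mbps, drive_mb_s):
    """Estimate how long a drive spends just moving footage in a render.

    The clip has to be read once and the rendered result written once,
    so total I/O is roughly twice the clip size.
    """
    clip_mb = duration_s * bitrate_mbps / 8   # megabits -> megabytes
    io_mb = 2 * clip_mb                       # read + write
    return clip_mb, io_mb / drive_mb_s

# 10 minutes of ~25 Mbit/s HD footage on a 30 MB/s drive (made-up numbers)
clip_mb, drive_secs = render_drive_time(600, 25, 30)
print(f"clip ~{clip_mb:.0f} MB, drive busy ~{drive_secs:.0f} s")  # ~1875 MB, ~125 s
```

At those assumed numbers the drive alone eats around two minutes of the render, which is why a slow external drive can become the weak link even when the CPU is the nominal bottleneck.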
iMikeT
Aug 25, 03:48 PM
I tell you, I've had nothing but trouble with Apple. I'm young, I'm a medical student (so relatively affluent), and I'm a "switcher." That switching part, that was a mistake. Mac OS X is beautiful software, I love it. Unfortunately I've had a lot of problems with the hardware. These days it's enough I wish I still had my IBM/Lenovo laptop--that never gave me problems.
blahblah100
Mar 31, 05:03 PM
Ah, Linux trolls are my favorite :rolleyes: I've lost count of how many times I've answered a question and/or posted on something, only to have the random Linux guy show up and spout "Or just toss out your Mac/PC and install Linux on a new machine". Of course no one asked about Linux.
What?
zacman
Apr 20, 03:46 AM
And the design was released after the iPhone was out.
No, it was shown at IFA 2006 for the first time but "officially presented" a few months later.
samcraig
Apr 27, 09:10 AM
Side story: the credit card companies know exactly where I am better than the cell companies do. Every time I swipe my credit or debit card, they know where I am. When I travel for vacation, I am very likely to get a call from my credit card company (on my cell) asking where, when and how long I will be traveling. They know every store and every purchase I've ever made on a credit card.
Again - when you make a purchase - you know you're being logged. If you use cash - your CC company doesn't know where you are.
Apple's bug saved coordinates whether you had location services on or off. It's different.
The OPTION is what's important, and Apple agrees, hence the bug fix. If it was solely a "feature" - they would have stated that the file is required and they cannot offer a way to remove it, yadda yadda.
Those that still argue against the solution remind me of the threads on the iPad board. When it was suggested that the iPad needed a camera - so many people were screaming that it was ridiculous for the iPad to have a camera, citing form factor, useless feature, stupid suggestion, etc. I argued that having a camera makes sense, and for those that wouldn't use it - don't use it.
Same here. Apple will give (actually fix) the ability to turn location services on or off. Use it or not. I'm happy there's an OPTION.
iGary
Sep 13, 07:14 AM
DAMN :eek:
so 2-3 years from now, are people going to be asking "do I need a quad-core or an 8-core MacBook? Oh yeah, I'll mostly be surfing the web and maybe editing a photo once in a while" :rolleyes:
*waits for software to catch up*
shawnce
Jul 27, 07:04 PM
looking at reference systems - for $2049, Gateway's Core 2 Duo gets the 2.4GHz/4MB L2 cache Conroe, 2GB of RAM from the factory, an x1900 512MB graphics card, 320GB hard drive, card reader and DL DVD burner.
make sure to note that is an ATI X1900 CrossFire XT adapter
OrangeSVTguy
Apr 25, 04:23 PM
Guess we now know what that new data center is going to be used for.
Magrathea
Apr 6, 11:15 PM
You're aware the newest MBP (high end) 15" and 17" have 1GB of graphics memory, right?
Yes, but not Nvidia, so I don't think they can use the CUDA thing. Correct me if I'm wrong here, PP gurus.
fivepoint
Apr 28, 09:50 AM
Imagine that: three responses which utterly fail to refute, let alone dispute, my clear and truthful argument. Instead, they leave snide remarks. No substance WHATSOEVER. :)
Burnsey
Apr 27, 11:00 AM
http://www.freerepublic.com/focus/f-news/2711155/posts?q=1&;page=101
There you have it. The birthers aren't satisfied. I knew it.
If this birth certificate said that Obama wasn't born in the US they would be singing a different tune. Heck they would be singing a different tune given the tiniest most unlikely evidence that he wasn't born in the US.
CIA
Apr 7, 10:41 PM
As best as I can figure, it works like this. Managers get good grades if they sell certain amounts of products.
I'll use low numbers here. Let's say BB corporate wants you to sell at least 5 iPads a day to make your "Quota". One day, 10 iPads come in. You sell all ten, yay, you made quota for the day.
But the next day, none get shipped to the store. So, boo, you didn't make quota, since you didn't have any to sell.
So, if you get 10 the day after that, and not knowing if more are coming tomorrow, you sell 5, make quota, and hold the other 5 for the next day when, lo and behold, none get shipped to the store. You still have 5 left over to sell, which you do, and again you make quota for the day.
Basically the more days you make quota, the happier BB corporate is, and the better chance Mr. Manager gets a bonus down the road.
Mr. Manager (http://www.youtube.com/watch?v=O4DMPmoJkJQ)
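The hold-back scheme described above can be sketched as a toy simulation. The shipment pattern and quota are the same made-up numbers from the post; the function names are mine, not anything from Best Buy.

```python
def days_quota_met(shipments, quota, hold_back=True):
    """Count days a store hits quota, given lumpy daily shipments.

    With hold_back=True the store sells only enough to make quota and
    banks the rest for dry days; with hold_back=False it sells out.
    """
    stock, met = 0, 0
    for arrived in shipments:
        stock += arrived
        sell = min(stock, quota) if hold_back else stock
        if sell >= quota:
            met += 1
        stock -= sell
    return met

shipments = [10, 0, 10, 0]   # iPads arriving each day
print(days_quota_met(shipments, quota=5, hold_back=False))  # sell everything: 2 days
print(days_quota_met(shipments, quota=5, hold_back=True))   # hold 5 back: 4 days
```

Same total units sold either way, but holding stock back doubles the number of quota-met days, which is exactly the incentive the post describes.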
Gugulino
Apr 6, 01:00 PM
The quality of a Blu-ray film is superior to all forms of digital distribution over the internet, like iTunes for example, and it is a huge improvement over DVD. I can't understand why people still stick with DVD. Like Apple! Macs have no Blu-ray disc drive, only DVD. I cannot understand that!
When you have all these great HD camcorders and great movie-editing software on a Mac, why should you burn a DVD and lose most of the quality? Sure, you can upload HD movies to YouTube or Vimeo directly from iMovie, but it is not the same quality as if you burned a Blu-ray. At least the Mac Pro should have an option for a Blu-ray disc drive, and DVD Studio Pro should support Blu-ray authoring.
I hope Apple will take a step in this direction with the new FC Studio.
mmmcheese
Aug 11, 02:32 PM
Although I'd be interested in an Apple created phone (depending on what it turned out to be), I doubt they will come out with a CDMA version....so in the end I'll be SOL anyway...
samcraig
Apr 27, 10:05 AM
No they won't stand out in the data, because each cell tower or Wi-Fi hotspot is only included once in the database. And there is no information regarding how much time you spend in each location.
If locations are recorded AND time/date stamp - then how much time you spend in each location is tracked inherently. If you "log in" at one time here and then another 20 minutes later - there's a history of time spent. Maybe not foolproof... but to say that no information is there isn't accurate.
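A quick sketch of how dwell time falls out of a timestamped location log, as argued above. The tower IDs and timestamps are made up for illustration; nothing here reflects the actual format of Apple's consolidated.db.

```python
from datetime import datetime, timedelta

# Hypothetical log rows: (cell_tower_id, timestamp of the sighting)
log = [
    ("tower_A", datetime(2011, 4, 27, 9, 0)),
    ("tower_A", datetime(2011, 4, 27, 9, 20)),
    ("tower_A", datetime(2011, 4, 27, 9, 45)),
    ("tower_B", datetime(2011, 4, 27, 10, 30)),
]

def dwell_times(rows):
    """Lower-bound estimate of time spent near each location.

    Whenever two consecutive sightings hit the same tower, the gap
    between them counts as time spent in that area.
    """
    totals = {}
    for (loc_a, t_a), (loc_b, t_b) in zip(rows, rows[1:]):
        if loc_a == loc_b:
            totals[loc_a] = totals.get(loc_a, timedelta()) + (t_b - t_a)
    return totals

print(dwell_times(log))  # tower_A accumulates 45 minutes of dwell time
```

Even though no duration field is ever stored, the timestamps alone are enough to reconstruct how long you lingered near each tower, which is the point being made.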
mikewilder23
Jun 21, 03:43 AM
Well, looking forward to its launch... :)
canyonblue737
Apr 27, 07:58 AM
That's good enough for me.
Apple's only screw-up here was keeping the infinite database forever on your phone and backed up to your Mac. There was no reason to back it up to the computer and no reason to keep the data on the phone after it was passed to Apple (encrypted, de-identified, etc.), but I suspect the reason was simply "we weren't doing anything bad with it, so we never even considered we should delete it later."
Good job, Apple. Now let's move on to someone else, like freakin' Sony and their PlayStation network.
gnasher729
Aug 17, 03:57 AM
[QUOTE=jicon]Lots of stuff on Anandtech about the poor memory performance on the Intel chipset.[/QUOTE]
FB-DIMMs are not designed to give maximum bandwidth to one chip; they are designed to give maximum bandwidth to _four_ cores. Instead of having _one_ program running to test memory bandwidth, they should have started four copies of it and seen what happens. That is what the doubled front-side bus, buffered memory and two separate memory units are for. The biggest criticism in the past against Intel multi-CPU systems was that the memory bandwidth didn't scale; in the Mac Pro, it does.
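A minimal sketch of the "run four copies" idea, using Python's multiprocessing as a stand-in for a real memory-bandwidth benchmark like STREAM. The buffer sizes are arbitrary and the absolute numbers will not match a tuned C benchmark; the point is only that aggregate bandwidth should be measured across concurrent workers, not from a single stream.

```python
import multiprocessing as mp
import time

def stream_copy(n_bytes, reps, out_q):
    # STREAM-style copy kernel: repeatedly copy a buffer through memory
    # and report the sustained bytes/sec this worker achieved.
    src = bytearray(n_bytes)
    start = time.perf_counter()
    for _ in range(reps):
        dst = bytes(src)  # forces a full pass over the buffer
    elapsed = time.perf_counter() - start
    out_q.put(n_bytes * reps / elapsed)

def aggregate_bandwidth(n_procs, n_bytes=16 * 1024 * 1024, reps=4):
    """Run n_procs copies at once and sum their individual bandwidths."""
    q = mp.Queue()
    procs = [mp.Process(target=stream_copy, args=(n_bytes, reps, q))
             for _ in range(n_procs)]
    for p in procs:
        p.start()
    total = sum(q.get() for _ in procs)
    for p in procs:
        p.join()
    return total  # aggregate bytes/sec

if __name__ == "__main__":
    # If the memory system scales, the 4-way aggregate should come out
    # well above the single-copy figure; if it doesn't, they'll be close.
    print(f"1 copy : {aggregate_bandwidth(1) / 1e9:.2f} GB/s")
    print(f"4 copies: {aggregate_bandwidth(4) / 1e9:.2f} GB/s")
```

On a memory system like the Mac Pro's (dual FSB, two memory branches), the four-copy aggregate is the figure that exercises the design; a single-threaded benchmark by construction cannot show the scaling being claimed.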