Deimo
Jul 11, 11:17 PM
Here's a little list I put together last week of my predictions for the next 6 months or so of a roadmap (until Merom goes to 800MHz on its bus, so maybe 9 months):
Portable:
MacBook: Yonah through 1q, 667MHz bus Merom thereafter
MacBook Pro: Yonah through 3q2006, 667MHz bus Merom through 1q2007, 800MHz bus Merom thereafter
Desktop:
Mac mini: Yonah through 1q2007, 667MHz bus Merom thereafter
iMac: Yonah through 3q2006, 800MHz bus Conroe thereafter
Mac Pro: 1333MHz bus Woodcrest
zacman
Oct 7, 01:41 PM
Apple already seems to have lost some parts of the European market with the 3GS because they didn't add the features that are frequently used there (like HSUPA, (r)SAP, etc.). For example, GfK numbers showed that the Android-based HTC Hero outsold the 3GS in Germany.
WiiDSmoker
Apr 20, 07:47 PM
You obviously don't work in IT or know anything about how viruses are spread. Windows can get a virus just by being on a network with an infected machine, or by opening an email in Outlook from someone on an infected machine. I fix these kinds of issues for a living and see it all the time. The truth is it's insanely easy for viruses to get onto, and hide in, Windows. Windows allows the files to completely hide themselves even if hidden and system files are set to show. The only way to see them on an infected machine is to yank the hard drive and plug it into a Mac or Linux-based machine; then you can spot hidden infected files if you know where they are located.
So please, don't start with the "as long as users are smart" myth. It can easily happen to anyone; it's a flaw in the OS.
No, it's a flaw with being the market leader.
Naimfan
Apr 24, 11:55 AM
Not at all. I think anyone who identifies as a Christian is a Christian by definition. I just think that the lengths some go to to rationalise their beliefs are ridiculous. Why bother being a Christian at all if you are going to change some of the core tenets of the belief?
I mean, I heard the other day (second hand, so apply salt liberally) that some Christians are even changing the whole holy trinity thing so that it is less "way out there".
My general thinking on this is that if you can "interpret" so much of the Bible, then why do you need a centralised religion at all? Why isn't anyone who believes in a god (any god) a Christian, if the definition is so liberal? The only thing that seems constant in Christianity is that every denomination considers the Bible to be their holy book. Everything else, including the meaning, whether literal or interpreted, is completely up for grabs.
Perhaps you should define what you mean, then. Definitionally, to be a "Christian" generally means a belief in God, a belief that Jesus was God's son on earth, and a belief in the death and resurrection of Jesus as expiation of humanity's sins. Everything else is open to interpretation--even those denominations you think believe the Bible "literally" do not.
TangoCharlie
Jul 12, 02:50 AM
As even AI notes, there's not much difference between the two chips.
The cores for all the "Core 2" processors are basically the same, but the packaging is different. Using the Xeon 5100 in the Mac Pro makes sense because they are going to want to use dual-CPU (quad-core) configurations. Although this may not seem of much importance, the Xeon will cost a lot more, which is an issue.
I still maintain that there's a "hole" in the new line-up, which is that there isn't a single-CPU, high-clock-rate system. I think Apple needs a Core 2 Extreme based system with the Conroe XE CPU (initially 2.93GHz, then 3.2GHz).
Oh.... I think the recently introduced edu-iMac will keep its current Core Duo (Yonah) processor after the full iMac has been upgraded to Core 2 Duo. Another thing..... I think the iMac will get Meroms, not Conroes, so that Apple doesn't have to change the socket. (Which also implies that the top CPU speed we're going to see in the iMac will be 2.33GHz, leaving space for faster chips (2.4GHz to 2.93GHz) in a new enclosure.) :cool:
TennisandMusic
May 2, 11:43 AM
I'm well aware of UAC. UAC also just happens to be "that annoying popup thing" that has become extremely popular for users to disable entirely since the debut of Vista.
Uh huh. And OSX doesn't ask you to manually enter a password every time you install or change something? Windows only asks you to authorize...which is technically more "annoying"?
I actually don't know anyone who has ever disabled UAC.
Huge difference in my experience. The Windows UAC will pop up for seemingly mundane things like opening some files or opening applications for the first time, whereas the OS X popup only happens during install of an app - in OS X, there is an actual logical reason apparent to the user. It is still up to the user to ensure the software they are installing is from a trusted source, but the reason for the password is readily apparent.
I've never seen the UAC when "opening some files" and of course you get it when opening some apps for the first time, since those times are often akin to installing...you know, like when you install an OSX app and it requests your password?
So now the argument is that OS X's password requests are logical and thereby UAC is illogical? Yeesh. :rolleyes:
These are just computers, people. Not magic. They are here to help us get work done. Quit trying to prove your platform of choice is superior to someone else's platform of choice; it's really not worth it. ;)
nixd2001
Oct 12, 09:48 AM
Originally posted by MacCoaster
javajedi's Java and Cocoa/Objective-C code has been available here (http://members.ij.net/javajedi) for a couple of days. My C# port is available for examination if you e-mail me.
I was thinking of the x86 and PPC assembler produced for the core loops. I could bung the C through GCC and get some assembler on my windy tunnels, true, but I'm not geared up to do the Windows side of things.
edifyingGerbil
Apr 22, 08:41 PM
In science, when there is a dearth of evidence for something, you fail to reject the null hypothesis (which is that hypothesis X is incorrect).
If I wanted to make a claim about something, say that two bricks tied together will fall at the same rate as a single brick, I first have to make this my working hypothesis. The null hypothesis is that what I'm asserting is not true (in this case the null is that the bricks will fall at different rates). It's up to me to provide the evidence. If there isn't enough (or any) evidence, we fail to reject the null hypothesis.
When it comes to religion, it is the theologian who is making the claim. Thus, his working hypothesis is, "God exists." In searching for evidence, however, we come up with nothing. Thus we must fail to reject the null hypothesis, which is, "God does not exist."
Agnosticism is really the position that an affirmative statement on the matter of deities is impossible to know. It doesn't have a rational basis in logic or science, though it might make some people more comfortable with their skepticism.
Atheism is the position that, based on currently available evidence, there is no basis to consider any deity to be real. This could change as new evidence comes to light, of course. That is a quality you will not find in theism or agnosticism.
As I said in my first post, most atheists that I speak to don't put this much thought and care into their atheism. They just take it for granted that it won't be challenged.
How can you prove something's existence that exists outside of time and space? I don't think it's possible except through pure reason.
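The brick thought experiment a few posts up can be sketched numerically. This is purely a hypothetical illustration (simulated data, hand-rolled Welch t statistic; the fall time and noise level are made-up numbers), showing how noisy measurements of two identically behaving samples give a test no grounds to claim a difference:

```python
import random
import statistics

random.seed(42)

def simulated_fall_times(n, true_time=1.43, noise=0.05):
    """Simulate n timed drops: the true fall time plus gaussian measurement noise."""
    return [random.gauss(true_time, noise) for _ in range(n)]

# Same underlying physics for both samples: mass does not change fall time.
single_brick = simulated_fall_times(30)
tied_bricks = simulated_fall_times(30)

# Welch's t statistic for two independent samples.
m1, m2 = statistics.mean(single_brick), statistics.mean(tied_bricks)
v1, v2 = statistics.variance(single_brick), statistics.variance(tied_bricks)
t_stat = (m1 - m2) / ((v1 / 30 + v2 / 30) ** 0.5)

# A small |t| (roughly below 2 at the usual 5% level) means the data give
# no evidence of a difference between the two samples' fall times.
print(f"t = {t_stat:.3f}")
```

Whether you phrase the conclusion as "fail to reject" or "no evidence of a difference", the point stands: absence of evidence for the claim leaves you at the default position.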
NebulaClash
Apr 28, 08:25 AM
What are tablets going to overtake? I just don't get it... Desktops? Laptops?
I can see hybrid solutions, like the ASUS EEE Tablet. But they are not NEARLY powerful enough to run certain applications. I just don't see large businesses, such as the government, replacing laptops and desktops with tablets!? Not in the next 10 years, DEFINITELY.
Got it, it's a definite prediction.
What are tablets going to overtake? Yes, desktops and laptops. In 2020 the average person will buy a tablet as their dominant computer. Techies will still use traditional technology such as PCs, and specialists will continue to do so, but since there are FAR more average people than such specialists and techies, the number of tablets sold in 2020 will exceed the number of traditional PCs. That's my prediction.
LQYoshi
Apr 11, 11:01 AM
Unlikely, but you can install Lion on an external drive and boot from that when you want to.
B
Would it be considered switching if I bought the mini? I'll still have a few laptops which I'll be using with XP, but then again, I can just VNC to the OS X Mac mini.
Sydde
Apr 22, 08:50 PM
Atheists often, rightly or wrongly, seem to count agnostics in their number, much as Blues is classified as a part of Jazz (wrongly, IMO).
This document from census.gov (http://www.census.gov/compendia/statab/2011/tables/11s0075.pdf) looks to me like it is showing a fairly steady increase in unbelief, which can only be a good thing.
On this forum, there only appear to be a lot of atheists because they tend to be outspoken, put forth strong arguments (the strength of which may be a matter of opinion), and respond quickly to religious nonsense.

G58
Oct 15, 07:39 AM
Some conventions are worth adopting, if only for the reasons they are created. For instance, when writing in the English language, the convention is to begin at the left, with each sentence starting with an upper case letter.
Now, I have no evidence to guide me here, but I suspect you're either lazy, or the shift key has broken on your keyboard. PCs do tend to ship with poor, cheap keyboards based on a thirty-year-old design.
But the important thing is that even if your points were in some small way credible, by presenting them the way you have, you've made their credibility harder to discern.
Thank you for participating. The exit is on the left and the keyboard repair service is next to the typing 101 class.
However, I love Google for many reasons. None of them is that they make great hardware, support great software or hardware, or understand how to do any of these.
Google's support of Android is both admirable and, to a large extent, altruistic, as well as an attempt to expand into other markets. But like Amazon, they don't understand the game. The Kindle, for instance, is actually useless as a textbook medium, yet this hasn't stopped Bezos from hawking it as such.
Apple's iPhone works because it has lineage, in terms of history, hardware and software development, and integrity, as well as reliability, developer support and marketing advantage. iMac begat PowerBook Ti, begat iPod, begat iPhone. NeXT begat Darwin, begat Mac OS X, begat iPhone OS. None of this is an accident. Apple designed this process. And they began in 1997 - if not earlier.
Android only began as a techie wet dream and is the 21st Century answer to the Kibbutz, or workers' collective. Both were very optimistic ideas with worthy ideals. But both failed because they relied upon a greater input of encouragement and resources than they were ever capable of producing in terms of meaningful contribution or profits.
I'm sure there may well come a day when there are 125,000 developers working on Android applications. There may even be 85,000 applications available for the Android platform too - from some dark corners of the net. But no matter how many manufacturers jump on the Android handset bandwagon, none of them will come close to creating a coherent user-base, or to matching Apple's business model.
And that, my dear typographically challenged friend, is the key here. Ultimately, numbers are irrelevant if they only represent a fragmented 'diaspora' of the Android faithful. The sum total will only ever be quotable as a statistic.
the reason this topic has gotten so long is due to the fact that most apple fans have no idea what they're talking about..
they love apple and they will defend it to the death, even when their argument has no logic..
this has nothing to do with which product is better..
it's the simple fact that android will be available on a greater number of handsets compared to apple..
you guys need to look at the Microsoft vs Apple situation..
regardless of what you prefer or believe is a better product,
the one that makes software and licenses it out dominates the market share
you really must have a thick skull not to understand that..
Hisdem
Mar 15, 01:39 PM
Are you drunk?
Looks like it. And BTW, I don't think the Japanese people would think leaving their homeland and going to the USA is a good idea. Not saying they don't like the US, but generally, just generally, people tend to care more about their own countries and cultures than about the American ones. Just saying.
paulvee
Oct 26, 10:59 AM
I'm actually pretty thrilled with my new Dual 3.0 Xeon. Should hold me in good stead for a couple of years of heavy video crunching and motion graphics, as well as Photoshop once it goes native. In the meantime, I use my Dual G5 2.0 for that.
And when the Octos get updated in a year and a half, I can be the first to jump on that bandwagon.
Stelph
Apr 21, 05:37 AM
I love the title simply because it reads like its discussing Steve Jobs' involvement in fragmenting Android :D
moogleii
Apr 5, 10:16 PM
Can't just hit Delete? Can't move up a level in the directory structure? Yikes.
Ya know what? These may all be little things individually, but collectively I think they'd drive me nuts.
I'm still on Vista... maybe going to Windows 7 might be the smarter move in my particular case.
Thanks for your help everyone, I sincerely appreciate your input.
Gotta do some serious thinking about this...
You can delete from the keyboard: Command+Delete. I prefer it because an accidental Delete press won't throw up a prompt that you have to answer if you weren't meaning to delete anything (the little things, as you say). On Windows, I never delete anything unless I'm sure, so I Shift+Delete everything anyway (been doing that for years and still no regrets!).
Note, there are two delete keys on a mac keyboard, which is what is probably confusing thatsallfolks.
Also, if you enable "show path bar" in Finder, you can see the entire path you're in, and easily jump around.
It was weird at first, but now I actually prefer having an application's menu separate from the application's windows. You can close all of an application's windows and not close the app. Sounds kind of pointless, but sometimes I'll accidentally close all the windows of an application under Windows, which is basically a full quit of the app, so then I have to relaunch it, which is not always a trivial amount of time. Also weird at first was the reversal of the Ctrl key with OS X's Cmd key, but I prefer it now too, because doing crazy key combinations is much easier with the thumb than with the pinky.
The biggest gripe I have is the inability to cut and paste files. I've gotten used to it, but if it's a huge deal, there's an app that mods Finder, I believe, to add a cut operation. I also prefer using keyboard shortcuts whenever possible, and Windows seems to be better in that respect, although I'm always learning about new keyboard shortcuts in OS X.
For what it's worth, I've been a PC user for the past 17 years. I grudgingly bought a mac a few years ago in order to mess around with Xcode. It took about 1 month to become fully used to the differences between osx and windows, but after that, I solely used the Mac for quite some time.
I eventually upgraded my pc to windows 7, and now I spend about 50% on each. Windows 7 is pretty nice, but it still feels a bit less organized than OS X (just look at Win7's control panel, yeesh; I end up just using the run command or ctrl-fing).
Btw, OSX upgrades have traditionally been very cheap. I upgraded from Leopard to Snow Leopard for $25.
Huntn
Mar 13, 07:18 PM
'Renewables' are hardly without issue either. To make a decent amount of power you have to do it on a massive scale. What are your thoughts on the Chinese Three Gorges Dam?
Solar plants can be put out in the scrub; they don't destroy what can be some of the most beautiful places on Earth like dams do, and they have much less land impact.
rickdollar
Apr 13, 12:57 AM
I need more information before I can form an opinion about this.
Sorry, this is MacRumors. No rational statements are allowed. It's in the rules.
joepunk
Mar 11, 01:16 AM
Just heard about it on CBC late night news. Terrible.
Bill McEnaney
Mar 26, 01:44 PM
To be fair, I knew what you meant with your comment, but frankly there wasn't any sarcasm in my statement. You were attempting to defend your earlier poorly-constructed post, and I was bemused by it.
I'm sorry I misinterpreted your post, SC. But if you put your mouse cursor on this :rolleyes: smiley, you'll see the word "Sarcastic."
rasmasyean
Apr 22, 11:47 PM
It's believed that the Higgs boson exists, but as yet there is no proof of its existence. Despite this, respected physicists continue to try to prove its existence.
There are many things we believe in the existence of despite lack of tangible proof.
The Higgs Boson is something that is speculated to exist based on mathematical models and observation of other properties in theory. Therefore they try to "look for it" in order to confirm their models.
Einstein's special relativity was also speculated to exist based on mathematical models. And there was no way to observe and "prove" that those phenomena exist until modern equipment was invented... like GPS.
Even when Einstein derived that light travels in "particles", it explained a lot of things, but it isn't really until now that we use "photons" to bombard atoms to do quantum mechanical work...like solar panels. But they were derived to exist based on some other doctrine that works in real life (not just your mind).
There is a line between using an established doctrine to determine something can exist vs. "faith" in something that exists with no basis to draw upon other than some book written thousands of years ago...presumably. That's why it's called "faith".
MacBoobsPro
Oct 26, 10:36 AM
16 cores in 2007
32 cores in 2008
64 cores in 2009
128 cores in 2010
You want to wait 'til 2010 at the soonest? :rolleyes:
4 years. Can't wait. My emailing exploits will just zip along.
How many chips would it span though?
randyharris
Sep 20, 12:52 AM
What most bothers me about the iTV is that it is a workaround for a PVR instead of embracing it.
I'm looking for an integrated system for music, movies and TV; not just downloading a show as needed, but with the inclusion of a full-blown PVR.
I don't think this is too much to ask for.
Bill McEnaney
Apr 27, 04:35 PM
No gods exist. There is not a shred of evidence, ontological or otherwise.
Before Anton van Leeuwenhoek discovered bacteria with his microscope, many probably would have insisted that there was not a shred of evidence that any microbe existed.