doomlordVII

Members
  • Posts

    24
  • Joined

Reputation

10 Good
  1. Sorry, but the official numbers put Mac OS X's share at about 6%, +/- about 1%. http://en.wikipedia.org/wiki/Usage_share_of_operating_systems http://www.netmarketshare.com/os-market-share.aspx?qprid=9

Regardless of source, the general consensus has Windows with OVER 90% market share. Now, as we know, gamers don't even come close to making up half of the world's computer users - maybe 2 in 10 computer users play games. That means roughly 18% of all computer users are Windows gamers, and only about 1.2% are Mac gamers. The audience on Windows is roughly FIFTEEN times greater than on Mac. Then add in that most gamers run Windows in one form or another (multi-boot, Boot Camp, etc.), and the Mac share shrinks even more.

Also, I'm willing to bet that "random tech support guy" probably has access to general Blizzard/WoW stats. Chances are he pulled those up and saw that of all their clients out there, only 6-8% run Macs. Heck, it doesn't even have to be user stats - he could just look at the percentage that pull down Mac patches. The fact is, it's just not profitable.

Would it be cool for all games everywhere to come out on every OS? Yes, absolutely. The thing is, the vast majority of games use DirectX as their render path. DirectX is a Microsoft platform, so it's not going to run on Mac. Now, let's look at the engine SWTOR uses: the Hero Engine. Officially, it ONLY supports Windows. Why? Because it renders through DirectX. The funny thing is, BioWare didn't even make this engine. So to get it running on Mac, they would need to either rewrite the ENTIRE engine to run on OpenGL (a conversion is not easy), or move to a new engine that runs on OpenGL (also not easy). Either way, the insane amount of effort needed to get SWTOR running on Mac just isn't worth it.

If you're going to game on PC, you're going to need to run Windows. That's just how it is. And what ways would that be? Through some of those terribly restrictive, closed-source "iSuite" programs?
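The back-of-the-envelope audience math above works out like this. The shares are the rough figures cited in the post (not exact data), and the 2-in-10 gamer rate is the post's own assumption:

```python
# Rough audience-size estimate using the approximate shares cited above.
windows_share = 0.90   # ~90% of computer users run Windows
mac_share = 0.06       # ~6% run Mac OS X (+/- 1%)
gamer_rate = 0.20      # assumed: ~2 in 10 computer users play games

# Fraction of ALL computer users who are gamers on each OS.
windows_gamers = windows_share * gamer_rate   # ~18%
mac_gamers = mac_share * gamer_rate           # ~1.2%

print(f"Windows gamers: {windows_gamers:.1%} of all users")
print(f"Mac gamers:     {mac_gamers:.1%} of all users")
print(f"Audience ratio: {windows_gamers / mac_gamers:.0f}x")
```

Note the gamer rate cancels out of the ratio, so the "fifteen times larger audience" conclusion only depends on the 90% vs. 6% share figures.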
  2. KoTOR wins everything: best story, best gameplay, best voice acting (NOTHING touches KoTOR). KoTOR 2 is just bad by comparison. It was rushed, so its story made very little sense, and in general it was just less polished. It could have been good, but it wasn't. I'd put it after SWTOR - it's not TERRIBLE, it's just not KoTOR. Basically: KoTOR > SWTOR > KoTOR 2
  3. It actually used to work with the HUD up. It changed after the third-to-last beta weekend, when wearing a hood still allowed a mask to show. In the last beta weekend it magically changed to hood OR mask, but not both, which is what we have now. http://img442.imageshack.us/img442/8201/hoods.jpg
  4. Yes, it's DX9-based. Basically, BioWare would need to re-QA a new engine, or a new version of the engine (to make sure ALL the assets work in OpenGL), then re-release the game. ANY custom shaders would need to be remade, as the DX path is entirely different from OpenGL. Meanwhile, with Windows you can target 90%+ of computer users (not all are gamers, but still), while with Mac you can only target between 3 and 6%. The market share isn't there. It makes no sense. Meanwhile, you could just Boot Camp into Windows and use the same version of the game. It costs the devs no money and requires zero effort on their side. Which sounds better?

LOL. My i7 2600K @ 4.8GHz (OC), 8GB DDR3 @ 2000MHz, GTX 570 SLI disagrees. So does my laptop (i7 740QM, 8GB DDR3, GTX 460M), my 2nd rig (i7 920 @ 3.6, 12GB DDR3, GTX 295), and my 3rd rig (i7 930 @ 4GHz, 6GB DDR3, GTX 260+). All built by me (well, the laptop is a SAGER). Try running Cinema 4D (rendering on all 8 threads), a few games, Photoshop, 8 browser tabs, some Blu-ray movies, SWTOR, CryEngine 3 AND music all at the same time. Why? Because I need to model assets for a game in CE3 (C4D for models, PS for textures), a movie for reference, SWTOR because I play games, music because the movie is muted, and the internet because I read forums, news, etc. Good luck.

Macs and PCs use the same software for graphic design. Let's say you're making a movie. You'll probably be using Photoshop for textures and 3ds Max (or Maya, or C4D) for modeling. These programs run EXACTLY the same on both OSes. So it comes down to hardware, and Apple is FAR behind on hardware. They're running 1st-gen i7s (2008 hardware) that CAN'T be overclocked, while PC is already on 2nd gen (almost 3rd - can't wait for Ivy Bridge). The RAM is at BEST 1333MHz, while on PC you can get some insane 2133MHz or 2400MHz RAM. Also, it's dual vs. quad channel, so you really can't win the bandwidth argument either. The GPU is at BEST ONE 5870. Meanwhile, on PC you could have four 7970s (or GTX 780s when they come out, or GTX 580 3GB models). Not only are you limited to ONE card (or two... if you get 5770s), you're also TWO generations behind. So the whole "better for graphic design" thing is just a terrible, terrible argument that shouldn't be used.

2.4GHz vs. 2.0GHz on low-load programs? You'll be CPU-limited, so the 20% higher clock is obviously going to give better results. Go find a review of the i7 2700K, 2600K, 2500K, 3960X or any of those CPUs in a Mac vs. PC environment. Oh wait - Mac doesn't support anything past 1st gen.
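The dual vs. quad channel point above can be made concrete. Theoretical peak DDR3 bandwidth is channels x 8 bytes per transfer x the transfer rate; the "MHz" figures quoted for DDR3 kits are really mega-transfers per second:

```python
# Theoretical peak DRAM bandwidth: channels * 8 bytes/transfer * MT/s.
# (Each DDR3 channel has a 64-bit = 8-byte bus; "DDR3-1333" means
# 1333 mega-transfers per second, not a true 1333 MHz clock.)
def peak_bandwidth_gb_s(channels, mt_per_s):
    return channels * 8 * mt_per_s / 1000  # GB/s

mac_dual_1333 = peak_bandwidth_gb_s(2, 1333)  # ~21.3 GB/s, dual-channel DDR3-1333
pc_dual_2400 = peak_bandwidth_gb_s(2, 2400)   # ~38.4 GB/s, enthusiast dual-channel kit
pc_quad_1600 = peak_bandwidth_gb_s(4, 1600)   # ~51.2 GB/s, X79 quad-channel

print(mac_dual_1333, pc_dual_2400, pc_quad_1600)
```

These are theoretical peaks, not sustained numbers, but the ratio between configurations is what matters for the argument.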
  5. The thing is, the ultra-high-end enthusiast doesn't care. Those with beefy enough rigs should be allowed to access the maximum settings. Yes, it won't be usable by most people, but that's not the point. Some people drop thousands on rigs with very high-end parts (my rig is pretty good, and not even close to the best available). When I'm playing SWTOR, I'm using about 60% of my VRAM and getting 100-200fps @ 1920x1200.

Since the textures already exist, we should be able to enable them via some method in order to fully use the computing power available. It doesn't need to be via the in-game settings - there would be too many forum complaints about low FPS/crashing due to insane VRAM usage and so on. Simply add a config line like "texturequality = 3" to represent true 'high' settings (the current high is really medium). That way those who know what to expect have access, while those with no clue about the settings won't accidentally get 5fps.

Look at Crysis (I will never stop using it as an example). It came out and destroyed EVERY SINGLE computer available. It just couldn't be maxed out. Most people played at medium settings and got 30fps. However, there were still a few with quad-core CPUs overclocked to 4.5GHz, 8GB of RAM and quad-SLI 8800 GTX setups who ended up writing custom config files to surpass the maximum settings. Years later, with most high-end rigs getting 100fps or so, most use extreme configs that surpass the default "very high" by a lot.

And yes, I KNOW it's possible. Entities in cutscenes are flagged for high-res textures, then de-flagged once the cutscene is over. Knowing that, it shouldn't be too hard to just leave that flag set to 'on' all the time. That, or someone will simply find a way to force it (replace the medium texture files with the high ones, etc.).
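As a sketch of what that kind of opt-in override could look like, here is an ini-style settings tweak done in code. The section name, key name, and value mapping are hypothetical - the post's "texturequality = 3" is a proposal, not a documented SWTOR setting:

```python
import configparser
import io

# Hypothetical example of forcing a hidden texture tier in an ini-style
# settings file. The [Renderer] section and TextureQuality key are
# assumptions for illustration, NOT documented SWTOR options.
settings = configparser.ConfigParser()
settings.read_string("[Renderer]\ntexturequality = 2\n")  # shipped "high" (really medium)

settings["Renderer"]["texturequality"] = "3"  # hypothetical true-high tier

buf = io.StringIO()
settings.write(buf)
print(buf.getvalue())
```

The point of a config-only switch is exactly what the post argues: users who go hunting for it know what VRAM cost to expect, while the default UI never exposes it.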
  6. Did you, or did you not, run FurMark or one of the three CPU stress tests? Until you run one of them, ALL your complaints are invalid, as it's likely a bad cooling solution. Yes, liquid cooling CAN be bad: improperly applied TIM, terribly done loops, hell, even bad blocks.

The FPS drops on the station because the game uses TWO threads. It's a CPU bottleneck. I can't explain this any simpler. CPU bottleneck = GPU waiting on CPU = lower FPS = lower GPU load = lower GPU temp. CPU bottleneck = higher CPU load = higher CPU temp. If you overclock (if you want to), you'll get higher FPS on the station. I'm currently at 4.8GHz, and I can FINALLY get zero FPS drop when walking around the station.

Yes, the engine is badly coded. However, high temps are NOT a result of bad coding. High resource usage combined with bad cooling IS.

@ your edit: Yes, I know some cards overclock tons. Others DON'T. For example, I used to have an 8800GT a few years back that couldn't get its VRAM overclocked by even 50MHz. It's called a bad yield. This is why some CPUs of the same series can overclock extremely well (a so-called golden chip), while others overclock extremely badly. The reason cards generally CAN overclock fairly well is simply a 'safe zone' for yields - NOT temps. If they set the default clocks fairly conservatively, they get a high yield of chips that can run at that speed. If they set the default clocks to 90% of the theoretical limit, their yield might be 20% or lower. That would be terrible business.

There is no such thing as a 'faulty' engine. There are badly coded ones that can't multithread, have memory leaks and so on, but that's not a 'faulty' engine.

Run IBT, LinX, Prime95 or FurMark. These are all standard tools DESIGNED to push your hardware to 100% load. The CPU ones put 100% load on ALL cores, with Prime95 putting tons of load on the RAM controller as well. Yes, it's an unrealistic scenario, but it's a GREAT stress test that ANY stock system should pass just fine. My bet is that if you run FurMark, IBT, LinX or Prime95, after 5-10 minutes you'll see CPU and GPU temps WELL over what you see in SWTOR.

Nope. Nvidia's "max safe" temp is 105C, which is conveniently where the GPU automatically throttles. I'm not sure about ATI, but it's probably similar. CPUs have a thing called "Tj max," which is basically the temp at which the CPU starts to throttle down - on an i7-series CPU it's about 98C. Again, not sure about AMD. 50-60C temps on a GPU or CPU core are 100% safe; they are designed to operate at those temps. Will it shorten the lifespan? Yes. By much? No. It might make your CPU die a year earlier, but if you run your CPU at stock speeds (stock voltage) and have good cooling on it, there's no reason it couldn't last 10-20 years. Chances are you won't be using the same CPU in even 5 years, so it's really an irrelevant point.
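The CPU-bottleneck chain above (GPU waiting on CPU = lower FPS = lower GPU load = lower GPU temp) can be shown with a toy frame-time model. It assumes a simple serial dependency - the GPU can't start a frame until the CPU has prepared it - and the millisecond figures are made up for illustration:

```python
# Toy model of a CPU bottleneck: frame time is set by the slower stage,
# and the GPU idles for the rest of the frame (idle GPU = cooler GPU).
def frame_stats(cpu_ms, gpu_ms):
    frame_ms = max(cpu_ms, gpu_ms)      # slower component sets the pace
    fps = 1000.0 / frame_ms
    gpu_load = gpu_ms / frame_ms        # fraction of the frame the GPU is busy
    return fps, gpu_load

# Fleet station scenario: two render threads choke the CPU -> CPU-bound.
fps, load = frame_stats(cpu_ms=20.0, gpu_ms=5.0)
print(fps, load)   # 50 fps, GPU only 25% busy

# Overclock the CPU so its per-frame work drops: fps AND GPU load rise.
fps, load = frame_stats(cpu_ms=16.0, gpu_ms=5.0)
print(fps, load)   # 62.5 fps, GPU ~31% busy
```

Which is exactly why overclocking the CPU raises station FPS while the GPU stays relatively cool.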
  7. Holy gawd, this guy is bad.

1. System overheating is caused by two things: heavy load and bad cooling. ALL computer parts are designed to cool themselves. At stock, a Core i7 2600K will run fine at 100% load on the stock cooler with ZERO problems. The same goes for ANY GPU on the market. So if it overheats, the USER has done something WRONG: not enough case fans, too much dust in the heatsink, etc. Run FurMark for GPUs and watch your GPU temps - you'll see them go MUCH higher than in SWTOR. Run Intel Burn Test, LinX or Prime95 for your CPU and you'll also see the temps go much higher than in SWTOR.

2. That was an error in the Nvidia drivers a while back that caused the fans to be stuck at idle. That was caused by DRIVERS, not a game. They fixed it. If you're still using those drivers: USER ERROR.

3. Fans are controlled by drivers. Only programs with driver-level access (overclocking tools) can alter the fan speed. NO GAME CAN DO THIS. Yes, if the fan is stuck at some low speed while the GPU is at 100% load, the card can be fried. But to get there you need either bad drivers, a bad card, user error in overclocking software, or a terrible, terrible GPU BIOS.

4. No, that's just wrong. All parts can run at 100% load. 70% (or 80%, depending on the card) is just the cap when manually setting the fan speed. Cards can still reach 100% fan speed, but ONLY when the card exceeds a specific temperature. They cap the manual fan speed at 70% or 80% to prevent noobs from running it at 100% 24/7 and burning out the fan motor.

Because all reference parts are almost identical: if my GPU idles at 42C (which it does, right now - card #2 at 38C, stock coolers) and yours idles at 60C, something is wrong with YOUR setup. Either bad fan-speed settings, bad case cooling (I'm using an 800D, i.e. a terrible air case), or dust in the heatsink.

All this talk of "software that tampers with your system" is 100% bull. The ONLY things that can do that are some form of terrible virus (I don't know if that even exists) or software specifically designed to do it (i.e. overclocking software like MSI Afterburner, RivaTuner, etc.). The game DOES NOT control GPU usage, GPU fan speed, CPU usage or CPU fan speed. It is given a set of parameters (run the game at X resolution, Y settings), and SOMETIMES an FPS cap (usually 60fps due to vSync), and it uses the computer accordingly to fulfill those settings. If the computer is 'imbalanced' (i.e. good CPU, bad GPU), the CPU spends time waiting on the GPU to finish rendering each frame. This in turn puts MORE load on the GPU, as IT has nothing to wait for. If vSync is ON and the card can render at 60fps, it WILL wait for vSync. If vSync is OFF and you're getting 80fps, the card CONTINUES to render at 100% load, simply because it has been told to render as fast as possible. You can flip that around with the CPU as well, where the GPU waits on the CPU, causing heavy load on it. This can cause the same problem.

BEFORE YOU REPLY: RUN FURMARK AND PRIME95 (or IBT, or LinX), THEN POST YOUR TEMPS AFTER 5 OR 10 MINUTES.
  8. Holy wow. People here are computer noobs. A 2600K, especially OC'd, is strong enough to remove the CPU bottleneck this game likes to give people. As most people know, you will always be limited by the slowest component - in this case, your GPU. Without vSync or any other FPS limiter, you're telling your GPU to render as much as possible, as fast as possible - i.e. USE 100% OF THE GPU. There is NOTHING wrong with what you've described: it's SUPPOSED to be running at 100% GPU usage. 100% GPU usage generates a LOT of heat, which requires a lot of fan.

Anyway, @ OP: the reason comps are "dying" is BAD COOLING. A game, or any software that is NOT overclocking software, CANNOT KILL YOUR COMPUTER IN ANY WAY, SHAPE OR FORM. IT IS PHYSICALLY IMPOSSIBLE. If your hardware cannot run at 100% load, it is INCORRECTLY SET UP - i.e. USER ERROR, i.e. BAD COOLING. Go download FurMark and run it for 30 minutes. I guarantee it will run your GPU FAR hotter than SWTOR. Download Prime95, Intel Burn Test or LinX and you'll get the same thing on your CPU.

Please, BEFORE posting about how "SWTOR killed my computer" or whatever, think for a second! If your computer is running too hot, run some benchmarks (hell, 3DMark 06/Vantage/11 are all good, and free). You WILL (100% sure) see your temps are high, and that means there is something wrong with YOUR computer. And before someone goes "this guy is a noob," go look at my rig. My rig
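The difference an FPS limiter makes can be sketched with a little frame-budget arithmetic: a cap gives the GPU idle time each frame, while an uncapped loop starts the next frame immediately and keeps the GPU pinned at 100%. The numbers below are illustrative:

```python
# Minimal frame-limiter sketch. With a cap, any time left in the frame
# budget after rendering is spent idle; without one, there is no idle
# time and the GPU runs flat out.
def sleep_budget_ms(target_fps, render_ms):
    """Idle time (ms) left in a frame after rendering takes render_ms."""
    frame_budget = 1000.0 / target_fps
    return max(0.0, frame_budget - render_ms)

# At a 60fps cap the budget is ~16.7 ms per frame. A frame rendered in
# 10 ms leaves ~6.7 ms of idle time; a 20 ms frame leaves none, so the
# GPU is the bottleneck and runs at full load anyway.
print(sleep_budget_ms(60, 10.0))
print(sleep_budget_ms(60, 20.0))
```

A real limiter (or vSync) spends that budget waiting rather than rendering, which is why capped games run the GPU cooler than uncapped ones.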
  9. While I do agree with this (actually, more like '04-'05), there's no way BioWare could have gone with up-to-date graphics. We already have a fair number of people complaining about low FPS - imagine if the game had been built on, say, CryEngine 3. People would complain that even on ultra-low they couldn't get more than 20fps on an 8800GT and a dual-core CPU. Also, up-to-date graphics go out of date insanely fast. Look at CoD4, which people used to say looked "awesome" - it's terrible now. The same goes for a lot of other games: they look great for a while, but as soon as something better comes along, it's hard to compete.
  10. Yep, the game CANNOT use more than 4096MB (4GB) of RAM - as a 32-bit client, it's physically impossible. That figure might be your TOTAL system RAM usage, in which case there is still something wrong; check your running processes and so on. As for the CPU, there's little that can be done. The 100% usage is a bit insane, and if it's not some other process hogging system resources, you've basically got two options: 1. Overclock like mad (a 2nd-gen chip could hit 4.5GHz, a 1st-gen probably 4.2GHz). 2. Get a new CPU (if it's 2nd gen, it's LGA 1155 - you can go to an i5 2500K, which is great; on 1st gen you could get an i5 or something).
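The 4GB ceiling falls straight out of a 32-bit address space, which the quick arithmetic below shows:

```python
# A 32-bit process can address at most 2**32 bytes = 4096 MB, hence the
# hard 4 GB ceiling on what a 32-bit game client can ever use.
address_space_bytes = 2 ** 32
address_space_mb = address_space_bytes // 2 ** 20
print(address_space_mb)  # 4096

# In practice the usable share is lower still: on 32-bit Windows a
# process typically gets 2 GB of that space unless the executable is
# marked large-address-aware.
```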
  11. Oh wow. I love the people complaining about all the tech issues when in fact it's nearly ALL user error.

Low FPS? Your rig isn't up to the task. Drop the settings, overclock, etc. I couldn't run Crysis on release either - I had to run at medium @ half the resolution I normally play at. I didn't complain, though, because it was my fault, not the devs'.

Jumping around a lot (i.e. lag)? Get better internet, or play on a server that's closer. It's perfectly playable with ~110 ping, and great with sub-100 ping. Go to pingtest.net and test how reliable your ISP is.

Comp shuts off randomly (lockups, etc.)? Definitely your fault. Your drivers might be bad, you might have a bad install (stuff has gotten corrupt - rare, I know), you might have overheating parts, your overclock might be unstable (mine was - dropped my GPU by 20MHz, all fixed). Might even be bad hardware (I've had one of my HDDs die on me).

"I can't click on X mineral!" So, what you're saying is that because the devs forgot to re-flag a few out of literally THOUSANDS of items in the game - none of them game-breaking - you're willing to cancel? Games have bugs. Software has bugs. You should see some of the crap beta testers in other games have to deal with (having to MANUALLY dump shader caches).

"My BH ran into a game-breaking bug!" OK, this IS a valid complaint. Being a key story element, players rely on it working. This type of thing should have been caught in beta, and might have been, but was missed by the devs. My only advice would be to wait, though I know some of you might not be willing to do that. There's not a lot a user can do here.

Basically, unless the error is "Quest X CAN'T be finished due to a game-breaking bug," you have no reason to complain. Broken side quests are to be expected in a game this big. I've seen a few cutscenes with no audio. Crashes are user error. FPS issues are 99% of the time user error (in some cases the devs simply overloaded an area with draw calls, or didn't flag for culling, or have memory leaks, etc.).
  12. Go with whatever the GTN recommends, then round up to the nearest hundred. So if it says "X is worth 432 credits," sell it for 500. Don't go much higher than that yet, as the markets haven't reached their potential. I'm sure the 'market' value will settle closer to 2x the recommended price.
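The rounding rule above is just a ceiling to the next hundred, e.g.:

```python
import math

# Round a GTN-suggested price up to the next hundred, per the rule above.
def list_price(suggested):
    return math.ceil(suggested / 100) * 100

print(list_price(432))  # 500
print(list_price(400))  # 400 (already on a hundred boundary)
```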
  13. Because when you reach level 50 with slicing at 400, you'll be making ~15k an hour. Meanwhile, I can craft ONE item and sell it for twice that in half the time, using only one of my 5 companions. Crafting = late-game money. Slicing = early-game money.
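Put as credits per hour, using the rough figures from the post ("twice that in half the time"):

```python
# Rough credits-per-hour comparison using the post's approximate numbers.
slicing_per_hour = 15_000        # ~15k/hr from slicing at level 50
craft_sale = 2 * slicing_per_hour  # one crafted item sells for twice that...
craft_hours = 0.5                  # ...and takes half the time to turn around
crafting_per_hour = craft_sale / craft_hours

print(crafting_per_hour)  # 60000.0 -- 4x the slicing rate
```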
  14. I don't know what you're complaining about. When I play, my K:D is usually around 25:4 or so, and I'm usually in the top 3 for damage. The Sorc is probably one of the BEST PvP classes.
  15. Wow, I want to play this FP right now. Although yes, insta-gib sounds a bit broken.