Description
Not uploaded yet, time for me to shine.
Jesus H. Christ.
Well, that was an interesting read, but I don’t think we should shitpost about this any further. This discussion went off the rails quite a few posts ago, and the locomotive is currently exploring the depths of the Atlantic.
Nice talking to you though.
I know quite a few Windows fanboys. You made a blanket statement that is patently untrue.
I don’t know, bringing usable computing to the masses without a reliance on a command-line interface is a pretty good thing. Apple wouldn’t have accelerated their pace without a competitor, and I don’t think OS/2 would’ve made it far, despite how promising it was.
You’re quite right. My first forays into Windows began with 3.0, long before Windows 95. Before that it was DOS. Toss some Macs into the mix, and Apple IIcs back in the day. I’ve still got a sealed copy of Windows 95C (OSR 2.5) somewhere, a rarely-seen OEM edition that differed from B simply by having IE4 preinstalled.
But I do.
I welcomed automatic defrag scheduling because in my experience no one defragged their computers. Even from power users I’d heard excuse after excuse about why they wouldn’t. Some of these people were the kind to fine-tune every single aspect of their system, down to hand-picking components of a video card driver for maximum performance. But ask them about system maintenance and the last time they defragged? “Lol, defrag” was the usual answer. It was boggling.
I got rather tired of sitting down at clients’ computers and seeing the hard drive at 70%+ fragmentation.
And Windows 8’s Metro broke UI/UX design 101. Never make a user switch between UIs. I said they made mistakes. Some more well-known than others. I’ll also never stop criticizing some of the other stupid crap they decided to pull with the UI, and continue to pull in Windows 10.
I was astounded by the fact that Shutdown was hidden behind two menus, hidden behind an invisible corner hotspot. That’s not even being obtuse. That’s being intentionally hostile to your users.
No, Vista was pushed out too early. And its poor performance at the time was the result of two things. 1. It was incomplete and unoptimized, slurping down resources it shouldn’t have needed, especially in the GUI subsystem. 2. Microsoft was too lenient on the required system specs for OEMs, allowing them to slap it on systems far below what the real minimum should have been, which made its already unoptimized build even worse.
They took Vista, rewrote the pertinent parts, optimized it as it should’ve been from the beginning, and included some new functionality. And Windows 7 was born. Its kernel is still based on Vista.
Heh, that’s an easy way out of the discussion.
Very relevant, actually. In my experience the majority of system instability and outright errors comes from faulty or badly-written third-party software. Have I encountered problems originating from Microsoft and their design? Of course. But it pales in comparison to how many times I’ve seen a system brought to its knees, if not outright crippled and in need of a reformat and reinstall, because of a buggy piece of crap software.
What you’re not getting here is that I’m not saying Microsoft is faultless. I know the many, many mistakes they’ve made, and I think a TON of them are really stupid (especially in the last few years with the decisions they’ve made with 8/10.) What I’m saying is, despite all the garbage, they don’t deserve the kind of scorn you’d give someone beating you with a red-hot poker.
I can list a ton of sins Apple has committed and continues to commit. My favorite Mac bug of all time happened several years ago, when logging into the guest profile wiped out all data under the main profile on the system. And it was unrecoverable. Apple listed this as a “minor bug.” They were also charging for OS updates, which was a scummy thing to do.
Apple is also refusing to update an out-of-date graphics API, meaning anything that relies on the newer version cannot run. Blizzard encountered this with Overwatch. Despite them and other developers repeatedly asking Apple to implement the updated API, Apple refused. Apple has never given games the respect they deserve and never will; if it isn’t Angry Birds or some other little app off the iTunes store, they couldn’t care less. Faced with the choice of either not developing a Mac port at all or writing two separate games for two separate APIs, one of them downgraded to the other, Blizzard decided not to do a port at all, pissing off a lot of Mac users.
And Linux barely has any market share, so it’s not worth the effort in the eyes of most developers. So claiming Windows is “the best for some things despite better things being out there” is disingenuous.
>It sounds like you’re just parroting popular opinion, no offense.
Well, considering the first words out of my mouth have been “There are no Windows fanboys” and “Everyone knows it’s shit”, you’re quite right, well-spotted. My point was that a lot of people dislike something even if it’s the objectively best choice, and are perfectly justified in doing so.
>Microsoft has made some very big mistakes with different versions of Windows over the years.
What they’ve done right is the shorter answer.
You’ve probably never tinkered with Windows on a professional level since the ’95 days, because most of its horrendousness becomes evident when you do just that. The Vista/7 debacle pales in comparison to the real downhill slide of the system. An average user won’t know about the problems with driver support, or the ass-backwards interface, or the lack of keyboard shortcuts, or automatic de-fragmentation scheduling, or the awful restore functionality, or the downhill shell design, or all the other niggling little things that even the crummiest of sysadmins knows a better solution to.
I still can’t get over the fact they removed the “Move up” shortcut in the shell interface and never put it back in. It’s like blocking the ability to turn left on a steering wheel. How do you even fuck that up?
>The half-baked Vista which was a resource hog (fun fact: Windows 7 is actually a finished Vista, with lots of the problems addressed and an updated Aero.)
I’d go with “re-branded and put out for sale” rather than “finished”. I struggle to think of any differences between 7 and the last Vista service pack. Same with 8 and 10, except that they actually removed a lot of basic functionality and replaced it with spyware. It’s ridiculous. Where’s the middle-mouse-on-desktop to change desktops? Where’s the MacOS dock widget? Where’s the animated desktop? Why can’t I manually re-order files in shell windows like I could with every Windows before Vista? Why are you actively removing useful features? Where are the million billion interface improvements that already exist in other products? Alright, the “minimize all” button in the corner and flick-to-align-windows are great improvements, but is that all you can do? This doesn’t make up for all the things you fucked up, Microflop.
Well, yeah: it’s a discussion for the advanced users, and they already know why.
Irrelevant, actually. In fact, probably the biggest improvement in Vista/7 over XP is that they de-coupled a lot of the driver interface from system-critical functions, so a driver or program error now results in a program crash rather than a BSoD. Usually. That’s how you actually fix the glaring problems, not just pass the blame down the chain.
It sounds like you’re just parroting popular opinion, no offense. Microsoft has made some very big mistakes with different versions of Windows over the years. The half-baked Vista which was a resource hog (fun fact: Windows 7 is actually a finished Vista, with lots of the problems addressed and an updated Aero.) And Windows 8 will go down in history as having one of the stupidest mistakes for an OS ever (dual UIs, primarily catering to tablets versus the traditional desktop.) But Windows is far from “shit.”
You’re not giving any reason why Windows is “crap.” I could name a dozen things that are annoying about MacOS or Linux (and its variants.) EVERY OS has its annoyances and quirks. And lots of the problems people commonly cite for Windows “sucking” are actually user or hardware errors. If you decide deleting System32 is a good idea because you think you need every last byte of your hard drive free for your Solitaire game, or if you think buying the cheapest OEM system at Wal-Mart with 2GB of RAM and Windows 7 is a good idea, yeah, you’re going to have a bad time.
There are millions of system configurations out there, and you’re always going to find something that doesn’t work exactly as it should. And since Microsoft doesn’t control every single piece of software released for the system, you don’t have the same standards applied to everything. A crappily-written program by a crappy dev should reflect on their abilities, not Windows.
Oh come on, you’re missing the point: even if Windows is better than smashing your face across the DOS prompt and an objectively better choice than Mac OS or the inscrutable Linux distros for many tasks… it may be better, but it’s not good.
It’s like saying asthma is better than lung cancer. You’re technically right… but you wouldn’t want either one.
Basically: Windows is asthma. Not quite cancer.
Same goes for many console manufacturers, software companies, game studios, etc., etc. It’s all relative, but that doesn’t mean the lowest bar of acceptable quality is suddenly great.
Sony and Microsoft had mature online platforms for two generations before Nintendo actually fully committed. Being free has nothing to do with it, nor is it completely free; Microsoft still charges for Xbox Live Gold. The Dreamcast required a dial-up ISP to play online; it wasn’t until later that Sega decided to charge a monthly fee for access to their own ISP and servers. Sega had been struggling for multiple systems. The Sega CD was a flop. The 32X was a flop. The Nomad wasn’t widely adopted. The Saturn was a flop. The Dreamcast was their last effort to remain competitive in the market, and two things primarily killed it. 1. Barely any advertising, especially post-launch, for the system or the games. 2. Hype for the upcoming PS2.
It wasn’t because “the internet wasn’t free.”
Nintendo’s had plans for online connectivity as far back as the SNES (the system received a Japan-exclusive add-on called the Satellaview, a satellite-based modem that allowed users to download content and browse certain things.)
The 64DD also included a modem and did have internet connectivity for the short time it was active. It also cost a separate monthly fee.
The main reason Nintendo didn’t adopt online connectivity sooner was that the president of Nintendo thought the internet was a fad, and said he wasn’t interested in it. So their plans were touch and go. Some info would come out, then nothing would ever come of it.
Nintendo’s delayed acceptance only hurt them. Sony and Microsoft have their own dedicated networks with their own servers, and each publisher has its own servers. Compare that to Nintendo, who relied solely on GameSpy’s network to host ALL online play for their systems prior to the Wii U. And now, after GameSpy’s shutdown, every single online-capable Wii and DS game can no longer be played online. Nintendo shut down the entire WiiConnect24 service. You can’t even send or receive emails through the Wii anymore.
You’re assuming no one likes Windows. You think it’s shit? You weren’t around when command-line interfaces were the norm. You double-click an icon to start something. You don’t have to type “C:\directory\subdirectory\program.exe -commandlinearguments” to run something. You don’t have to fiddle with creating autoexec.bats to change your memory use for specific games that required more extended memory. You don’t have to mess with IRQs.
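For anyone who never had to live with it, here’s a rough sketch from memory of the kind of CONFIG.SYS/AUTOEXEC.BAT juggling a single game could demand (the driver paths and the Sound Blaster settings here are just illustrative, not from anything in this thread):

    REM CONFIG.SYS - load the memory managers, enable upper memory blocks
    DEVICE=C:\DOS\HIMEM.SYS
    DEVICE=C:\DOS\EMM386.EXE RAM
    DOS=HIGH,UMB
    FILES=30
    BUFFERS=20

    @ECHO OFF
    REM AUTOEXEC.BAT - load TSRs high to keep conventional memory free for the game
    PATH C:\DOS;C:\
    LH C:\DOS\MOUSE.COM
    SET BLASTER=A220 I5 D1

Get one line of that wrong and the game would simply refuse to start, usually with a cryptic out-of-memory error.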
Xerox created the first working GUI, Apple copied it, and Microsoft copied from them, further refining it as they went. Versions like Vista and 8 are crap, sure. But that’s due to stupid decisions on Microsoft’s part, especially concerning the UI with 8. On the whole, Windows made using computers far easier.
If you think Windows is “the worst”, obviously you don’t have much experience with other OSes.
That’s all true.
In defense of the haters though, you can still dislike something even if there are worse things out there. I mean, look at Windows: there are no Windows fanboys. It’s shit. It’s always been shit. Everyone knows it’s shit, and (almost) everyone uses it because it’s the best (and sometimes the only) way to get things done on a PC. It’s the worst but, unfortunately, it’s the best.
@Beau Skunky
>To be fair, they also created things that eventually their competitors did, as well. (Like motion control.)
That’s one invention we could have done without.
To be fair, they also created things that their competitors eventually did as well. (Like motion control.) As for online gaming, I’m kinda glad they waited until it got more advanced, so it works much better, and we can do it for free now. (Provided we have our own Wi-Fi and our own internet service.) If you look back at early online gaming, it was rather clumsy & costly on consoles like the Sega Dreamcast. (One of the reasons Sega had to drop out of the console market, as it proved too costly for them.)
Also, Nintendo actually had plans for early online systems, like with the 64DD, and at one point the GBA as well. So they’ve been testing the waters with the internet for quite a while, despite taking a long time to get into the online gaming scene.
They also took Galoob to court over the Game Genie, and lost.
I’m a huge Nintendo fan, but they’ve done a lot of crappy, anti-consumer things over the years. And I grew fed-up with their unwillingness to support online anything because the president of Nintendo in Japan thought online functionality was a passing fad, and he wasn’t interested in it.
They’re so slow adopting these things that by the time they actually do something, their competitors have already been doing it for years. Online play? Downloadable titles? Accounts connected to actual accounts instead of to the console hardware?
When they finally started doing downloadable demos, they even did it in a stupid manner, putting titles up for download for only two weeks before pulling them.
Re: Rental stuff. Blockbuster also rented out Virtual Boys. And there was another title that was almost exclusively rental: Indiana Jones and the Infernal Machine. That one was a little different, tho. Since it was so close to the end of the system’s life, LucasArts felt that a retail release would be a waste of money. So they took preorders for it via their own site and partnered with Blockbuster who also offered preorders on their site. This was the only way to obtain a new copy of the game. It never hit retail shelves. Anyone who didn’t preorder it only got to play it by renting it from Blockbuster or buying a used copy from them when they cleared out old stock.
Still, they’re nowhere near as bad as companies like Konami or EA who treat their employees & customers like crap.
Nintendo’s not perfect (they’ve made some mistakes), but overall they’ve done more good for the gaming industry than bad. Plus they abandoned their old practices, so I see no reason to hold a grudge. (Not like Sony & Microsoft are saints either. Especially with the early Xbox One controversies.)
One thing I will admit was jerkass on their part was when they tried to make it illegal in America to rent video games. (Like it was in Japan, and some other countries.) They even tried to hinder the rental market by limiting the number of games people could buy, and sued Blockbuster Video for making photocopies of their manuals.
Ironically, they later actually started to support rentals, like when they allowed Blockbuster to rent out entire N64 consoles, and allowed some rental-only games to exist. (Like “Clayfighter 63: Sculptor’s Cut.”)
Ex-NOA prez Howard Lincoln (not to be confused with the Nintendo Fun Club’s Howard Phillips) also tried to deflect the “violence/sex in games” controversy of the ’90s by shifting media blame to Sega’s “Night Trap,” which was rather underhanded since, according to its development team, Nintendo, and especially Howard Lincoln, was the one who originally supported said game, and it was originally going to be for the (cancelled) SNES CD-ROM upgrade.
Got to say, I’m glad he’s out of NOA, and we got Reggie instead now.
It’s been answered in more detail already, but the short version is: Nintendo is excellent at making games and consoles, and awful with their underhanded and abusive business practices.
That’s not how Nintendo worked in the NES and SNES days. Becoming an official developer meant signing a contract that stated you would purchase the carts from Nintendo. The resentment this caused due to Nintendo double-dipping is well-known, and well-documented. Books have even been written that go into Nintendo’s business practices, both good and bad.
They may have had some special deal with Rare, being Rare, but that absolutely was not the norm for third-party developers.
Actually, according to Grant Kirkhope (music composer of “Banjo-Kazooie”), Nintendo actually paid THEM for each game cartridge made, so game developers still profited even on unsold games. So it’s actually the opposite: Nintendo was quite generous to their second & third-party developers.
It’s a common misconception that E.T. led to the video game crash of ’83. It didn’t. The market was already going downhill before it was even released. The problem was oversaturation of terrible games. Companies were trying to make a quick buck off the craze, and released whatever garbage they could. Eventually consumers had enough, and decided they wanted nothing more to do with video games. The E.T. game certainly contributed, but it was not the sole cause of the crash, like so many people claim.
This is partially why Nintendo enforced quality standards with their games, and required each officially-licensed one to be reviewed and get their stamp before release. The other reason was that this let them profit off the third-party developers and games (requiring them to sign contracts that forced them to purchase the carts from Nintendo.) It also let them limit how many games per year could be released per developer, or shift around release dates (“Mario 2 will be released in the same month as your game. Move your game back a month.”) This led to some developers spinning off labels that were “technically” not the same developer, letting them get around the annual limit. That’s how companies like Ultra Games came out of Konami, effectively doubling the number of games they could release per year.
This would later come back to hurt Nintendo in the Nintendo 64 era, when developers, fed up with Nintendo profiting off them by charging for each copy manufactured, and disappointed with the decision to stick with a cart-based format (itself chosen partly to keep developers purchasing from them, and to fight piracy), decided to jump ship and go to competing systems like the Saturn and PlayStation.
But yes, lots of people today don’t give Nintendo the respect it deserves. Without them, the video game market as you know it today either wouldn’t exist, or would exist in a vastly different form.
Yeah, that’s what I was referring to, actually. Nintendo helped make video games popular again and gave them more of a mainstream place in pop culture.
I know that your comment is old, but I would also like to point out that Nintendo pretty much saved the video game industry. The infamous E.T. Atari game had almost killed it, but Nintendo stepped in with the NES and made video games popular again. So basically, if it weren’t for Nintendo, there probably wouldn’t be any more video games in general. No Sonic, Duke Nukem, Dark Souls, etc.
Yep, Nintendo did invent/popularize the D-pad for games. Before that people used awkward, huge Atari joysticks and such.
They also later popularized the smaller, thumb-sized, pressure-sensitive analog sticks for controllers with the N64, as well as controller accessories like the Rumble Pak. (Which other consoles adopted afterwards.)
My point wasn’t that Nintendo is “the best” (that’s entirely up to personal opinion), but that they contributed A LOT to gaming, and to the way we game nowadays.
And let’s not forget that the d-pad as we know it came from-
And the idea of the DS even.
I hate to sound like a fanboy, but I honestly don’t get all the Nintendo bashing. I can understand preferring other consoles/games, but Nintendo’s been making games and consoles since long before Sony & Microsoft (and basically saved/revived the gaming industry from the “1983 market crash,” and such), so I think some gamers should be a li’l more respectful of Nintendo. Plus, they’re the only console-maker that actually makes some of their own games for their consoles. (NOT bashing Sony & Microsoft, mind you.)
It’s especially ironic when Sony fanboys bash Nintendo, because if it weren’t for them, the PlayStation never would’ve been made. (It was originally going to be a CD add-on for the SNES.)
Like ‘em or not, Nintendo’s still a big part of gaming culture, has contributed a lot, and has inspired good aspects of Sony & Microsoft’s systems.
Hope I don’t start an argument over Nintendo by posting this… Just my 2 cents, since I get sick of the fanboy bashing.
I didn’t mean it as something against you, just pointing out that it was posted. Anyway, you got more of the info from the Twitter page, whereas the other was from the page but was just the image.
I just wanted to thank you for siding with me, that is all.
Lots of substance. They focus first on enjoyable gaming rather than trying to cram all other media into their system. Sony makes a balance of the two, and Microsoft tries to force so much other media (mostly social stuff) into its stuff that it tends to mess up and have to be corrected by others.
Nintendo only knows how to look good. What substance do they have anymore?