Funny thing is, no developer has actually said the tools suck or that the tools are a pain. Many have said eSRAM is a pain, and from there this idea about the tools being bad has taken on a life of its own. First it was "the final API (which has since morphed from 'API' to 'tools') hasn't been added to XBO yet"; then Whitten stated via Major Nelson's podcast that they had pushed their "mono driver" (read: their XBO API that would replace any "broken" API they were using previously), which is their low-level, thin API, out to developers. From there more sites started talking about these "tools" that supposedly aren't as good as Sony's (the vast majority of said speculation coincidentally started after the consoles launched and those pointless graphics comparisons began surfacing, but I digress). This got run through the banana phone and went from "tools are very bad" to "tools are just broken." Meanwhile, you can find very little information on which "tools" exactly are "broken." Their debugger? Their IDE? Their compiler? Their code editor? (This is largely rhetorical, since MS's IDE/tooling is the Visual Studio suite and it's anything but "broken.") Their Xbox One API? That wouldn't make any sense given what MS themselves have told us.
In case people forgot what MS stated about their XBO API. "Since E3, an example is that we've dropped in what we internally call our mono driver. It's our graphics driver that really is 100 percent optimised for the Xbox One hardware. You start with the base [DirectX] driver, and then you take out all parts that don't look like Xbox One and you add in everything that really optimises that experience. Almost all of our content partners have really picked it up now, and I think it's made a really nice improvement."
Neither Nvidia nor Intel would be privy to any knowledge about the Xbox One development environment that would let them make such a statement about Xbox One and DirectX 12. Not that it would matter anyway; it's clear Nvidia and Intel are talking about PC when it comes to DirectX 12, because PC is the platform that suffers from underutilization of CPU resources relative to GPU resources, thanks to overhead that doesn't exist on consoles (a single fixed hardware target versus a platform with a myriad of configurations on top of driver overhead out the ass). Why do you think that during the DX12 discussion panel at BUILD, MS barely said anything about how DX12 relates to XBO and instead spent 99% of their time talking about the savings it brings to PC development? Hell, at one point they even directly compare DX12 to a console API, because that is the goal of DX12: bring a low-level API with smaller overhead to PC, making it easier to get power out of the PC platform and to port games between XBO and PC. During the presentation MS notes that much of what is new in DX12 was already present in XBO's API. XBO will benefit from bundles (Crytek rolled their own solution for Ryse, IIRC), but bundles aren't doubling console performance; developers really aren't complaining about horrible CPU performance on XBO; nor is CPU performance the reason XBO versions of games aren't hitting 1080p like PS4 versions (both use the exact same CPU, down to the number of cores available).

Incidentally, right after BUILD all kinds of hilarious hogwash about a 2x graphics increase surfaced from all corners of the woodwork (yes, I'm referring to Wardell's article over at Neowin). People looking for confirmation bias championed his article (which he has pretty much backed away from, if you follow him on Twitter); throw in a few developers who were completely baffled by said article and the whole thing becomes a lulz event.
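For anyone unclear on why bundles save CPU time at all: a bundle lets you record a group of draw commands once (paying the driver's validation/translation cost at record time) and then replay them every frame for almost nothing. Here's a toy Python sketch of that idea; it is not real D3D12 code, just made-up stand-in functions and numbers to show where the savings come from.

```python
# Toy model of the DX12 "bundles" concept (hypothetical stand-in code,
# not the real D3D12 API): compare re-validating every draw call each
# frame vs. validating once at record time and replaying the bundle.

DRAWS_PER_FRAME = 1000
FRAMES = 60

def validate(draw):
    """Stand-in for per-call driver validation/translation overhead."""
    return ("validated", draw)

# Naive path: every draw call goes through validation every frame.
naive_cost = 0
for _ in range(FRAMES):
    for d in range(DRAWS_PER_FRAME):
        validate(d)
        naive_cost += 1

# Bundle path: pay the validation cost once while recording...
bundle = [validate(d) for d in range(DRAWS_PER_FRAME)]
bundle_cost = DRAWS_PER_FRAME
# ...then replay the pre-built command list each frame with no re-validation.
for _ in range(FRAMES):
    for cmd in bundle:
        pass  # replay is just submission, no per-call CPU work

print(naive_cost)   # 60000 validations
print(bundle_cost)  # 1000 validations
```

That's a CPU-side saving only, which is exactly why bundles alone can't double graphics performance: the GPU still does the same rendering work either way.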
I mean, really, if DX12 was going to provide a 2x graphics performance increase, you'd think MS would have mentioned it...but they conveniently forgot to for some reason. They stated there would be a performance increase, but not once did they quantify it, and for good reason: adding a few CPU-targeted improvements to your API doesn't magically erase a hardware deficit (fewer ROPs, ALUs, and shader units, you know, the things that actually do directly affect resolution). Nvidia and Intel made their statements because things like Mantle/RyuJIT/DX12 are doing what consoles have done since forever: low-level hardware access, culling unnecessary overhead, better CPU usage patterns, better communication between CPU and GPU, etc. All of that is a staple of the console environment by default.
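The core of that argument can be put in one line of arithmetic: CPU and GPU work overlap, so frame time is gated by whichever side is slower. A CPU-side API win only helps if the CPU was the bottleneck. A minimal sketch, with numbers invented purely for illustration (they are not measurements of any real console):

```python
# Hedged toy model: frame time is bounded by the slower of the CPU and
# GPU per-frame workloads, since they run in parallel.

def frame_time_ms(cpu_ms, gpu_ms):
    # The frame can't finish until both sides are done.
    return max(cpu_ms, gpu_ms)

# A GPU-bound frame: GPU takes 16 ms, CPU only 8 ms.
before = frame_time_ms(8.0, 16.0)

# Halve the CPU cost (the kind of win a thinner API can deliver)...
after = frame_time_ms(4.0, 16.0)

# ...and frame time doesn't move, because the GPU was the bottleneck.
print(before, after)  # 16.0 16.0
```

Resolution is a GPU-bound cost (ROPs, ALUs, fill rate), which is why a CPU-overhead fix like DX12 doesn't close a resolution gap between two consoles that share the same CPU.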