If you're chasing 144 Hz, the simplest lever is render resolution: drop from 1080p (or 1440p) down to 720p. 720 lines is two-thirds of 1080 and exactly half of 1440, so every frame carries far fewer pixels and the frame-time target gets much easier to hit. Personally, I'd rather game like a sensible grown-up on a big HDR TV with high peak brightness than on a half-height ultrawide "gaming" monitor at twice the price; for the same money the TV's picture quality wins. Here's a config file I use myself, with the caveat that it's me tuning by eye for what "looks nice", not anything measured. And the hardware really is capable: DSLR cameras have recorded high-frame-rate 10-bit raw video since the early days of digital photography using a tiny sliver of silicon, and a modern AMD GPU is vastly more powerful than that, so there's no reason it can't handle 10-bit HDR too. You just have to type the settings in and turn them on. You can even real-time encode HEVC video of your gameplay while you play.
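To put numbers on the resolution/refresh trade-off above, here's a quick sketch. Plain arithmetic on standard resolutions, nothing vendor-specific:

```python
def pixel_rate(width, height, hz):
    """Pixels the GPU must push per second at a given resolution and refresh rate."""
    return width * height * hz

rate_1080p = pixel_rate(1920, 1080, 144)  # 1080p at 144 Hz
rate_720p  = pixel_rate(1280,  720, 144)  # 720p at 144 Hz

# 720p carries only 4/9 the pixels of 1080p, so per-frame work drops by more than half.
print(rate_720p / rate_1080p)  # ≈ 0.444
# And 720 lines is exactly half the vertical resolution of a 1440p panel.
print(720 / 1440)              # 0.5
```

The refresh rate cancels out of the ratio, which is the point: the pixel count, not the Hz, is what you're trading away.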
The encouraging part is that vendors can't fully cripple the video path: if they did, YouTube, Netflix, Blu-ray, broadcast TV and webcam feeds would all break for everyone, so decode and display have to keep working even with AMD features like TrueAudio Next, Infinity Fabric, Smart Access Memory and the large caches all enabled. One trick I use: play or upload content at a higher resolution tier, because streaming services assign bitrate by tier. The same 1080p source served under a higher-resolution tier can come with a better audio bitrate, so the audio sounds noticeably better even though the track itself hasn't changed. The hardware has the headroom; clusters of PS3-era Cell chips genuinely ranked among supercomputers in their day, and modern Zen cores are far beyond that. I also calibrate gain, saturation, contrast, brightness and white balance myself using photography-style terms and modes, and I prefer TV-broadcast color spaces (BT.709/BT.2020), which in my experience are handled more consistently than the PC-side defaults from Microsoft, Nvidia and Intel. I also like a gamma level of 3.7 with a gamma ramp of 1.9, though that's purely to taste; standard SDR gamma is around 2.2.
Modern CPUs and GPUs already ship the hardware DRM paths needed for protected 4K streaming (Netflix and the like), so you don't need to worry about that side.

After that, you can make a text file, rename it config.ini, and enter a dynamic-range target. The human eye is often credited with around 20 stops of adaptive dynamic range, while the best Sony DSLR sensors claim roughly 14-15 stops in S-Log, and compressed streaming still looks rough next to an uncompressed raw-quality render of a game, even on a phone or a 5700 XT.

In Adrenalin, open the settings cog (top right) and go to the global Graphics page. Scroll to the bottom and expand the small "Advanced" section, then set the pixel format to Full RGB 4:4:4 at 10-bit (12-bit scRGB if your whole chain supports it) and enable 10-bit display panel output. Consider disabling HDCP if your PC has no Blu-ray drive: hardly any streaming content requires it, it mostly just adds handshake failures, and (contrary to what I said before) it doesn't meaningfully eat link bandwidth.

On bit depth: panels are specified per channel, and the R, G and B channels multiply together, so 8-bit is 2^24 ≈ 16.7 million colors and 10-bit is 2^30 ≈ 1.07 billion. Common estimates put human color discrimination in the range of a few million to around ten million colors, so a 10-bit panel covers essentially everyone; the rare exception is tetrachromat women, who can carry a fourth cone pigment via a variant gene on the X chromosome, and even their extra discrimination is subtle. The practical payoff: overlays and reshades that quantize everything to 8-bit's 0-255 range will now feed the display the full 0-1023 range of 10-bit values instead.
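The per-channel bit-depth arithmetic above works out like this (pure math, no hardware assumptions):

```python
def total_colors(bits_per_channel):
    # R, G and B each get 2**bits discrete levels; the total palette
    # is the product of the three channels' level counts.
    return (2 ** bits_per_channel) ** 3

print(total_colors(8))   # 16777216      (~16.7 million)
print(total_colors(10))  # 1073741824    (~1.07 billion)
print(total_colors(12))  # 68719476736   (~68.7 billion)
```

Going from 8-bit to 10-bit multiplies the palette by 2^6 = 64, which is why banding in smooth gradients all but disappears.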