Post by itsnoot on Sept 14, 2017 22:03:18 GMT -5
24 vs 60 was only a debate because of the mismatch between 24fps film source material and 60Hz displays. In that circumstance frames have to be repeated in an uneven 3:2 cadence to fill the refresh cycles, which made fast-paced scenes appear choppy and the action difficult to follow. Panels capable of displaying 24fps natively were very desirable and carried a premium a few years ago. No idea about today, but everything new is digitally sourced, so it's less of an issue, I'd think. Bottom line: action scenes from a 24fps source become smooth and easy to follow when played natively, even at that low a refresh rate.
The 720 vs 1080 difference does become negligible at a given distance and screen size. There's a graph floating around the web that shows it; it's something like a 50-inch panel at about 10 feet. Personally I think that only accounts for raw resolution and doesn't take things like aliasing or super-fine background detail into account. The same logic should apply to 4K, but there the difference is obvious at distances beyond 10 feet.
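To make the cadence point concrete, here's a rough sketch (my own illustration, not anything from the thread) of how 24fps source frames map onto a 60Hz refresh versus a 120Hz one; the function and its integer math are just an idealized model of pulldown.

```python
# Idealized pulldown: how many refresh cycles each 24fps film frame occupies
# on a given panel. On 60Hz the cadence alternates 2 and 3 refreshes (judder);
# on 120Hz every frame gets exactly 5 refreshes, so motion stays even.

def pulldown_pattern(source_fps, refresh_hz, frames=8):
    pattern, shown = [], 0
    for i in range(1, frames + 1):
        target = (i * refresh_hz) // source_fps  # refreshes elapsed after i source frames
        pattern.append(target - shown)
        shown = target
    return pattern

print(pulldown_pattern(24, 60))   # [2, 3, 2, 3, 2, 3, 2, 3] -> uneven 3:2 cadence
print(pulldown_pattern(24, 120))  # [5, 5, 5, 5, 5, 5, 5, 5] -> even, no judder
```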
Post by Deleted on Sept 15, 2017 11:37:33 GMT -5
(Quoting itsnoot's post above.) I remember that. The feature on TV sets was called "24p". The tech advances so rapidly that I wouldn't be surprised if 24p has already become obsolete and they're now using processing and algorithms to display 24fps film sources smoothly on 60/120/240Hz TVs.
Post by Coolverine on Sept 15, 2017 12:06:42 GMT -5
(Quoting an earlier post in the thread:) "I remember when the console lamers used to claim that they couldn't tell a difference between 720p and 1080p, or between 24fps and 60fps."

"They couldn't tell a difference between 720p and 1080p": I remember a Best Buy employee telling me that once and thinking to myself how untrue it was. Well, they could be right in extreme cases, such as gaming on a very small 20" TV from 10 feet or more away, lol. I always thought the difference between 720p and 1080p was significant, easily noticeable, and a huge upgrade in image quality.

"Or between 24fps and 60fps": Yeah, that's baloney (if you're normal)... It's possible the people who've made this claim have something wrong with the visual processing centers of their brains. People's brain activity cycles at different speeds from one person to another for biological reasons (genetics, disease or brain injury, age, etc.). Perhaps it's akin to color blindness (from their point of view, there really is no difference between reds and greens), but in this case some people's visual systems can't work fast enough to detect a difference between 24fps and 60fps. This topic reminds me of the movie "Dredd" and the scenes with the "Slo-Mo" drug, lol.

Fun fact about that scene: the song while she's falling was inspired by slowing down Justin Bieber's "Baby" by 800%. As for the whole resolution/framerate thing, I think there is now definitely a more noticeable difference between 1080p and 4K than there was between 720p and 1080p. I played GTA V at 4K and didn't even have to enable AA; it also looks a lot sharper and better than 1080p. I'm not sure why they went from calling it 1080p to calling it 4K, when they're just referring to the horizontal resolution, which technically isn't even 4K.
Post by itsnoot on Sept 15, 2017 12:54:04 GMT -5
(Quoting the 24p exchange above.) I didn't even realize until I was reading your post that 120 and 240 are common multiples of 24 and 60, haha. I haven't seen a panel under 120Hz in a loooong time. I bet the 24p desirability dropped off when 60Hz panels did.
Post by Deleted on Sept 15, 2017 14:08:12 GMT -5
"they couldn't tell a difference between 720p and 1080p" I remember a Best Buy employee telling me that before and thinking to myself how untrue it was. Well, they could be right in extreme instances... such as, if you're gaming on a very small 20" TV and distance between your eyes and the screen was like 10ft (or greater). lol I always thought the difference between 720p and 1080p was significant and easily noticeable, and a huge upgrade in image quality. "or between 24fps and 60fps" Yeah, that's bologna (if you're normal)... It's possible the people who've made this claim have something wrong with the visual processing center in their brains. Peoples' brain activity cycles at different speeds, from one person to another, for biological reasons (genetics, disease/brain injury, age, etc.). Perhaps it's akin to color blindness (from their point of view, there really is no difference between reds and greens)... but in this case, some peoples' visual systems are incapable of working fast enough to detect a difference between 24fps and 60fps. This topic reminds of the movie "Dredd" and scenes w/ the "Slow mo" drug. lol Fun fact about that scene, the song while she's falling was inspired by slowing down Justin Bieber's song "Baby" by 800%. As for the whole resolution/framerate thing, I think now there is definitely more of a noticeable difference between 1080p and 4K than there was between 720p and 1080p. I played GTA V at 4K and I didn't even have to enable AA. Also looks a lot sharper and better than 1080. I'm not sure why they went from calling it 1080p to calling it 4K, when they're just referring to the horizontal resolution which technically isn't even "4K". I'd wager that the term "4K" was invented by marketing people. I'm sure they figure it's a simpler term that we consumer peons can easily remember and get hyped up about.
Post by Coolverine on Sept 15, 2017 14:27:22 GMT -5
They could've just called it "2160p" and people would've still understood that it's twice as much as 1080p.
Post by Deleted on Sept 15, 2017 14:31:26 GMT -5
(Quoting the 24p/120Hz exchange above.) I hadn't thought about it before either... 120 and 240 are both multiples of 24 and of 60 (120 = 5 x 24 = 2 x 60, and 240 = 10 x 24 = 4 x 60), so a 120Hz or 240Hz panel can show 24fps film by holding each frame for a whole number of refreshes. 60 itself isn't a multiple of 24 (60 / 24 = 2.5), which is why plain 60Hz sets needed the uneven pulldown in the first place. Breaking the numbers down, 60, 120 and 240 share the prime factors 2, 3 and 5, and 60 is the greatest common factor of all three.
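As a quick illustration of that (my own sketch, not from the thread), a few lines of Python show which refresh rates divide evenly by both 24 and 60, and that 120Hz is the lowest rate that handles both:

```python
# Which common refresh rates are whole multiples of 24fps film and 60fps video?
from math import gcd

def lcm(a, b):
    return a * b // gcd(a, b)

for hz in (60, 120, 240):
    print(hz, "Hz -> multiple of 24:", hz % 24 == 0, "| multiple of 60:", hz % 60 == 0)

print("Lowest refresh rate that fits both evenly: lcm(24, 60) =", lcm(24, 60))  # 120
```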
Post by Deleted on Sept 15, 2017 14:43:05 GMT -5
(Replying to Coolverine's "2160p" comment above.) You're right, they could have. But they didn't. Maybe my last wager (my guess for why they went with "4K") was wrong. Rather, it could be that the marketing goons, er, gurus simply liked the sound of "4K" and rolled with it. Who knows, really? The people who decided it are corporate elites and we're just average Joes.
Post by Babel-17 on Sept 15, 2017 15:15:45 GMT -5
"they couldn't tell a difference between 720p and 1080p" I remember a Best Buy employee telling me that before and thinking to myself how untrue it was. Well, they could be right in extreme instances... such as, if you're gaming on a very small 20" TV and distance between your eyes and the screen was like 10ft (or greater). lol I always thought the difference between 720p and 1080p was significant and easily noticeable, and a huge upgrade in image quality. "or between 24fps and 60fps" Yeah, that's bologna (if you're normal)... It's possible the people who've made this claim have something wrong with the visual processing center in their brains. Peoples' brain activity cycles at different speeds, from one person to another, for biological reasons (genetics, disease/brain injury, age, etc.). Perhaps it's akin to color blindness (from their point of view, there really is no difference between reds and greens)... but in this case, some peoples' visual systems are incapable of working fast enough to detect a difference between 24fps and 60fps. This topic reminds of the movie "Dredd" and scenes w/ the "Slow mo" drug. lol Fun fact about that scene, the song while she's falling was inspired by slowing down Justin Bieber's song "Baby" by 800%. As for the whole resolution/framerate thing, I think now there is definitely more of a noticeable difference between 1080p and 4K than there was between 720p and 1080p. I played GTA V at 4K and I didn't even have to enable AA. Also looks a lot sharper and better than 1080. I'm not sure why they went from calling it 1080p to calling it 4K, when they're just referring to the horizontal resolution which technically isn't even "4K". 2160p is 4 times the pixels of 1080p, 1080p is 2.25 times the pixels of 720p (Both 1080p and 720p are considered HD). IIRC calling 1920 x 1080 2k, and 3840 x 2160 4k, was just due to a bit of laziness. Those resolutions were just the TV versions of the ones professional video people use. Fake edit: I searched for the answer. www.extremetech.com/extreme/174221-no-tv-makers-4k-and-uhd-are-not-the-same-thing
Post by Coolverine on Sept 15, 2017 15:25:26 GMT -5
I was thinking they meant the horizontal resolution being 3,840 pixels, close to 4,000. It makes more sense the way you explained it. Does that mean the 2560x1440 and 2560x1600 monitors still have a substantially higher pixel count than 1920x1080?
Post by Deleted on Sept 15, 2017 15:57:43 GMT -5
(Quoting Babel-17's reply above, ExtremeTech link included.) Regardless of the technical differences, they're still slapping "4K" on the boxes of TVs that are only UHD; read just a bit further down in your link... Laziness? How much harder would it be to use the correct label of UHD instead of a technically false label of 4K? It's a one-character difference, for goodness' sake.
Post by Babel-17 on Sept 15, 2017 16:34:59 GMT -5
(Replying to Coolverine's question above.) Yeah, 2560x1440 has four times the pixels of 1280x720. It's a sweet gaming resolution for titles that need a good amount of "oomph". If a game has decent post-processing AA (not too blurry), it's pretty much jaggy-free for me on my Sony 43" XBR800D; its native resolution is 3840x2160, but it has an excellent scaler. Compared to 1920x1080, 2560x1440 has about 77% more pixels. Compared to 2560x1440, 3840x2160 has 2.25 times as many pixels, a bridge way too far for my GTX 1070 with newer AAA titles.

Off topic, but I'm amazed at how well optimized the F.E.A.R. series is. I finally bought it, heavily discounted, and (being almost too scared a few times aside) I can play it at 3840x2160 with everything maxed, including 4x AA, plus MFAA and transparency supersampling enabled in the nVidia Control Panel. GPU usage averages well under 50%, even as the card throttles back its clock speed, which is normal under low loads; I measured with Afterburner. It took me a while to find the fix for my framerate plummeting for no good reason: just disable the HID-compliant devices in Device Manager. It turned out to be a fairly well-known bug, with a well-known fix. Very nice-looking game overall, thanks to the way the lighting and other tricks were implemented. Lots of fun trying to outwit the AI.
Post by Deleted on Sept 15, 2017 16:45:39 GMT -5
(Also replying to Coolverine's question.) If the pixels on a monitor form a grid (and I'm fairly certain they do), then to calculate the total pixel count all you need to do is multiply horizontal pixels by vertical pixels.

2560x1440 = 3,686,400
1920x1080 = 2,073,600

A simple example is the game of tic-tac-toe: it's a 3x3 grid, and 3 x 3 = 9 squares on the board.
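To put the grid idea into code (a throwaway sketch of my own, using the resolutions discussed in this thread):

```python
# Total pixels = horizontal pixels x vertical pixels, plus each mode's ratio to 1080p.
resolutions = {
    "720p  (1280x720)":  (1280, 720),
    "1080p (1920x1080)": (1920, 1080),
    "1440p (2560x1440)": (2560, 1440),
    "2160p (3840x2160)": (3840, 2160),
}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:>9,} pixels ({pixels / base:.2f}x 1080p)")

# 720p comes out to 0.44x 1080p (i.e., 1080p is 2.25x 720p),
# 1440p is ~1.78x 1080p, and 2160p is exactly 4x 1080p.
```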
Post by Babel-17 on Sept 15, 2017 16:45:52 GMT -5
(Quoting the UHD-labeling complaint above.) Sure, but movie theaters bill those screens as "4K", so it's not very unreasonable for TV manufacturers not to want to quibble over a difference very few people know about, and even fewer care about. You see The Martian in a 4K theater, now you want the 4K Blu-ray (www.blu-ray.com/movies/The-Martian-4K-Blu-ray/147430/), so it's not surprising that the UHD TV you watch it on is also called 4K. So I guess it's about the consumer, not about marketing refusing to make an effort to be accurate. It's related to why we say "LED TV" when the set is still an LCD, just with an LED backlight instead of a CCFL one. The consumer has heard about "LED TVs" and wants one, just like they want to watch the movie they saw at the theater in 4K.
Post by Deleted on Sept 15, 2017 16:54:30 GMT -5
(Checking Babel-17's 77% figure.)
2560x1440 = 3,686,400
1920x1080 = 2,073,600
(3,686,400 - 2,073,600) / 2,073,600 = 0.77777777777777778
Yeah, looks right.
Post by Babel-17 on Sept 15, 2017 16:59:46 GMT -5
(Quoting the pixel math above.) I have to do that sometimes too; it's like an itch to crunch the numbers. nVidia lets you add custom resolutions, so I broke out Calculator to add some in between the common ones, making sure they stuck to 16:9. 3200x1800 was one.
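For anyone who wants to skip the calculator, here's a tiny sketch (my own, with an arbitrary step size) that lists true 16:9 resolutions between 1920x1080 and 3840x2160, including the 3200x1800 mentioned above:

```python
# Widths stepped in multiples of 16 keep height = width * 9 / 16 a whole number.
# The 160-pixel step is arbitrary; it just keeps the list short.
for width in range(1920, 3840 + 1, 160):
    print(f"{width} x {width * 9 // 16}")  # includes 2560 x 1440 and 3200 x 1800
```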
Post by Deleted on Sept 15, 2017 17:06:18 GMT -5
(Replying to the custom-resolution post above.)
size43.com/jqueryVideoTool.html
pacoup.com/2011/06/12/list-of-true-169-resolutions/
Post by itsnoot on Sept 15, 2017 17:18:32 GMT -5
Found this video showing Destiny 2 running on a 560 Ti. Reading the comments, it looks like the 2GB version might be the saving grace here. I remember when I bought mine, the talk at the time was that the 2GB was a waste and the GPU didn't have the power to drive it... I bought it anyway, lolol.

A friend is going to inherit my 560 Ti, though; I bought a 1060 3GB for the second box this morning. I practically stole it on an apparent Fry's pricing error, snagging the EVGA SSC model for the low-dough price of $219. HI-OH! Fry's has corrected their pricing to $270 today, so don't bother. I had been poking around to see what was out there, and when I saw that pop up I jumped. Too good to pass up; I can eBay it if nothing else.

Side note: miners sucking up cards are really driving me crazy. Card prices are ridiculous right now.
Post by Coolverine on Sept 15, 2017 20:30:56 GMT -5
I've even seen 2560x1440 and 2560x1600 called "4K" too.

I was thinking about getting a new monitor at one of those resolutions; I've heard 2560x1440 is the sweet spot for the GTX 1070. I kinda want a 16:10 monitor again, but I think 16:9 is better for FPSes and action-type games. Right now I have dual monitors, a 23" and a 25", both 1920x1080. I might also just get another 23" 1080p monitor and put it to the side, with the 25" in the middle. I've been wanting to try Nvidia Surround; I tried it with just 2 monitors, but it puts the middle of the action right between them, which is no good.
Post by itsnoot on Sept 15, 2017 21:48:28 GMT -5
I have the U2415 and it's a beautiful display. I wish I could swing 2 more.
Post by Deleted on Sept 20, 2017 11:59:16 GMT -5
(Quoting itsnoot on the U2415.) The current $229 price on Amazon is a good sale price ($170 off the retail price).
Post by Coolverine on Sept 22, 2017 11:46:44 GMT -5
I might end up buying some new speakers; my 12-year-old Logitech Z-5500s have been having issues. I can't figure out if it's the sound card or the speakers themselves: sometimes the rear channels start popping and then the sound cuts out completely on all speakers. Sometimes I can still barely hear sound, but it's very quiet and distorted. I tried connecting them to my onboard audio a while back with the sound card disabled and the sound still cut out, so I'm leaning towards it being the speakers, although Windows 10 has had lots of audio problems in the past.

Also, once when the problem started, I tested the channels on the speakers while the PC was restarting and the sound was still not working; another reason I'm leaning towards it being the speakers. I've been looking at the Logitech Z906s; they're $259 on Amazon (retail price is $399), and local places here price-match online retailers. Still, I'd rather get more use out of these Z-5500s; they still sound great and are in great shape. It might be that the receiver gets too hot. I pointed a fan at the back of it, but it didn't seem to do anything. Since this thing's out of warranty, I might just open it up and see if anything looks burnt.

I had both analog and optical connected to it from my PC for troubleshooting and noticed that analog works more reliably than optical, but switching between them often causes the sound to cut out the same way. Right now I have only analog connected. I doubt it's the optical cable because it's brand new; optical was working fine for a month after I first encountered the problems. All I did was change the fuse in the subwoofer and reinstall the sound card (Sound Blaster Z) drivers, and it started working again.

Then just a few days ago the problem came back. It's been working 100% on analog since last night, and it's been on that whole time. I reinstalled the sound card drivers again and also updated my video card drivers. I'm not even going to try optical; every time I have, it causes the sound to mess up again.
Post by Emig5m on Oct 1, 2017 5:47:13 GMT -5
(Quoting Coolverine's speaker troubleshooting post above.) In that price range you're a million times better off with a pair of headphones...
Post by Deleted on Oct 1, 2017 12:06:28 GMT -5
(Replying to Emig5m's headphone suggestion.) www.bestbuy.com/site/bose-quietcomfort-35-wireless-headphones-ii-black/5876115.p?skuId=5876115
Post by maniac on Oct 1, 2017 12:48:43 GMT -5
I've got a pair of 598s; they're awesome, even though my girlfriend always thinks I'm listening to things too loud.
Post by Deleted on Oct 1, 2017 13:17:59 GMT -5
I'm sure the wired Sennheisers have outstanding sound quality. The software and wireless features make the Bose QC 35 II headphones so convenient, though. You can pair them with Bluetooth devices. If you have a video streaming device such as a Roku, you can download the Roku app to your phone, stream the sound to these headphones, and listen to movies from your couch without a cord. Google Assistant (voice command) lets you receive and send text messages by voice when paired with your phone, which eliminates the need to type messages, etc.