GPU red light/crashes
Strider20A replied to Strider20A's topic in Computers's Build & Technical Help
Currently I don't have it on hand, but I don't want to bother you further, so it is fine, thanks. Will let you know if I need any further assistance though 🙂 -
GPU red light/crashes
011010010110 replied to Strider20A's topic in Computers's Build & Technical Help
No worries. Do you want to run a GPU benchmark? I can authenticate my Steam account if you send me the QR code; we just need to coordinate a time. In most cases PCIe 3.0 is sufficient, though this one being an overclocked Ti could make it a bottleneck. I'm on Sydney time, 8am till 12pm. -
GPU red light/crashes
Strider20A replied to Strider20A's topic in Computers's Build & Technical Help
Hey, so I went into the BIOS and changed the PCIe slot speed from Auto to Gen 3, and this seems to have fixed the issue! I am guessing the riser cable is Gen 3 then; if so, would it be worthwhile to go to a Gen 4 cable or not really? Either way, thanks for the assistance, I really appreciate it! 🙂 -
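For anyone weighing Gen 3 against Gen 4 for a riser, the per-lane transfer rates are standardized, so the headroom is easy to estimate. A minimal sketch using the standard PCIe per-lane rates and encoding overheads (real-world throughput will be somewhat lower):

```python
# Approximate effective per-lane bandwidth in GB/s for each PCIe generation.
# Gen 1/2 use 8b/10b encoding; Gen 3+ use 128b/130b, hence the ratios below.
PCIE_LANE_GBPS = {
    1: 2.5 * 8 / 10 / 8,     # 0.25 GB/s per lane
    2: 5.0 * 8 / 10 / 8,     # 0.5 GB/s per lane
    3: 8.0 * 128 / 130 / 8,  # ~0.985 GB/s per lane
    4: 16.0 * 128 / 130 / 8, # ~1.969 GB/s per lane
}

def slot_bandwidth(gen: int, lanes: int = 16) -> float:
    """Effective one-way bandwidth in GB/s for a slot or riser link."""
    return PCIE_LANE_GBPS[gen] * lanes

# An x16 Gen 3 link still offers ~15.75 GB/s one way, which is rarely a
# bottleneck for a single gaming GPU; Gen 4 doubles that to ~31.51 GB/s.
print(round(slot_bandwidth(3), 2))  # 15.75
print(round(slot_bandwidth(4), 2))  # 31.51
```

In practice the doubling mostly matters for cards that spill over VRAM or run in reduced-lane slots, which is why a Gen 3 riser is often fine.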
GPU red light/crashes
Strider20A replied to Strider20A's topic in Computers's Build & Technical Help
But I am guessing that maybe there is something wrong with the riser cable, since under load the bus interface reports x16 @ 2.0 instead of 4.0? -
GPU red light/crashes
Strider20A replied to Strider20A's topic in Computers's Build & Technical Help
Would the riser cable lead to the lower power consumption for the GPU, though? These should be easier to see; this is shortly after booting the system. Based on these, is there any indication that maybe the PSU got damaged somehow and now isn't delivering the power it should? Or is it more likely a GPU failure? -
GPU red light/crashes
011010010110 replied to Strider20A's topic in Computers's Build & Technical Help
The thing that can catch us out is the riser cable. I have had this issue with the BBC running a 4060: at first everything was OK, and then it started to glitch out. The main issue with riser cables is that they allow interference on high-speed data transfer, which then causes crashes and nasty lag. The way we validated this was to put the GPU in a separate computer, where everything worked, then added the riser card and boom. The riser card should be the 4.0 standard; I have checked the Azzar site but it doesn't spec it either way. I have had PCs run on the 3.0 standard before, but that may have been under favorable conditions. I can't really see the large-screen diagnostics you posted. -
GPU red light/crashes
Strider20A replied to Strider20A's topic in Computers's Build & Technical Help
Based on these it's clear that the GPU is not getting enough power. I saw a max of around 110W consumption, but it typically fluctuates between 60-90W, with spikes in utilisation ranging between 17-100%. -
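One way to turn readings like these into a quick check is to compare the peak draw against the card's rated board power. A hypothetical sketch (the sample wattages mirror the figures above; the 285W rating and the 60% threshold are illustrative placeholders, not this card's real spec):

```python
# Sketch: flag a GPU that never approaches its rated board power under load.
# The sample readings, rating, and threshold below are illustrative placeholders.
def looks_power_starved(watt_samples, rated_watts, threshold=0.6):
    """True if peak draw under load stays well below the card's rating."""
    return max(watt_samples) < rated_watts * threshold

# Readings like those described above: 60-90 W typical, ~110 W peak
samples = [62, 75, 88, 110, 70, 64, 91]
print(looks_power_starved(samples, rated_watts=285))  # True
```

A healthy card under a sustained GPU-intensive load should push close to its power limit, so a peak this far below the rating points at delivery (cables, riser, PSU) or throttling rather than the workload.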
GPU red light/crashes
Strider20A replied to Strider20A's topic in Computers's Build & Technical Help
I plugged the HDMI into the motherboard and I am now getting an image, and the lag/stuttering is completely gone compared to when plugged into the GPU. -
GPU red light/crashes
Strider20A replied to Strider20A's topic in Computers's Build & Technical Help
Also, it looks like none of my DisplayPort and HDMI ports are working, and the PC will now also just shut off after a minute or two. I also forgot to mention that earlier, when I used the Memory Diagnostic Tool, it did detect memory issues as well. -
In the market for new gaming laptop, looking for advice.
Merzal-1414 replied to BeachDXD's topic in Computers's General
Hey, I would wait until the 50 series laptops are in full stock. I dug up some details about them and decided to write an article about it. Here you go! -
Best Gaming Laptops of 2025: A Comparison of RTX 5090 & 5080 Models
Merzal-1414 posted a blog entry in Computers's Articles
In 2025, the gaming laptop market has been invigorated by the introduction of NVIDIA's RTX 50-series GPUs, notably the RTX 5080 and RTX 5090. These GPUs, based on the Blackwell architecture, promise significant performance enhancements over their predecessors. This article delves into some of the top gaming laptops equipped with these cutting-edge GPUs, offering insights into their specifications and what sets them apart. There are lots of variations of each laptop, and most have AMD and Intel variants.

1. MSI Titan 18 HX AI
Starting Price: ~$5,000
(Image is of the 2024 laptop but is a good indicator of how the 2025 version will look.)
MSI's Titan series has long been synonymous with high-end gaming performance, and the Titan 18 HX AI continues this tradition.
Key Features:
- 18-inch Mini LED 4K display with 120Hz refresh rate
- Intel Core Ultra 9 275HX + RTX 5090 GPU
- Supports up to 96GB DDR5 RAM
- Advanced cooling system with dedicated heat pipes
- Customizable RGB lighting, including an illuminated touchpad
MSI has packed cutting-edge performance into a sleek, futuristic design. If you're looking for the best of the best, the Titan 18 HX AI is a beast for gaming, content creation, and AI-driven applications.

2. Asus ROG Strix Scar 18 (2025 Edition)
Estimated Price: ~$4,500
(Image is of the 2024 laptop but is a good indicator of how the 2025 version will look.)
The Asus ROG Strix Scar 18 remains one of the best laptops for competitive gaming.
Key Features:
- 18-inch QHD+ display with 240Hz refresh rate
- NVIDIA RTX 5090 GPU for ultra-smooth gaming
- Liquid metal cooling for better thermals
- RGB customization and stylish cyberpunk aesthetics
- High-speed PCIe Gen5 SSD for ultra-fast loading times
If you're into eSports, FPS gaming, or AAA titles, this laptop will dominate any game you throw at it.

3. Lenovo Legion Pro 7i Gen 9
Estimated Price: ~$3,800
(Image is of the 2024 laptop but is a good indicator of how the 2025 version will look.)
Lenovo's Legion series is known for its balance between performance and value, and the Legion Pro 7i Gen 9 is a solid choice.
Key Features:
- 16-inch Mini LED display (165Hz refresh rate)
- Intel Core i9-14900HX + RTX 5090 GPU
- Supports up to 64GB DDR5 RAM
- AI-powered cooling system to prevent overheating
- Sleek, professional design for work and gaming
If you need a high-performance gaming laptop that can also be used for content creation, this is a great choice.

4. Dell Alienware m18 R2
Estimated Price: ~$4,000
(Image is of the 2024 laptop but is a good indicator of how the 2025 version will look.)
Alienware is synonymous with premium gaming, and the m18 R2 brings flagship-level power with its RTX 5080 GPU.
Key Features:
- 18-inch QHD+ display (165Hz refresh rate)
- NVIDIA RTX 5080 GPU (high-end performance)
- Choice between Intel & AMD processors
- Advanced Cryo-Tech cooling system
- Signature AlienFX RGB lighting
If you want a powerful gaming laptop with Alienware aesthetics, the m18 R2 is a must-have.

5. Asus ROG Zephyrus G14
The Asus ROG Zephyrus G14 is a compact yet powerful gaming laptop, ideal for those who need high-end performance in a portable form factor.
Key Features:
- 14-inch Mini LED display with 165Hz refresh rate
- AMD Ryzen 9 7945HX + NVIDIA RTX 5080 GPU
- Supports up to 32GB DDR5 RAM
- Intelligent cooling with vapor chamber technology
- Sleek, lightweight design for portability
For gamers and content creators who value mobility without compromising power, the Zephyrus G14 is a top choice. Learn more: https://rog.asus.com/laptops/rog-zephyrus/rog-zephyrus-g14-2025/

My personal preference? The Asus ROG Zephyrus G14. Not only is its price usually a middle point between the Lenovo and MSI counterparts, I also trust the Republic of Gamers brand to understand what gamers want; especially with their handheld device range, they know what they are doing when it comes to compact computers optimised for gaming. This laptop features an AMD processor, is small enough to be lightweight and easy to carry, yet it's still a powerhouse! -
GPU red light/crashes
Strider20A replied to Strider20A's topic in Computers's Build & Technical Help
Hey, I have tried that and I'm still getting the red light. The red light does go off when powering the system on, though, so I am not sure whether the light just indicates it is getting power. Should I try using something like OCCT and HWMonitor to check the power delivery? -
In the market for new gaming laptop, looking for advice.
011010010110 replied to BeachDXD's topic in Computers's General
When you have a gaming laptop, do you move it around a lot? Why are you not a fan of what you have at the moment? -
ARM vs. x86: The Future of Computing
011010010110 commented on Merzal-1414's blog entry in Computers's Articles
That is a very interesting topic. What would really cause people to consider ARM for gaming is when they can fit 256+ ARM cores onto a single chip, just raw threading capability, or at least create a bridge capable of joining multiple chips/RAM units into a cluster that appears as a single machine, the way servers host multiple CPUs. -
GPU red light/crashes
011010010110 replied to Strider20A's topic in Computers's Build & Technical Help
There is no way for the GPU to detect at startup that the power supply is inadequate; there is no communication between them. The way cards used to monitor this, back when GPUs had 2 sockets instead of the new connector, was to test that both cables were plugged into the sockets provided. This could indicate a cable has not seated correctly or has shaken out of contact during the flight. You may not believe it, but I have seen first-hand parts unplug after a long flight. However, at this point I am guessing you have tried to reseat the cables for the GPU. One thing that could be happening in a power supply is that 2 sockets are bonded together on one circuit and another 2 sockets are bonded together on a separate circuit. It's far-fetched, but you could try moving one of the GPU cables across to a spare GPU socket. -
011010010110 joined the club
-
BeachDXD joined the club
-
Hi so when starting my PC up I have a solid red light at the GPU. And then when in the desktop making mouse movements and opening programs leads to random GPU spikes as well as audio stutters and mouse stutters. Whenever loading GPU intensive tasks such as games the PC will crash and restart and via latencyMon I am also getting very high latency. I have tried a full PC reset, uninstalling drivers with DDU, BIOS update...
-
Strider20A joined the club
-
NVIDIA 50 Series vs. 40 Series: Is the Upgrade Worth It?
Merzal-1414 posted a blog entry in Computers's Articles
The launch of NVIDIA's 50 series GPUs has sparked debates among gamers and tech enthusiasts. Many are questioning whether the latest generation offers a significant leap forward or just a minor iteration over the 40 series. The consensus among early adopters and benchmarks suggests that if you ignore frame generation technology, the raw performance gains might not be as groundbreaking as some had hoped.

Raw Performance: A Modest Bump?
Traditionally, each new NVIDIA GPU generation brings substantial improvements in power, efficiency, and architecture. However, initial comparisons show that the 50 series does not drastically outpace the 40 series in traditional rasterization performance. Benchmarks indicate that in games without DLSS 4's Multi Frame Generation, the 50 series cards deliver only around 15-33% higher FPS than their direct 40 series predecessors (reddit.com). While this is an improvement, it is far from the generational leaps seen in previous transitions, such as from the 30 series to the 40 series, where Ada Lovelace's efficiency and architectural gains were much more pronounced.

Ray Tracing Performance: Incremental Gains
Ray tracing has been a focal point of NVIDIA's GPU advancements, and while the 50 series does bring enhancements, they are not as revolutionary as one might expect. Without Multi Frame Generation, the performance delta remains relatively small, hovering around a 15% improvement in most ray-traced titles. The improved tensor cores and RT cores in the 50 series make ray-traced rendering slightly more efficient, but the leap is nowhere near what was seen when the 40 series first debuted.

Frame Generation: The Game Changer?
Much of the performance hype surrounding the 50 series revolves around DLSS 4's Multi Frame Generation technology. This feature artificially increases FPS by inserting AI-generated frames between real frames, significantly boosting smoothness and responsiveness. For games that support Multi Frame Generation, the perceived performance boost is massive, with some titles seeing up to an 8X increase in frame rate compared to traditional rendering methods (nvidia.com). However, the catch is that Multi Frame Generation does not contribute to raw rendering power; it simply increases perceived fluidity. For purists who rely on raw GPU horsepower without AI intervention, this can be a disappointing reality.

Power Efficiency: A Small Step Forward
One notable improvement in the 50 series is power efficiency. NVIDIA's latest architecture provides better performance-per-watt, meaning that despite relatively modest raw FPS improvements, the 50 series operates at lower power consumption compared to equivalent 40 series GPUs. This could result in cooler, quieter systems with lower energy bills, but whether that alone justifies an upgrade is debatable.

VRAM & Future-Proofing: Worth Considering?
A key argument in favor of upgrading to the 50 series is VRAM capacity. Many 40 series cards suffered from limited VRAM, particularly models like the RTX 4060 Ti with only 8GB, which struggled in modern high-resolution gaming. The 50 series increases VRAM across the lineup, making it a better long-term investment for future titles that demand more memory.

Should You Upgrade?
Whether or not upgrading to the 50 series is worth it depends on your use case:
- If you are already using a high-end 40 series GPU (RTX 4080, 4090): the upgrade might not be worth it unless you rely heavily on Multi Frame Generation.
- If you are on an older 30 series or lower-tier 40 series card: the 50 series might provide a worthwhile boost, especially with better VRAM and efficiency.
- If you care about raw rasterization and ignore Frame Generation: the performance increase is modest, and it might not feel like a major leap.
- If you play games that support Frame Generation: the experience will feel significantly smoother, making the upgrade much more enticing.
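The arithmetic behind the frame generation figures is simple: DLSS 4's Multi Frame Generation can insert up to three AI-generated frames per rendered frame, so the displayed rate can quadruple while the rendered rate stays flat. A quick sketch of displayed versus rendered FPS:

```python
# Displayed FPS when frame generation inserts AI frames between rendered ones.
# Note: generated frames add perceived fluidity, not raw rendering power.
def displayed_fps(rendered_fps: float, generated_per_rendered: int) -> float:
    return rendered_fps * (1 + generated_per_rendered)

print(displayed_fps(60.0, 0))  # no frame generation: 60.0
print(displayed_fps(60.0, 3))  # 4x mode: 240.0 displayed, still 60 rendered
```

Combined with DLSS upscaling, this multiplication is how headline figures like "8X" arise from much smaller raw rasterization gains.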
Conclusion: Evolution, Not Revolution
The NVIDIA 50 series is not a groundbreaking leap forward in terms of raw performance. If you strip away DLSS and Frame Generation, the difference between the 40 and 50 series is relatively minor. However, for gamers who embrace AI-driven enhancements, Multi Frame Generation makes the 50 series feel like a much bigger upgrade than it actually is in raw specs. Ultimately, the decision to upgrade boils down to how much you value AI-enhanced gaming vs. traditional rasterized performance. If you're in the market for a new GPU, you'll need to weigh these factors carefully before deciding if the 50 series is worth the investment. -
For decades, x86 has dominated the world of personal computing, powering everything from desktop PCs to high-performance servers. However, in recent years, ARM architecture has been making significant strides, particularly in mobile devices, tablets, and now even laptops and servers. With Apple's transition to ARM-based M-series chips and Microsoft's increasing investment in ARM-powered Windows, the tech industry is at a crossroads. Is ARM the future, or will x86 continue to hold its ground?

Understanding x86 and ARM Architectures
Before diving into the future of computing, it's crucial to understand what differentiates x86 from ARM.

x86: The Traditional Powerhouse
x86 is a Complex Instruction Set Computing (CISC) architecture designed by Intel and AMD. It is optimized for high performance and flexibility, making it ideal for:
- High-end gaming PCs and workstations
- Enterprise-grade servers and cloud computing
- Applications requiring raw processing power, like video editing and 3D rendering
However, x86 chips tend to be power-hungry and generate significant heat, making them less ideal for mobile devices and ultra-thin laptops.

ARM: The Power-Efficient Contender
ARM, on the other hand, is a Reduced Instruction Set Computing (RISC) architecture. Unlike x86, ARM chips prioritize power efficiency and battery life, making them dominant in:
- Smartphones and tablets
- Smart devices (IoT)
- Energy-efficient laptops like Apple's MacBook Air and Qualcomm-powered Windows devices
ARM's modular, licensing-based business model allows companies like Apple, Qualcomm, and Nvidia to customize and optimize their own processors, leading to greater efficiency and specialization.

Why ARM is Gaining Traction
1. Apple's M-Series Chips
Apple's transition from Intel x86 chips to its custom-built ARM-based M1, M2, and now M3 chips proved that ARM can compete with x86 in both performance and power efficiency. These chips:
- Deliver desktop-class performance with laptop-class power efficiency.
- Have outperformed Intel chips in many real-world applications, including video rendering and software development.
- Offer superior battery life, with MacBooks running up to 20 hours on a single charge.

2. Microsoft and Qualcomm's Push for ARM Windows
Historically, Windows on ARM has struggled with app compatibility and performance. However, Microsoft has made significant strides, with Qualcomm's Snapdragon X Elite promising high-performance ARM-based Windows laptops in 2024. Key improvements include:
- Better x86 emulation for running legacy applications.
- Native ARM versions of Windows apps from major developers.
- Extended battery life, rivaling MacBooks.

3. Cloud Computing and ARM Servers
Tech giants like Amazon (AWS Graviton), Google, and Microsoft are adopting ARM for cloud computing, benefiting from:
- Lower power consumption, reducing data center costs.
- Increased performance per watt compared to traditional x86-based servers.
- Customizability for specific workloads like AI and machine learning.

Challenges for ARM in a Dominant x86 Market
Despite ARM's rapid growth, it still faces significant challenges:
- Software Compatibility: Many enterprise applications and games are still optimized for x86, requiring emulation on ARM.
- Industry Momentum: x86 has decades of software and hardware support, making transitions complex for businesses.
- High-Performance Computing (HPC): While ARM is making strides, x86 still holds the edge in raw processing power for certain workloads like high-frequency trading and advanced AI training.

The Future: A Hybrid Landscape?
Rather than a total displacement of x86, the future may see a hybrid computing landscape, where both architectures coexist:
- ARM for Consumer and Mobile Computing: With growing efficiency and performance, ARM will likely dominate ultra-portable laptops, tablets, and energy-conscious servers.
- x86 for High-Performance Applications: Workstations, high-end gaming PCs, and specific enterprise applications may continue relying on x86's computational strength.
- More ARM-based Laptops and Desktops: As Microsoft and software developers optimize for ARM, we may see ARM-powered PCs becoming mainstream competitors to Intel and AMD.

Conclusion
ARM's rise is reshaping the computing industry, challenging the decades-long dominance of x86. While x86 remains a stronghold in performance-driven markets, ARM is proving its capabilities in power efficiency, mobile computing, and even high-end performance scenarios. The coming years will determine whether x86 adapts to the power-efficient world or if ARM will ultimately take over. Regardless of the outcome, one thing is clear: the future of computing is no longer a one-horse race.
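As a practical footnote to the article: checking which of the two architectures your own machine runs is a one-liner in most languages. A small Python sketch using only the standard library:

```python
import platform

# Classify the current machine's CPU architecture as ARM or x86.
def arch_family() -> str:
    m = platform.machine().lower()
    if m in ("arm64", "aarch64") or m.startswith("arm"):
        return "ARM"
    if m in ("x86_64", "amd64", "i386", "i686", "x86"):
        return "x86"
    return m  # something else entirely (RISC-V, POWER, ...)

print(arch_family())
```

Running this on an Apple Silicon Mac reports "ARM", while most Windows and Linux desktops today still report "x86".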
-
mwh.Tarin joined the club
-
Josue8810 joined the club
-
Kittrix#6146 joined the club
-
t1 joined the club
-
6wrobel joined the club
-
Bigfisch77 joined the club
-
Image Credits: NVIDIA, Puget Systems, HotHardware