[META] New to PC Building? - September 2018 Edition
You've heard from all your gaming friends/family or co-workers that custom PCs are the way to go. Or maybe you've been fed up with your HP, Dell, Acer, Gateway, Lenovo, etc. pre-builts or Macs and want some more quality and value in your next PC purchase. Or maybe you haven't built a PC in a long time and want to get back into the game. Well, here's a good place to start.
Make a budget for your PC (e.g., $800, $1000, $1250, $1500, etc.).
Decide what you will use your PC for.
For gaming, decide what games and at what resolution and FPS you want to play at.
For productivity, decide what software you'll need and find the recommended specs to use those apps.
For a bit of both, base your build on the HIGHEST specs recommended across your applications (e.g., if you only play Fortnite but need CPU power for CFD simulations, use the specs recommended for CFD).
Here are some rough estimates for builds with entirely NEW parts:
1080p 60FPS ultra-settings modern AAA gaming: ~$1,200
1440p 60FPS high/ultra-settings modern AAA gaming: ~$1,600
1080p 144FPS ultra-settings modern AAA gaming: ~$2,000
4K 50FPS medium/high-settings modern AAA gaming: > $2,400
Some compromises (e.g., lower settings and/or resolution) can get you the same or a slightly lower-quality gaming experience within ±15% of the above prices. You can also get higher FPS on older or used PCs by lowering settings and/or resolution AND/OR buying new/used parts to upgrade your system; make a new topic about it if you're interested. Also note that AAA gaming is different from e-sports games like CSGO, DOTA2, Fortnite, HOTS, LoL, Overwatch, R6S, etc. Those games have lower requirements and can make do with smaller budgets.
Revise your budget AND/OR resolution and FPS until both are compatible. Compare this to the recommended requirements of the most demanding game on your list. For older games, you might be able to lower your budget. For others, you might have to increase your budget. It helps to watch gaming benchmarks on Youtube. A good example of what you're looking for is something like this (https://www.youtube.com/watch?v=9eLxSOoSdjY). Take note of the resolution, settings, FPS, and the specs in the video title/description; ask yourself if the better gaming experience is worth increasing your budget OR if you're okay with lower settings and lowering your budget. Note that you won't be able to see FPS higher than 60FPS for Youtube videos; something like this would have to be seen in-person at a computer shop.
After procuring your parts, it's time to build. Use a good Youtube tutorial like this (https://www.youtube.com/watch?v=IhX0fOUYd8Q) that teaches BAPC fundamentals, but always refer to your product manuals or other Youtube tutorials for part-specific instructions like CPU mounting, radiator mounting, CMOS resetting, etc. If everything still seems overwhelming, you can always pay a computer shop or a friend/family member to build it for you. It might also be smart to look up some first-time building mistakes to avoid:
If you have any other questions, use the search bar first. If it's not there, make a topic.
BAPC News (Last Updated - 2018/09/20)
https://www.tomshardware.com/news/intel-9000-series-cpu-faq,37743.html Intel 9000 CPUs (Coffee Lake Refresh) will be coming out in Q4. With the exception of the i9 (8 cores, 16 threads) flagship CPUs, the i3, i5, and i7 lineups are almost identical to their Intel 8000 (Coffee Lake) counterparts, just clocked slightly faster. If you are wondering whether you should upgrade to the newer CPU on the same tier (e.g., i5-8400 to i5-9400), I don't recommend it, as you will only see marginal performance increases.
https://www.youtube.com/watch?v=WDrpsv0QIR0 RTX 2080 and 2080 Ti benchmarks are out; they deliver roughly 10 and 20 more FPS than the 1080 Ti, respectively, and also feature ray tracing (superior lighting and shadow effects), which only ~30 games support so far (i.e., not widely supported). Effectively, they provide ~25% more performance for ~70% more cost. My recommendation is NOT to buy them unless you need one for work or have lots of disposable income. The GTX 1000 (Pascal) series is still relevant for today's gaming specs.
The calculator part. More GHz is analogous to faster fingers crunching numbers on the calculator. More cores is analogous to having more calculators. More threads is analogous to having more filing clerks lining up work for the calculators to do. Microarchitecture (core design) is analogous to how the internal circuitry of the calculator is designed (e.g., AMD FX series CPUs are slower than their Intel equivalents even at higher OC'd GHz speeds because the core design is subpar). All three are important in determining CPU speed. In general, higher GHz matters more for gaming right now, whereas core and thread counts matter more for multitasking like streaming, video editing, and advanced scientific/engineering computations. Core designs from both AMD and Intel are very good in their most recent products, but it's something to keep in mind.
The basic concept of overclocking (OCing) is to feed your CPU more power through increased voltage in the hope that it does calculations faster. Whether your parts are good overclockers depends on the manufacturing of your specific part; slight variations in materials and manufacturing process result in different overclocking capability (the "silicon lottery"). The downsides are that you can void your warranties, that the excess heat produced will decrease the lifespan of your parts, and that finding stable OC settings is a trial-and-error process. Unstable OC settings result in computer freezes or random shut-offs from excess heat. OCing will give you extra performance, often for free or by investing in a CPU cooler to control your temperatures so the excess heat doesn't shorten your parts' lifespans as much. If you don't know how to OC, don't do it.
Intel CPUs have higher GHz than AMD CPUs, which makes them better for gaming purposes. However, AMD Ryzen CPUs have more cores and threads than their Intel equivalents. The new parts are AMD Ryzen 3, 5, or 7 2000 series or Intel i3, i5, or i7 8000 series (Coffee Lake); everything else is outdated. If you want to overclock on an AMD system, know that you can get a moderate OC on a B350/B450 with all CPUs. X370/X470 mobos usually come with better VRMs meant for OCing the 2600X, 2700, and 2700X. If you don't know how to OC, know that the -X AMD CPUs can OC themselves automatically without manual settings. For Intel systems, you cannot OC unless the CPU is an unlocked -K chip (e.g., i3-8350K, i5-8600K, i7-8700K, etc.) AND the motherboard is a Z370 mobo. In general, it is not worth getting a Z370 mobo UNLESS you are getting an i5-8600K or i7-8700K.
CPU and Mobo Compatibility
Note about Ryzen 2000 CPUs on B350 mobos: yes, you CAN pair them up since they use the same socket. You might get an error message on PCPP that says that they might not be compatible. Call the retailer and ask if the mobo you're planning on buying has a "Ryzen 2000 Series Ready" sticker on the box. This SHOULD NOT be a problem with any mobos manufactured after February 2018. Note about Intel 9000 CPUs on B360 / Z370 mobos: same as above with Ryzen 2000 CPUs on B350 or X370 boards.
CPU Cooler (Air / Liquid)
Air or liquid cooling for your CPU. This is mostly optional unless heavy OCing on AMD Ryzen CPUs and/or on Intel -K and i7-8700 CPUs. For more information about air and liquid cooling comparisons, see here:
Part that lets all the parts talk to each other. Comes in different sizes from small to big: mITX, mATX, ATX, and eATX. For most people, mATX is cost-effective and does the job perfectly. If you need more features like extra USB slots, go for an ATX. mITX is for those who want a really small form factor and are willing to pay a premium for it. eATX mobos are like ATX mobos except that they have more features and are bigger - meant for super PC enthusiasts who need the features.
AMD Ryzen CPUs: go for X470s for Ryzen 7 and B450s for everything else. B350s will also work as a sub for B450 mobos and the same can be said for X370s for X470s, but they are being phased out and may require a BIOS update to support the Ryzen 2000 CPUs if it doesn't have a "Ryzen 2000 Series Ready" sticker on the box.
Intel Coffee Lake CPUs: go for Z370s for unlocked -K CPUs and B360s for everything else.
If you are NOT OCing, pick whatever is cheap and meets your specs. I recommend ASUS or MSI because they have RMA centres in Canada in case it breaks whereas other parts are outside of Canada like in the US. If you are OCing, then you need to look at the quality of the VRMs because those will greatly influence the stability and lifespan of your parts.
Part that keeps Windows and your software active. Currently runs on the DDR4 platform for new builds. Go for dual channel whenever possible. Here's a breakdown of how much RAM you need:
2x4GB = 8GB is the minimum recommended
2x8GB = 16GB recommended for gaming
2x16GB+ for workstations
AMD Ryzen CPUs gain extra FPS from faster RAM (ideally 3200MHz) in gaming when paired with powerful video cards like the GTX 1070. Intel Coffee Lake CPUs support a max of 2666MHz RAM on B360 mobos. Higher-end Z370 mobos can support 4000 - 4333MHz RAM depending on the mobo, so make sure you shop carefully! Note that RAM prices are highly inflated because of the smartphone industry and possibly artificial supply shortages. For more information: https://www.extremetech.com/computing/263031-ram-prices-roof-stuck-way
Part that stores your files, in the form of SSDs and HDDs.
Solid State Drives (SSDs)
SSDs are incredibly quick but expensive per TB; they are good for booting Windows and for reducing loading times in games. For an old OEM pre-built, adding an SSD is the single greatest speed boost you can give your system. For most people, make sure the SSD you get is NOT DRAM-less, as those SSDs do not last as long as their DRAM-equipped counterparts (https://www.youtube.com/watch?v=ybIXsrLCgdM). Note also that the bigger the capacity of the SSD, the faster it is. SSDs come in four forms:
2.5" SATA III
M.2 SATA
M.2 NVMe PCI-e
U.2
The 2.5" SATA form is cheaper, but it is the old format with speeds up to 550MB/s. M.2 SATA SSDs have the same transfer speeds as 2.5" SATA SSDs since they use the SATA interface, but connect directly to the mobo without a cable. It's better for cable management to get an M.2 SATA SSD over a 2.5" SATA III SSD. M.2 PCI-e SSDs are the newest SSD format and transfer up to 4GB/s depending on the PCI-e lanes they use (e.g., 1x, 2x, 4x, etc.). They're great for moving large files (e.g., 4K video production). For more info about U.2 drives, see this post (https://www.reddit.com/bapccanada/comments/8jxfqs/meta_new_to_pc_building_may_2018_edition/dzqj5ks/). Currently more common for enterprise builds, but could see some usage in consumer builds.
Hard Disk Drives (HDDs)
HDDs are slow, with transfer speeds of ~100MB/s, but are cheap per TB compared to SSDs. We are now at SATA III, which has a max theoretical transfer rate of 600MB/s. HDDs also come in 5400RPM and 7200RPM forms: 5400RPM drives use slightly less power and are cheaper, but aren't as fast at dealing with a large number of small files as 7200RPM drives. When dealing with a small number of large files, they have roughly equivalent performance. Note that even a 10,000RPM HDD will still be slower than an average 2.5" SATA III SSD.
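To put these transfer rates in perspective, here's a quick back-of-envelope comparison of how long each drive type takes to read a large game install. The 50GB game size and the NVMe figure of 3500MB/s are illustrative assumptions, not measurements:

```python
# Approximate time to read a 50GB game at typical sustained speeds.
# All speed figures are rough, illustrative assumptions.
GAME_GB = 50

speeds_mb_s = {
    'HDD (~100MB/s)': 100,
    '2.5" SATA SSD (~550MB/s)': 550,
    'M.2 NVMe SSD (~3500MB/s)': 3500,
}

for name, mb_s in speeds_mb_s.items():
    seconds = GAME_GB * 1000 / mb_s  # 50GB = 50,000MB
    print(f"{name}: ~{seconds:.0f}s")
```

In other words, the same data that takes over 8 minutes to read off an HDD takes about a minute and a half on a SATA SSD, which is why the SSD boot/load-time upgrade is so noticeable.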
SSHDs are hybrids of SSDs and HDDs. Although they seem like a good combination, it's much better in all cases to get a dedicated SSD and a dedicated HDD instead, because the $/speed is better for SSDs and the $/TB is better for HDDs. The same can be said for Intel Optane. Both have their uses, but for most users, they aren't worth it.
I recommend a 2.5" or M.2 SATA ≥ 250GB DRAM SSD and a 1TB or 2TB 7200RPM HDD configuration for most users for a balance of speed and storage capacity.
Part that runs the complex calculations in games and outputs to your monitor; it is usually the most expensive part of the budget. The GPU you pick is dictated by the gaming resolution and FPS you want to play at. In general, all video cards of the same product name have almost the same non-OC'd performance (e.g., the Asus Dual-GTX1060-06G has the same performance as the EVGA 06G-P4-6163-KR SC GAMING). Different sizes and fan counts DO affect GPU OCing capability, however. The most important thing here is to get an open-air video card, NOT a blower video card (https://www.youtube.com/watch?v=0domMRFG1Rw); blower cards are meant for upgrading pre-builts where case airflow is limited. For cost-performance, go for NVIDIA GTX cards, because cryptomining has inflated AMD RX card prices. Bitcoin has recently taken a ~20% hit from January's $10,000+, but the cryptomining industry is still ongoing. Luckily, this means prices have nearly corrected themselves to the original 2016 MSRPs. In general:
Part that houses your parts and protects them from the environment. It should often be the last part you choose, because the selection is big enough to be compatible with any build as long as the case is equal to or bigger than the mobo form factor. Things to consider: aesthetics, case airflow, cable management, material, cooling options (radiator or fan mounts), # fans included, # drive bays, toolless installation, power supply shroud, GPU clearance length, window if applicable (e.g., acrylic, tempered glass), etc. It is recommended to watch or read case reviews on Youtube to get an idea of a case's performance in your setup.
Part that powers your PC from the wall socket. Never go with a non-reputable/cheap brand for this part, as a low-quality PSU can damage your other parts. Generally recommended PSU brands are Corsair, EVGA, Seasonic, and Thermaltake. For a tier list, see here (https://linustechtips.com/main/topic/631048-psu-tier-list-updated/).
Wattage depends on the video card chosen, whether you plan to OC, and/or whether you plan to upgrade to a more powerful video card in the future. Here's a rule of thumb for non-OC wattages that meet NVIDIA's recommendations:
1050 Ti: 300W
1060 3GB/6GB: 400W
1070 / 1070 Ti: 500W
1080 Ti: 600W
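As a rough sketch of how you might apply the rule-of-thumb table above, here's a hypothetical wattage picker. The ~20% OC headroom factor and the rounding to 50W retail steps are my own illustrative assumptions, not NVIDIA's methodology:

```python
# Rough PSU wattage picker based on the rule-of-thumb table above.
# The OC headroom factor and 50W rounding are illustrative assumptions.
GPU_RECOMMENDED_PSU_W = {
    "1050 Ti": 300,
    "1060": 400,
    "1070": 500,
    "1070 Ti": 500,
    "1080 Ti": 600,
}

def recommend_psu(gpu: str, overclocking: bool = False) -> int:
    """Return a suggested PSU wattage, rounded up to the next 50W step."""
    base = GPU_RECOMMENDED_PSU_W[gpu]
    if overclocking:
        base = int(base * 1.2)  # assumed ~20% extra headroom for OCing
    # round UP to the nearest 50W, since PSUs are sold in 50W steps
    return -(-base // 50) * 50

print(recommend_psu("1070 Ti"))                      # 500
print(recommend_psu("1080 Ti", overclocking=True))   # 750
```

This is just the table in code form; the calculators linked below will give you a more tailored estimate.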
There are also PSU wattage calculators that you can use to estimate your wattage. How much wattage you use depends on your PC parts, how much OCing you're doing, your peripherals (e.g., gaming mouse and keyboard), how long you plan to leave your computer running, etc. Note that these calculators use conservative estimates, so treat the outputted wattage as a baseline for how much you need. Here are the calculators (thanks, VitaminDeity).
Pick ONE calculator and use its recommended wattage, NOT its recommended product, as a baseline for what wattage your build needs. Note that Cooler Master and Seasonic use the exact same calculator as Outervision. For more details about wattage, here are some reference videos:
You might also see some info about modularity (non-modular, semi-modular, or fully-modular). This describes whether the cables come permanently attached to the PSU or can be detached as you choose. Non-modular PSUs have ALL of the cables attached, with no option to remove unneeded ones. Semi-modular PSUs have separate cables for HDDs/SSDs and PCI-e connectors, but the CPU and mobo cables are attached. Fully-modular PSUs have all of their cables detachable, giving you full control over cable management. Note that with decent cooling and airflow in your case, cable management has little effect on your temperatures (https://www.youtube.com/watch?v=YDCMMf-_ASE).
80+ Efficiency Ratings
As for ratings (80+, 80+ Bronze, 80+ Gold, 80+ Platinum), these describe the efficiency of your PSU. Please see here for more information. Looking purely at electricity costs, an 80+ Gold PSU costs more upfront than an 80+ Bronze PSU and only breaks even for the average Canadian user after about 6 years (assuming 8 hours/day of usage), but the better performance, longer warranty periods, durable build quality, and extra features like fanless operation are often worth the premium. In general, the rule of thumb is 80+ Bronze for entry-level office PCs and 80+ Gold for mid-tier or higher gaming/workstation builds. If the price difference between an 80+ Bronze PSU and an 80+ Gold PSU is < 20%, get the 80+ Gold PSU!
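To make the ~6-year breakeven claim concrete, here's a back-of-envelope sketch. The load, electricity rate, efficiency figures, and price premium below are all illustrative assumptions I've picked to show the method, not measured values:

```python
# Back-of-envelope breakeven between an 80+ Bronze and 80+ Gold PSU.
# ALL numbers below are illustrative assumptions.
LOAD_W = 300          # average power the components actually draw
HOURS_PER_DAY = 8
RATE_PER_KWH = 0.13   # CAD per kWh, rough Canadian average
BRONZE_EFF = 0.85     # typical 80+ Bronze efficiency around 50% load
GOLD_EFF = 0.90       # typical 80+ Gold efficiency around 50% load
PRICE_PREMIUM = 45.0  # assumed extra upfront cost of the Gold unit (CAD)

def annual_cost(efficiency: float) -> float:
    # The PSU pulls more from the wall than it delivers to the parts.
    wall_watts = LOAD_W / efficiency
    kwh_per_year = wall_watts * HOURS_PER_DAY * 365 / 1000
    return kwh_per_year * RATE_PER_KWH

savings_per_year = annual_cost(BRONZE_EFF) - annual_cost(GOLD_EFF)
print(f"Gold saves ~${savings_per_year:.2f}/year; "
      f"breakeven after ~{PRICE_PREMIUM / savings_per_year:.1f} years")
```

With these assumed numbers the Gold unit pays back its premium in roughly 6 years, which lines up with the rule of thumb above; heavier usage or pricier electricity shortens that.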
Warranties should also be looked at when shopping for PSUs; longer warranties generally indicate better build quality. In general, for 80+ Bronze and Gold units from reputable brands:
These guys are engineering experts who take apart PSUs, analyze the quality of each product, and provide an evaluation of the product. Another great website is http://www.orionpsudb.com/, which shows which PSUs are manufactured by different OEMs.
Operating System (OS)
The most common OS. You can download the ISO here (https://www.microsoft.com/en-ca/software-download/windows10). For instructions on how to install the ISO from a USB drive, see here (https://docs.microsoft.com/en-us/windows-hardware/manufacture/desktop/install-windows-from-a-usb-flash-drive) or watch a video here (https://www.youtube.com/watch?v=gLfnuE1unS8). For most users, go with the 64-bit version.

If you purchase a Windows 10 retail key (i.e., you buy it from a retailer or from Microsoft directly), keep in mind that you can transfer it between builds. So if you're building another PC for the 2nd, 3rd, etc. time, you can reuse the key for those builds PROVIDED that you deactivate the key before installing it on your new PC. These keys are ~$120.

However, if you have an OEM key (e.g., pre-builts), that key is tied specifically to your mobo. If you ever decide to upgrade the mobo on that pre-built PC, you might have to buy a new Windows 10 license. For more information, see this post (https://www.techadvisor.co.uk/feature/windows/windows-10-oem-or-retail-3665849/). The cheaper Windows 10 keys you can find on Kinguin are OEM keys; activating and deactivating them may require phoning an automated Microsoft activation line. Most of these keys are legitimate and cost ~$35, although Microsoft does not intend for home users to obtain this version. Buyer beware.

The last type of key is a volume licensing key. These are licensed in large volumes for corporate or commercial usage. You can find lots of them on eBay for ~$10, but if the IT department that manages these keys audits who is using them, or if the number of activations exceeds the number allotted to that one key, Microsoft could block the key and invalidate your license. Buyer beware. For more information on differentiating between all three types of keys, see this page (https://www.tenforums.com/tutorials/49586-determine-if-windows-license-type-oem-retail-volume.html).
If money is tight, you can download Windows 10 from Microsoft and run it unactivated indefinitely. However, there will be a watermark in the bottom-right of your screen until you activate Windows with a key.
If you're interested in using macOS, look into Hackintosh builds. These let you run macOS on PC parts, saving you lots of money. Hackintosh builds are pretty picky about part compatibility, so you might run into some headaches going this route. For more information, see the following links:
Please note that cost-performance builds change daily because PC part prices change often! Some builds will have excellent cost-performance one day and terrible cost-performance the next; if you go down this route, it's your responsibility to keep on top of it. Also, DO NOT PM me with PC build requests! It is in your best interest to make your own topic so you can get multiple suggestions and input from the community rather than just my own. Thanks again.
Here are some sample builds that are reliable, but may not be cost-optimized builds. These builds were created on September 9, 2018; feel free to "edit this part list" and create your own builds.
Updated sample builds to include both AMD and Intel builds
Sorry for the lack of updates. I recently got a new job where I work 12 hours/day for 7 days at a time out of the city. What little spare time I have is spent on grad school and the gym instead of gaming. So I've been pretty behind on the news and some might not be up-to-date as my standards would have been with less commitments. If I've made any mistakes, please understand it might take a while for me to correct them. Thank you!
Console gaming is hardly different from PC gaming, and much of what people say about PC gaming to put it above console gaming is often wrong.
I'm not sure about you, but for the past few years, I've been hearing people go on and on about PCs' "superiority" over the console market. People cite various reasons why they believe gaming on a PC is "objectively" better than console gaming, often related to power, cost, ease of use, and freedom. ...Only problem: much of what they say is wrong. There are many misconceptions being thrown around about PC gaming vs console gaming that I believe need to be addressed. This isn't about "PC gamers being wrong" or "consoles being the best," absolutely not. I just want to cut through some of the stuff people use to put down console gaming, and show that console gaming is incredibly similar to PC gaming. Yes, this is coming from someone who mainly games on console, but I'm also getting a new PC that I will game on, not to mention the 30 PC games I already own and play. I'm not particularly partial to one over the other. I will mainly focus on the PlayStation side of the consoles, because I know it best, but much of what I say applies to Xbox as well. Just because I don't point out many specific Xbox examples doesn't mean they aren't out there.
“PCs can use TVs and monitors.”
This one isn't so much a misconception as it is the implication of one, and overall just... confusing. It appears in some articles and in the pcmasterrace "why choose a PC" section, where they're practically implying that consoles can't do this. Yes, as long as the ports on your PC match your screen's inputs, you can plug a PC into either... but you could do the same with a console, again, as long as the ports match. I'm guessing the idea here is that gaming monitors often use DisplayPort, as do most dedicated GPUs, while consoles are generally restricted to HDMI... But even so, monitors often have HDMI ports. In fact, PC Magazine has just released their list of the best gaming monitors of 2017, and every single one of them has an HDMI port. A PS4 can be plugged into these just as easily as a GTX 1080. Even if the monitor/TV doesn't have HDMI or AV to connect with your console, just use an adapter. If you have a PC with ports that don't match your monitor/TV... use an adapter. I don't know what the point of this argument is, but it's made a worrying amount of times.
“On PC, you have a wide range of controller options, but on console you’re stuck with the standard controller."
Are you on PlayStation and wish you could use a specific type of controller that suits your favorite kind of gameplay? Despite what some may believe, you have just as many options as PC. Want to play fighting games with a classic arcade-style board, featuring the buttons and joystick? Here you go! Want to get serious about racing with something more accurate and immersive than a controller? Got you covered. Absolutely crazy about flying games and, like the racers, want something better than a controller? Enjoy! Want Wii-style motion controls? They've been around since the PS3. If you prefer the form factor of the Xbox One controller but you own a PS4, Hori's got you covered. And of course, if keyboard and mouse is what keeps you on PC, there's a PlayStation-compatible solution for that. Want to use the keyboard and mouse that you already own? Where there's a will, there's a way. Of course, these aren't isolated examples; there are plenty of options for each of these kinds of controllers. You don't have to be on PC to enjoy alternate controllers.
“On PC you could use Steam Link to play anywhere in your house and share games with others.”
PS4 Remote play app on PC/Mac, PSTV, and PS Vita. PS Family Sharing. Using the same PSN account on multiple PS4s/Xbox Ones and PS3s/360s, or using multiple accounts on the same console. In fact, if multiple users are on the same PS4, only one has to buy the game for both users to play it on that one PS4. On top of that, only one of them has to have PS Plus for both to play online (if the one with PS Plus registers the PS4 as their main system). PS4 Share Play; if two people on separate PS4s want to play a game together that only one of them owns, they can join a Party and the owner of the game can have their friend play with them in the game. Need I say more?
“Gaming is more expensive on console.”
Part one: the Software. This is one that I find... genuinely surprising. There have been a few times I've mentioned that part of the reason I chose a PS4 is for budget gaming, only to be told that "games are cheaper on Steam." To be fair, there are a few games on PSN/XBL that are more expensive than they are on Steam, so I can see how someone could believe this... but apparently they forgot about discs. Dirt Rally, a hardcore racing sim game, is... still $60 on all 3 platforms digitally, even though its successor is out.
See my point? Oftentimes the game is cheaper on console because of the disc alternative that's available for practically every console-available game, even when the game is brand new. Dirt 4 - remember that Dirt Rally successor I mentioned?
Yes, you could either buy this relatively new game digitally for $60, or just pick up the disc for a discounted price. And again, this is for a game that came out 2 months ago, while its predecessor's digital cost is locked at $60. Of course, I'm not going to ignore the fact that Dirt 4 is currently (as of writing this) discounted on Steam, but on PSN it also happens to be discounted by about the same amount.

Part 2: the Subscription

Now... let's not ignore the elephant in the room: PS Plus and Xbox Gold. These would be ignorable if they weren't required for online play (on the PlayStation side, it's only required on PS4, but still). So yes, it's still something that will be included in the cost of your PS4 or Xbox One/360, assuming you play online. Bummer, right? Here's the thing: although you have to factor this $60/year cost into your console, you can make it balance out at worst, and make it work for you as a budget gamer at best. As nice as it would be to not have to deal with the price, it's not a problem if you use it correctly. Imagine going to a new restaurant. This restaurant has some meals that you can't get anywhere else, and fair prices compared to competitors. Only problem: you have to pay a membership fee to have the sides. You can have the main course, sit down and enjoy your steak or pasta, but if you want a side to make a full meal, you have to pay an annual fee. Sounds shitty, right? But here's the thing: not only does this membership allow you to have sides with your meal, it also lets you eat two meals for free every month, and gives you exclusive discounts on other meals, drinks, and desserts. Let's look at PS Plus for a minute: for $60 per year, you get:
2 free PS4 games, every month
2 free PS3 games, every month
1 PS4/PS3 and Vita compatible game, and 1 Vita-only game, every month
Exclusive/Extended discounts, especially during the weekly/seasonal sales (though you don’t need PS Plus to get sales, PS Plus members get to enjoy the best sales)
access to online multiplayer
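For what it's worth, the yearly tally from the list above is just arithmetic (nothing assumed beyond the counts listed):

```python
# Tally of the monthly PS Plus freebies from the list above
ps4_games = 2           # free PS4 games per month
ps3_games = 2           # free PS3 games per month
vita_cross = 1          # PS4/PS3 + Vita compatible game per month
vita_only = 1           # Vita-only game per month

per_month = ps4_games + ps3_games + vita_cross + vita_only
per_year = per_month * 12
print(per_year)  # games per year across all three platforms
```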
So yes, you're paying extra because of that membership, but what you get with that deal pays for it and then some. In fact, let's ignore the discounts for a minute: you get 24 free PS4 games, 24 free PS3 games, and 12 Vita-only + 12 Vita-compatible games, up to 72 free games every year. Even if you only own one of these consoles, that's still 24 free games a year. Sure, maybe you get games one month that you don't like; then just wait until next month. In fact, let's look at Just Cause 3 again. It was free for PS Plus members in August, which is a pretty big deal. Why is this significant? Because it's, again, a $60 digital game. That means with this one download, you've balanced out your $60 annual fee. Meaning? Every free game after that is money saved; every discount after that is money saved. And this is a trend: every year, PS Plus will release a game that balances out the entire service cost, then another 23 more that only add icing to that budget cake. (Though you could instead count games as paying off PS Plus until you hit $60 in savings.) All in all, PS Plus, and Xbox Gold which offers similar options, saves you money. On top of that, you don't need these memberships to get discounts, but with them, you get more discounts. Now, I've seen a few Steam games go up for free for a week, but what about being free for an entire month? And even if you want to talk about Steam Summer Sales, what about the PSN summer sale, or again, disc discounts? A lot of research and math would be needed to see if every console gamer would save money compared to every Steam gamer for the same games, but at the very least? The costs balance out, at worst.

Part 3: the Systems
Xbox and PS2: $299
Xbox 360 and PS3: $299 and $499, respectively
Xbox One and PS4: $499 and $399, respectively.
Rounded up a few dollars, that's $1,000 - $1,300 in day-one consoles, just to keep up with the games! Crazy, right? So-called budget systems, such a rip-off. Well, keep in mind that the generations here aren't short. The 6th generation, from the launch of the PS2 to the launch of the next-generation consoles, lasted 5 years, or 6 based on the launch of the PS3 (though you could say it was 9 or 14, since the Xbox wasn't discontinued until 2009, and the PS2 was supported all the way to 2014, a year after the PS4 was released). The 7th gen lasted 7 - 8 years, again depending on whether you count from the launch of the Xbox 360 or the PS3. The 8th gen has lasted 4 years so far. That's 17 years that the console money is spread over. If you had a Netflix subscription at its original $8 monthly plan for that amount of time, it would total over $1,600. And let's be fair here: just like you could upgrade your PC hardware whenever you wanted, you didn't have to get a console at launch. Let's look at PlayStation again, for example: in 2002, only two years after its release, the PS2's retail price was cut from $300 to $200. The PS3 Slim, released 3 years after the original, was $300, i.e., $100 - $200 lower than the original retail cost. The PS4? You could've gotten either the Uncharted bundle for $350 or one of the PS4 Slim bundles for $250. This all brings it down to $750 - $850, which again, is spread over a decade and a half. This isn't even counting used consoles, sales, or the further price cuts that I didn't mention. Even if that still sounds like a lot of money to you, even if you're laughing at the thought of buying new systems every several years because your PC "is never obsolete," tell me: how many parts have you changed out in your PC over the years? How many GPUs have you been through? CPUs? Motherboards? RAM sticks, monitors, keyboards, mice, CPU coolers, hard drives? That adds up. You don't need to replace your entire system to spend a lot of money on hardware.
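The totals in that paragraph check out with a quick script. The launch prices are the ones quoted above, and the Netflix figure assumes the original $8/month plan over the same 17-year window:

```python
# Sanity-check the console-cost arithmetic from the paragraph above.
# Launch prices as quoted in the post.
playstation_launch = 299 + 499 + 399   # PS2 + PS3 + PS4
xbox_launch = 299 + 299 + 499          # Xbox + Xbox 360 + Xbox One
print(f"Day-one consoles: ${xbox_launch} (Xbox line) to "
      f"${playstation_launch} (PlayStation line)")   # the $1,000 - $1,300 range

# Netflix comparison over the same 17-year window, at the original $8/month plan
netflix_total = 8 * 12 * 17
print(f"Netflix for 17 years: ${netflix_total}")     # the "over $1,600" figure
```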
Even if you weren’t upgrading for the sake of upgrading, I’d be amazed if the hardware you’ve been pushing by gaming would last for even a third of that 17-year period. Computer parts aren’t designed to last forever, and really won’t when you’re pushing them with intensive gaming for hours upon hours. Generally speaking, your components might last you 6-8 years, if you’ve got the high-end stuff. But let’s assume you bought a system 17 years ago that was a beast for its time, something so powerful that even if its parts have degraded over time, it’s still going strong. Problem is: you will have to upgrade something eventually. Even if you’ve managed to get this far into the gaming realm with the same 17-year-old hardware, I’m betting you didn’t do it with a 17-year-old operating system. How much did Windows 7 cost you? Or 8.1? Or 10? Oh, and don’t think you can skirt the cost by getting a pre-built system; the cost of Windows is embedded into the cost of the machine (why else would Microsoft allow their OS to go on so many machines?). Sure, Windows 10 was a free upgrade for a year, but that’s only half of its lifetime: you can’t get it for free now, and haven’t been able to for the past year. On top of that, the free period was an upgrade; you had to pay for 7 or 8 first anyway. Point is, as much as one would like to say that they didn’t need to buy a new system every so often for the sake of gaming, that doesn’t mean they haven’t been paying for hardware, and even if they’ve only been PC gaming recently, they’ll be spending money on hardware soon enough.
“PC is leading the VR—“
Let me stop you right there. If you add together the total number of Oculus Rifts and HTC Vives sold to this day, and throw in another 100,000 just for the sake of it, that number would still be under the number of PSVR headsets sold. Why could this possibly be? Well, for a simple reason: affordability. The systems needed to run the PC headsets cost $800+, and the headsets are $500 - $600 when discounted. PSVR, on the other hand, costs $450 for the full bundle (headset, camera, and Move controllers, with a demo disc thrown in), and can be played on either a $250 - $300 console or a $400 console, the latter being recommended. Even if you want to say that the Vive and Rift are more refined, a full PSVR set, system and all, could cost just over $100 more than a Vive headset alone. If anything, PC isn’t leading the VR gaming market; the PS4 is. It’s the system bringing VR to the most consumers, showing them what the future of gaming could look like. Not to mention that as the PlayStation line grows more powerful (4.2 TFLOP PS4 Pro, 10 TFLOP “PS5…”), it won’t be long until the PlayStation line can run the same VR games as PC. Either way, this shows that there is a console equivalent to the PC VR options. Sure, there are some games you'd only be able to play on PC, but there are also some games you'd only be able to play on PSVR. …Though to be fair, if we’re talking about VR in general, these headsets don’t even hold a candle to, surprisingly, Gear VR.
“If it wasn’t for consoles holding devs back, then they would be able to make higher quality games.”
This one is based on the idea that consoles are so “low spec” that when a developer has to take them into account, they can’t design the game to be nearly as good as it would be otherwise. I mean, have you ever seen the minimum specs for games on Steam? GTA V
Actually, bump up all the memory requirements to 8 GB, and those are some decent specs, relatively speaking. And keep in mind these are the minimum specs to even open the games. It’s almost as if the devs didn’t worry about console specs when making a PC version of the game, because this version of the game isn’t on console. Or maybe even that the consoles aren’t holding the games back that much, because they’re not that weak. Just a hypothesis. But I mean, the devs are still ooobviously having to take weak consoles into account, right? They could make their games sooo much more demanding if they were PC-only, right? Right? No. Not even close. iRacing
CPU: Intel Core i3, i5, i7 or better or AMD Bulldozer or better
Memory: 8 GB RAM
GPU: NVidia GeForce 2xx series or better, 1GB+ dedicated video memory / AMD 5xxx series or better, 1GB+ dedicated video memory
These are PC-only games. That’s right, there are no consoles to hold them back; they don’t have to worry about whether an Xbox One could handle it. Yet they don’t require anything more than the multiplatform games. Subnautica
So what’s the deal? Theoretically, if developers don’t have to worry about console specs, then why aren’t they going all-out and making games that no console could even dream of supporting? Low-end PCs. What, did you think people only game on Steam if they spent at least $500 on gaming hardware? Not all PC gamers have gaming-PC specs, and if devs close their games out to players who don’t have the strongest of PCs, then they’d be losing out on a pretty sizable chunk of their potential buyers. Saying “devs having to deal with consoles is holding gaming back” is like saying “racing teams having to deal with Ford is holding GT racing back.” A: racing teams don’t have to deal with Ford if they don’t want to, which is probably why many of them don’t, and B: even though Ford doesn’t make the fastest cars overall, they still manage to make cars that are awesome on their own, they don’t even need to be compared to anything else to know that they make good cars. I want to go back to that previous point though, developers having to deal with low-end PCs, because it’s integral to the next point:
“PCs are more powerful, gaming on PC provides a better experience.”
This one isn’t so much a misconception as it is… misleading. Did you know that according to the Steam Hardware & Software Survey (July 2017), the percentage of Steam gamers who use a GPU less powerful than a PS4 Slim’s is well over 50%? Things get dismal when compared to the PS4 Pro (or Xbox One X). On top of that, the percentage of PC gamers who own an Nvidia 10 series card is about 20% (about 15% for the 1060, 1080 and 1070 owners). Now, to be fair, the large majority of gamers have CPUs with considerably high clock speeds, which is the main factor in CPU gaming performance. But the number of Steam gamers with as much RAM as a PS4 or Xbox One, or more, is less than 50%, which can really bottleneck what those CPUs can handle. These numbers are hardly better than they were in 2013, all things considered. Sure, a PS3/360 weeps in the face of even a $400 PC, but in this day and age, consoles have definitely caught up. Sure, we could mention the fact that even 1% of Steam accounts represents over 1 million accounts, but that doesn’t really matter compared to the tens of millions of 8th gen consoles sold; looking at it that way, sure, the number of Nvidia 10 series owners is over 20 million, but there are over 5 times more 8th gen consoles sold than that. Basically, even though PCs run on a spectrum, saying they're more powerful “on average” is actually wrong. Sure, they have the potential to be more powerful, but most of the time, people aren’t willing to pay the premium to reach those extra bits of performance. Now why is this important? What matters are the people who spent the premium cost for premium parts, right?
Because of the previous point: PCs don’t have some ubiquitous quality over the consoles, developers will always have to keep low-end PCs in mind, because not even half of all PC players can afford the good stuff, and you have to look at the top quarter of Steam players before you get to PS4-Pro-level specs. If every Steam player were to get a PS4 Pro, it would be an upgrade for over 60% of them, and 70% of them would be getting an upgrade with the Xbox One X. Sure, you could still make the argument that when you pay more for PC parts, you get a better experience than you could with a console. We can argue all day about budget PCs, but a console can’t match up to a $1,000 PC build. It’s the same as paying more for car parts, in the end you get a better car. However, there is a certain problem with that…
“You pay a little more for a PC, you get much more quality.”
The idea here is that the more you pay for PC parts, the performance increases at a faster rate than the price does. Problem: that’s not how technology works. Paying twice as much doesn’t get you twice the quality the majority of the time. For example, let’s look at graphics cards, specifically the GeForce 10 series cards, starting with the GTX 1050.
1.35 GHz base clock
2 GB VRAM
This is our reference, our basis of comparison. Any percentages will be based on the 1050’s specs. Now let’s look at the GTX 1050 Ti, the 1050’s older brother.
1.29 GHz base clock
4 GB VRAM
This is pretty good. You only increase the price by about 27%, and you get an 11% increase in floating point speed and a 100% increase (double) in VRAM. Sure, you get a slightly lower base clock, but the rest definitely makes up for it. In fact, according to GPU Boss, the Ti managed 66 fps, a 22% increase in frame rate, in Battlefield 4, and a 54% increase in mHash/second in bitcoin mining. The cost increase is worth it, for the most part. But let’s get to the real meat of it: what happens when we double our budget? Surely we should see a massive increase in performance; I bet some of you are willing to wager that twice the cost means more than twice the performance. The closest price comparison for double the cost is the GTX 1060 (3 GB), so let’s get a look at that.
1.5 GHz base clock
3 GB VRAM
Well… not substantial, I’d say. About a 50% increase in floating point speed, an 11% increase in base clock speed, and a 1 GB decrease in VRAM. For [almost] doubling the price, you don’t get much. Well, surely raw specs don’t tell the full story, right? Well, let’s look at some real-world comparisons. Once again, according to GPU Boss, there’s a 138% increase in hashes/second for bitcoin mining, and at 99 fps, an 83% frame rate increase in Battlefield 4. Well then, raw specs do not tell the whole story! Here’s another one, the 1060’s big brother… or, well, slightly-more-developed twin.
1.5 GHz base clock
6 GB VRAM
Seems reasonable, another $50 for a decent jump in power and double the memory! But, as we’ve learned, we shouldn’t look at the specs for the full story. I did do a GPU Boss comparison, but for the BF4 frame rate, I had to look at Tom’s Hardware (sorry miners, GPU Boss didn’t cover the mHash/sec spec either). What’s the verdict? Well, pretty good, I’d say. With 97 FPS, a 79% increase over the 1050… wait. 97? That seems too low… I mean, the 3GB version got 99. Well, let’s see what Tech Power Up has to say... 94.3 fps. A 74% increase. Huh. Alright, alright, maybe that was just a dud. We can gloss over that, I guess. Ok, one more, but let’s go for the big fish: the GTX 1080.
1.6 GHz base clock
8 GB VRAM
That jump in floating point speed definitely has to be something, and 4 times the VRAM? Sure, it’s 5 times the price, but as we saw, raw power doesn’t always tell the full story. GPU Boss returns to give us the rundown: how do these cards compare in the real world? Well… a 222% (over three-fold) increase in mHash speed, and a 218% increase in FPS for Battlefield 4. That’s right, for 5 times the cost, you get 3 times the performance. Truly, the raw specs don’t tell the full story. You increase the cost by 27%, you increase the frame rate in our example game by 22%. You increase the cost by 83%, you increase the frame rate by 83%. Sounds good, but increase the cost by 129% and you get only a 79% increase in frame rate (-50% cost/power increase). Increase it by 358%, and you increase the frame rate by 218% (-140% cost/power increase). That’s not paying “more for much more power,” that’s a steep drop-off after the third cheapest option. In fact, did you know that you have to get to the 1060 (6GB) before you can compare the GTX line to a PS4 Pro? Not to mention that at $250, the price of a 1060 (6GB), you could get an entire PS4 Slim bundle, or that you have to get to the 1070 before you beat the Xbox One X. On another note, let’s look at a PS4 Slim…
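To make the drop-off easier to see, here is a quick sketch of the cost-versus-frame-rate arithmetic used above. The Battlefield 4 fps numbers are the ones quoted in this section; the prices are assumed round figures (a ~$110 GTX 1050 as the baseline), so the percentages land close to, but not exactly on, the ones in the text:

```python
# (price in USD, Battlefield 4 fps) - fps figures quoted above,
# prices are assumed round numbers for illustration only.
cards = {
    "GTX 1050":     (110,  54),   # baseline
    "GTX 1050 Ti":  (140,  66),
    "GTX 1060 3GB": (200,  99),
    "GTX 1060 6GB": (250,  97),   # Tom's Hardware figure
    "GTX 1080":     (500, 172),
}

base_price, base_fps = cards["GTX 1050"]
for name, (price, fps) in cards.items():
    cost_up = (price / base_price - 1) * 100   # % price increase over 1050
    fps_up = (fps / base_fps - 1) * 100        # % frame rate increase over 1050
    print(f"{name}: +{cost_up:.0f}% cost, +{fps_up:.0f}% fps, "
          f"{fps / price:.2f} fps/$")
```

The fps-per-dollar column is the tell: it holds steady through the 1060 (3 GB) and then falls away, which is the “steep drop-off after the third cheapest option” described above.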
800 MHz base clock
8 GB VRAM
…Versus a PS4 Pro.
911 MHz base clock
8 GB VRAM
A 128% increase in floating point speed and a 13% increase in clock speed, for a 25% difference in cost. Unfortunately there is no Battlefield 4 comparison to make, but in BF1, the frame rate is doubled (30 fps to 60) and the textures are taken to 11. For what that looks like, I’ll leave it up to this bloke. Not to even mention that you can get the texture buffs in 4K. Just like how you get a decent increase in performance per dollar with the lower-cost GPUs, the same applies here. It’s even worse when you look at the CPU for a gaming PC. The more money you spend, again, the less of a benefit you get per dollar. Hardware Unboxed covers this in a video comparing different levels of Intel CPUs. One thing to note is that the highest i7 option (6700K) in this video was almost always within 10 FPS (though for a few games, 15 FPS) of a certain CPU in that list. …That CPU was the lowest i3 (6100) option. The lowest i3 was $117 and the highest i7 was $339, a 189% price difference for what was, on average, a 30% or less difference in frame rate. Even the lowest Pentium option (G4400, $63) was often able to keep up with the i7. The CPU and GPU are usually the most expensive and power-consuming parts of a build, which is why I focused on them (other than the fact that they’re the two most important parts of a gaming PC, outside of RAM). With both, this “pay more to get much more performance” idea is pretty much the inverse of the truth.
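The CPU side of that argument works out the same way. A minimal sketch, using the two prices quoted above and the rough 30% average frame-rate gap (an estimate from the video, not a precise benchmark):

```python
# i3-6100 vs i7-6700K, figures quoted above (Hardware Unboxed comparison).
i3_price, i7_price = 117, 339
fps_gap_pct = 30  # the post's rough "30% or less" average, not a measured value

price_gap_pct = (i7_price / i3_price - 1) * 100
print(round(price_gap_pct))  # 190 (the post rounds this down to 189%)

# Extra dollars paid per percentage point of extra frame rate:
print(round((i7_price - i3_price) / fps_gap_pct, 2))  # 7.4
```

So even granting the i7 the full 30% gap, you pay nearly triple the price to get it; the returns per dollar shrink just as they did with the GPUs.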
“The console giants are bad for game developers, Steam doesn't treat developers as bad as Microsoft or especially Sony.”
Now, one thing you might’ve heard is that the PS3 was incredibly difficult for developers to make games for, which for some fueled the idea that console hardware is difficult to develop for compared to PC… but this ignores a very basic idea that we’ve already touched on: if the devs don’t want to make the game compatible with a system, they don’t have to. In fact, this is why Left 4 Dead and other Valve games aren’t on PS3: they didn’t want to work with its hardware, calling it “too complex.” This didn’t stop the game from selling well over 10 million units worldwide. If anything, this was a problem for the PS3, not the dev team. This also ignores that games like LittleBigPlanet, Grand Theft Auto IV, and Metal Gear Solid 4 all came out in the same year as Left 4 Dead (2008) on PS3. Apparently, plenty of other dev teams didn’t have much of a problem with the PS3’s hardware, or at the very least, they got used to it soon enough. On top of that, when developing the 8th gen consoles, both Sony and Microsoft sought to use CPUs that were easier for developers, which included making decisions that considered the consoles’ usage for more than gaming. Using their single-chip proprietary CPUs is also cheaper and more energy efficient than buying pre-made CPUs and boards, which is a far better reason for using them than some conspiracy about Sony and MS trying to make devs' lives harder. Now, console exclusives are apparently a point of contention: it’s often said that exclusives can cause developers to go bankrupt. However, exclusivity doesn’t have to be a bad thing for the developer. For example, when Media Molecule had to pitch their game to a publisher (Sony, coincidentally), they didn’t end up being tied into something detrimental to them. Their initial funding lasted for 6 months. From then, Sony offered additional funding in exchange for console exclusivity.
This may sound concerning to some, but the game ended up going on to sell almost 6 million units worldwide and launched Media Molecule into the gaming limelight. Sony later bought the development studio, but 1: this was in 2010, two years after LittleBigPlanet’s release, and 2: Media Molecule seem pretty happy about it to this day. If anything, signing up with Sony was one of the best things they could’ve done, in their opinion. Does this sound like a company that has it out for developers? There are plenty of examples that people will use to put Valve in a good light, but even Sony is comparatively good to developers.
“There are more PC gamers.”
The total number of active PC gamers on Steam has surpassed 120 million, which is impressive, especially considering that this number is double 2013’s figure (65 million). But the number of monthly active users on Xbox Live and PSN? About 120 million (1, 2) total. EDIT: You could argue that this isn't an apples-to-apples comparison, sure, so if you want to, say, compare the monthly number of Steam users to console? Steam has about half of what consoles do, at 67 million. Now, back to the 65 million total user figure for Steam: the best reference I could find for PlayStation's number was an article giving the number of registered PSN accounts in 2013, 150 million. In a similar 4-year period (2009 - 2013), the number of registered PSN accounts didn’t double, it sextupled, increasing 6-fold. Considering how the PS4 is already at 2/3 of the number of sales the PS3 had, even though it’s currently 3 years younger than its predecessor, I’m sure this trend is at least generally consistent. For example, let’s look at DOOM 2016, an awesome fast-paced shooting title with graphics galore… Of course, on a single platform, it sold best on PC/Steam: 2.36 million Steam sales, 2.05 million PS4 sales, 1.01 million Xbox One sales. But keep in mind… when you add the console sales together, you get over 3 million sales on the 8th gen systems. Meaning: this game sold best on console. In fact, the Steam sales have only recently surpassed the PS4 sales. By the way, VGChartz only shows sales for physical copies of the games, so the number of PS4 and Xbox sales, when digital sales are included, is even higher than 3 million. This isn’t uncommon, by the way. Even with games where the PC sales are higher than either of the consoles, there generally are more console sales in total. But, to be fair, this isn’t anything new. The number of PC gamers hasn’t dominated the market; the percentages have always been about this much.
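The DOOM numbers make the "largest single platform vs. most copies overall" distinction concrete (physical-copy figures quoted above, in millions):

```python
# DOOM (2016) physical sales in millions, as quoted above.
steam, ps4, xbox_one = 2.36, 2.05, 1.01

console_total = round(ps4 + xbox_one, 2)
print(console_total)           # 3.06 -> "over 3 million" on the 8th gen systems

print(steam > ps4)             # True: Steam is the largest single platform
print(console_total > steam)   # True: but consoles sold more copies overall
```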
PC can end up being the largest single platform for games, but consoles usually sell more copies total. EDIT: There were other examples but... Reddit has a 40,000-character limit.
This isn’t to say that there’s anything wrong with PC gaming, and this isn’t to exalt consoles. I’m not here to be the hipster defending the little guy, nor to put down someone or something out of spite. This is about showing that PCs and consoles are overall pretty similar, because there isn’t much dividing them, and that there isn’t anything wrong with being a console gamer. There isn’t some chasm separating consoles and PCs; at the end of the day, they’re both computers that are (generally) designed for gaming. This is about unity as gamers, to try to show that there shouldn’t be a massive divide just because of the computer system you game on. I want gamers to be in an environment where specs don't separate us; whether you got a $250 PS4 Slim or just built a $2,500 gaming PC, we’re here to game and should be able to have healthy interactions regardless of platform. I’m well aware that this isn’t going to fix… much, but this needs to be said: there isn’t a huge divide between PCs and consoles; they’re far more similar than people think. Each side has upsides and downsides that the other doesn’t. There’s so much more I could touch on, like how you can use SSDs or 3.5-inch hard drives with both, or that even though PC part prices go down over time, so do console prices, but I just wanted to touch on the main points people use to needlessly separate the two kinds of systems (looking at you, PCMR) and correct them, to get the point across. I thank anyone who takes the time to read all of this, and especially anyone who doesn’t take what I say out of context. I also want to note that, again, this isn’t “anti-PC gamer.” If it were up to me, everyone would be a hybrid gamer. Cheers.
Upgrading from an FX-8350 to a R7-1700. Just a bit about me: I have been building computers since the mid-'80s. I missed the 8-inch floppy disk era, but came on board when dual 5.25” drives were considered mainstream and a 10-megabyte full-height HDD was the mark of a power user. The first computer I built for my own enjoyment was an AMD X5-133 (a factory-overclocked 486 faster than the Pentium-75), and I’ve used a wide variety of systems since then, including a Pentium Pro-200 which served me well in college and a K6-2 which I took to quite a few LAN parties. While I’ve always had Intel notebooks, my PCs have been AMD for quite some time now. I decided to upgrade my current main machine, which is an FX-8350 with a mild 4.4Ghz overclock. I was using 2x8GB Crucial Ballistix DDR3-1600 and a Sapphire Radeon Fury Nitro. While I know the R5-1600X would be a better bet for a pure gaming build, I have a soft spot for 8-core machines. I had been tempted to pull the trigger on an i7-7700K for a while, but the timing never worked out. But when I found the R7-1700 at a deep discount and an X370 motherboard on the shelf next to it, I couldn’t resist the siren call of a new build. Here are my thoughts about the process: AM4 is physically the same as AM3 from a build perspective, except for the mounting holes. I don’t know what was so important about making the holes have different offsets, but this makes it much more difficult to get quality cooling. Not all manufacturers have brackets yet, and I’m still waiting on Cooler Master to release the brackets for my Seidon 240. The new motherboard feels very different from my AM3 board. My FX-8350 sat on an ASUS M5A99FX Pro R2.0. It was, for lack of a better word, a very workstation-ish board: 4 PCIe x16 slots, 10 USB ports (2 of them USB 3.0), triple USB 2.0 front panel headers (and a USB 3.0 front panel header as well), eSATA on the rear panel, beefy VRM and northbridge cooling, Toslink output for audio, and so on.
The board itself is full of tiny components, support chips, and ports. Granted, many of these connectors are outdated (eSATA and USB 2.0), and the PCIe is only 2.0 instead of current-gen 3.0, but there is a LOT of connectivity. Few people paired an FX chip with a triple- or quad-GPU setup for gaming, but I know a fair number of people used these boards for bitcoin mining back before there was widespread ASIC support, when GPU mining was the most cost-effective way to mint cryptocurrency. Extra PCIe slots could be used for dedicated video capture, PCIe-based storage, a RAID card, etc. Having 4 full-size slots allows this kind of flexibility. The new motherboard is an ASRock Fatal1ty X370 Gaming K4. It does not feel very workstation-ish at all. It has only two x16 PCIe slots (and when they are both in use, they run at x8), 8 USB ports on the rear panel, and a much less “busy” motherboard. Very few support chips litter its surface. Instead of a workstation component, it feels much more like a luxury consumer product. This is not a bad thing, just something I noticed while building the system. The rear IO shield is red and black to match its gaming aesthetic, it includes things like premium audio (including a very nice headphone amplifier for the front panel connectors), and while it only has 8 USB ports on the back, 6 of them are USB 3.0 and two of them (including a Type-C connector) are USB 3.1 Gen2. It includes RGB LEDs under the chipset heatsink and three separate RGB LED controller ports (one of which is used for the boxed cooler), Intel gigabit Ethernet, and dual M.2 slots (one of which is connected directly to the CPU). It is very different in “feel” from the older ASUS board, even down to things like a shroud for the external connectors and metal-reinforced PCIe slots. I must say, its more aggressive appearance and near-empty areas appeal to me.
It does, however, funnel the builder into a particular configuration: limited fast storage through the M.2 slots, slow(er) storage through the 6 SATA ports, and all external devices on USB 3. Personally, these limitations didn’t restrict me for this build, since that was how I was going to set it up anyway, but the fewer connectivity choices might give some pause. The only thing I don’t like about this board is the 20-second POST times. 20 seconds, every time. Resuming from sleep is very fast; just reboots are slow. That’s really it. I have no substantive complaints other than that, aside from the memory speed limitations; more on that below. The Wraith Spire is without doubt the best-looking boxed cooler I’ve ever seen. The symmetrical cylinder look, combined with the LED logo and RGB ring, is very striking. I can see why many people have asked to order one, though I think for the 1700X and 1800X they are better off without it. I’ll explain why further down. Initial hardware setup was very easy. I was able to flash to the newest 2.0 BIOS without any hassle using a DOS USB flash boot drive. The 2.0 BIOS has the newest AGESA code from AMD, as well as support for the R5 processors and better DDR4 compatibility. I didn’t want to cheap out on RAM, since Ryzen is apparently sensitive to DDR4 speeds for the latency between cores. I bought the cheapest 16GB DDR4-3200 kit I could find (the EVGA SuperSC 2x8GB), for which I paid $115. While I was not able to get it to boot at 3200, I could get 2933 simply by activating XMP, then manually changing the speed from 3200 to 3000. I then tested it with MemTest86 for two complete cycles, which it passed without errors. I have encountered zero memory issues with these RAM sticks running at 2933. Since this motherboard does not officially support DDR4-3200 at all, I figure this is a good outcome.
I am curious to know whether anyone has gotten 3200 on this board (that is, whether the lack of 3200 memory on ASRock’s QVL is a marketing issue or an actual hardware limitation), but I didn’t want to spend nearly double that amount to get AM4-verified memory (G.Skill’s Flare X), and 2933 seemed fast enough from the benchmark results I had read. My old setup had a Samsung 850 EVO 256GB SATA6 drive as the primary boot/gaming drive. It seemed plenty fast, but it had become too small for my needs, so this seemed like a good opportunity to buy a new SSD. I originally thought the NVMe drives would be out of my price range, but I bought the Intel 600p 512GB drive for only $10 more than I would have paid for a premium SATA6 drive. Though the 600p is without doubt the SLOWEST NVMe drive out there, it has 3x the read speed of the SATA6 drives, and most of what I am doing with it is trying to get quicker load times. If I were using it for professional workloads (as a video editing scratch drive, for example), I would need much higher sustained write speeds, and then Samsung would be the obvious answer. I just didn’t want to spend an extra $80 on write performance that I’d never notice, and the 600p has been an excellent boot/gaming drive. Ok, back to the Wraith Spire. I tend to have bad luck with the silicon lottery. My FX-8350 was not stable above 4.4Ghz with reasonable temperatures. I was hoping I would get better results from the R7-1700, since general reports indicated that it overclocked well. Unfortunately, it is difficult to tell how good of an overclock I am getting, since I can find no good information about maximum recommended temperatures for this chip. Some people say 75c is the maximum safe temp. Others say 75c is a fine everyday 24/7 temp. Others say they are running it at 80c all the time without any issues at all.
Steve at Techspot was getting 88c and 90c when overclocking the 1600X and 1500X using the stock coolers and without any instability – were those dangerous temps or totally fine? Nobody seems to know. I like my overclocks to be set-and-forget. I want to get it dialed in and then leave it for years without worrying that it will burn up or degrade or that in this or that application I have to turn back to stock speeds because of the thermals. Since I don’t know what max safe thermals are, I just have to guess based on stock thermals. For stock speeds, the Wraith Spire does a good job. It is very quiet, and after a few BIOS fan-curve tweaks, it keeps the chip around 35-38 at idle, and around 68-70 on Prime95 (Small FFT, for maximum temperature generation). Incidentally, it also hits 70 if I run Cinebench a bunch of times in a row as well, so I don’t consider the Small FFT test to be totally unrealistic for the load this chip might encounter. From what I can tell, these are good normal temps. I can get 3.5Ghz by simply changing the multiplier and leaving the voltage at stock. This gives Cinebench numbers around the 1550 mark (roughly 6900k levels). Prime95 shows a modest boost in temperatures of 3-4 degrees C, and was stable even for several hours. If I push it to 3.6Ghz at stock voltage the system is unstable. At 3.7Ghz (the 1700’s boost speed for single-threaded loads) it is stable only if I give it 1.3v. While that is a totally fine voltage (AMD recommends up to 1.35v for 24/7), the Wraith Spire cannot handle a Prime95 Small FFT load anymore. I shut down the test and reverted the OC when the CPU read 89c. Given the fact that the Spire was meant to cool a 65w chip (and so probably is rated at no more than 85-95w), this is not a terribly surprising temperature – I wish I knew if it was dangerous. I have no doubt that a 240mm radiator or even a decent tower cooler will be more than enough to cool down my 3.7Ghz R7-1700. 
I am a little jealous of the people who just set the multiplier to 3700 and are good to go – lower voltages probably mean the Spire would be enough. But for me, it was not to be. I was halfway tempted to see at what temperature the chip would reduce its clock speed, but I didn’t want to burn up a chip I had just bought – might as well wait until I get bigger and better cooling to OC it to the 3.8-3.9 I hope it will reach. Other than the OC temps it has been smooth sailing. Gaming feels more fluid than with the FX, even in games that I always thought were GPU-limited and/or running at 60fps with VSYNC on. Especially games that are sensitive to single-core performance (Heroes of the Storm is my latest addiction) there is a definite boost in 1% low and 0.1% low FPS. I have been using the Ryzen Balanced power plan from AMD and it seems to do a fantastic job keeping temps low when idle and letting the cores ramp up really fast when needed. I need to test whether the lack of core parking prevents it from hitting the 3.7Ghz boost as much as the regular Balanced plan allows. I think a simple CineBench single-thread comparison will do the trick. I also tried streaming a bit – and it was able to generate 1080p60fps at x264-medium settings without being noticeable while in game. Later I edited some video of my kids – the final render speed was SOOOO fast. I am, on the whole, very happy with my upgrade. I get better single-core performance, much much better multi-core performance, along with faster disk speeds, and a more modern platform (with RGB lighting, M.2, USB 3.1, etc…). Now if only I could find out appropriate temperatures…..
Please keep in mind that this will be an evolving FAQ and a living document as we progress through our testnet, so check back here often for updates.
What is the objective of the Skywire Public Testnet?
There are several goals we will accomplish with the Skywire Public Testnet, which will be split into several phases. The version running today is the internal version of our testnet, aimed at validating its function and performance. The coming revision will open the network to the public, as well as establish a fair reward mechanism for running nodes, based upon analysis of node utilization during testing. This testing will provide valuable information for designing a robust mathematical model for the mainnet, so that all nodes on the mainnet will be automatically incentivized under a fair economic model.
What is the function of this version?
Once a user sets up an operational node, they will be able to search for other nodes and connect to users around the world, breaking down borders and barriers to global information. Note that any computer can become a node on the network; however, only whitelisted Skyminers (all Official and selected DIY) will participate in the economic model testing program and be eligible for rewards.
What kind of Skyminers will be whitelisted for the Testnet?
There are three main categories of Skyminers:
Official Skywire Miner
DIY Equivalent Skyminers
Other nodes and hardware
The initial whitelist will include the Official Skyminers that have shipped to users around the globe. These will become the baseline for early DIY Skyminers. Since we are entering uncharted skies, we initially want to reduce variables where possible and test the network in a controlled manner. We have already been scaling out to include high-quality DIY Skyminers with equivalent specifications, and eventually any Skyminer (official or DIY) that meets the minimum required specifications. Those minimum specifications will be determined during the testnet and released to the Skyfleet community as they become available, so stay tuned.
What will the whitelisting process be like?
First, the Skywire core team will collect the public keys for each node within each Official Skyminer; since there are 8 nodes in a Skyminer, each miner will have 8 public keys. Official Skyminers will be whitelisted once their public keys are provided to the Skywire core team. Here is a link to the whitelist submission page. DIY Skyminers will be reviewed manually and approved weekly (approximately 50 per week) once the whitelisting process for the Official Skyminers is complete.
Can DIY Skyminers join the whitelist?
While Official Skyminers will be on the whitelist by default (upon submission and receipt of their public keys), DIY Skyminers will be admitted to the whitelist based on the benchmark set by the Official Skyminer's hardware configuration. DIY Skyminers will be required to submit detailed specifications and photos to the corresponding team for review; qualified DIY Skyminers will then be added to the testnet whitelist. Please remember that only selected DIY miners will be whitelisted. You may refer to the Skywire community on Telegram or the community Skywug forum for more discussion around this topic. The first-generation Official Skyminer hardware configuration is as follows:
8 hardware nodes made up of 8 Orange Pi Prime PCB boards
8+1 100Mbps router (custom 16-port OpenWRT in production)
16GB RAM (8 x 2GB DDR3)
ARM Cortex-A53 CPU
Hexa-core Mali450 GPU
LAN Bandwidth: 8 x 1000Mbps
64-bit Linux (Alpine Linux)
What kind of hardware will be able to participate in the Testnet?
Any computer can be added to the Skywire Public Testnet, set up as a node, and use the functions of Skywire. However, only a limited number of machines will be whitelisted (Official Skyminers and some DIY Skyminers, as noted above) and receive rewards during the testing stage. Machines not on the whitelist can still participate in the network and access its full service; however, they will not receive rewards.
Is a dedicated router part of the required spec? For example, if someone builds a miner that meets spec with 8 nodes and a switch, but just has it connected directly to their home/ISP Router will they be whitelisted?
They could be whitelisted. The Official miner is just a benchmark; a DIY Skyminer doesn't require the exact same setup as the benchmark.
Is there any difference between Official Skyminers and DIY Skyminers?
In the current testnet, only ONE miner is allowed to be whitelisted per IP address. In the future, when rewards are proportional to the bandwidth you produce for the network, you may run as many miners as you want behind a single IP address, since any reward will be tied directly to the bandwidth you produce.
How do I set up the software for my Official Skyminer?
Those who have previously installed a version of Skywire can update directly in the software. (Note: if the update fails, please reinstall by following the instructions on Github: https://github.com/skycoin/skywire). Please remember that only whitelisted miners will receive rewards at this stage. However, you can still access the same VPN functions with Skywire along with everyone else!
What do we do after installation?
It is simple! All you have to do is keep the node online so that other Skywire nodes can connect to yours, as we perform network tests and do all the heavy lifting from our end. Grab a drink, sit back, relax and enjoy using the new internet :).
What will be the reward mechanism for running nodes?
At the moment, whitelisted miners require a minimum of 75% uptime per month to receive that month's rewards. The reward ratio will be set carefully going forward: we first need a rigorous dataset as a point of reference, and will adjust the rate continuously as the economic model gets established. It is important to note that in the current testnet, Skycoin is rewarded independent of the bandwidth your miner produces. In the future, Coin Hours will be earned instead of Skycoin, depending on the bandwidth you provide to the Skywire network.
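As a concrete illustration, the threshold rule above can be sketched in a few lines. This is a hypothetical sketch, not Skywire's actual code; the function name and hour counts are made up for the example.

```python
# Illustrative sketch of the testnet eligibility rule: a whitelisted miner
# needs at least 75% uptime in a month to receive that month's flat reward.
# All names and figures here are assumptions, not the Skywire implementation.

UPTIME_THRESHOLD = 0.75

def monthly_reward(hours_online: float, hours_in_month: float, flat_reward: float) -> float:
    """Return the reward for one miner for one month (0 if below the threshold)."""
    uptime = hours_online / hours_in_month
    return flat_reward if uptime >= UPTIME_THRESHOLD else 0.0

# A miner online 600 of 720 hours (~83% uptime) qualifies:
print(monthly_reward(600, 720, 96))   # 96
# One online only 500 hours (~69%) gets nothing that month:
print(monthly_reward(500, 720, 96))   # 0.0
```

Note the rule as stated is all-or-nothing per month: there is no partial payout for, say, 70% uptime.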
Do we get extra rewards for maintaining >75% uptime?
No, you will not. Everyone who maintains >75% uptime will receive the same rewards.
What are the mining rewards?
For the first month, the rewards were set at 96 SKY per official miner, and 6 SKY per node up to 48 SKY for DIY miners. As we work out the optimal economics to incentivise a global meshnet of hardware infrastructure for this new internet, this number will change with the growth of our network.
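The first-month numbers above imply a simple per-node formula for DIY miners. A hedged sketch follows; the function name and constants are illustrative, not taken from the Skywire codebase:

```python
# First-month testnet rewards as quoted above: a flat 96 SKY for an official
# miner, and 6 SKY per node for DIY miners, capped at 48 SKY (so at most
# 8 nodes count toward the reward). Purely illustrative arithmetic.

OFFICIAL_REWARD = 96
DIY_PER_NODE = 6
DIY_CAP = 48

def diy_reward(node_count: int) -> int:
    """SKY reward for a DIY miner with the given number of whitelisted nodes."""
    return min(node_count * DIY_PER_NODE, DIY_CAP)

print(diy_reward(4))   # 24 SKY for a 4-node DIY miner
print(diy_reward(12))  # 48 SKY: the cap applies even with extra nodes
```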
Does Skyminer mine Coin Hours?
Coin Hours are a separate currency produced by Skycoin: each Skycoin produces 1 Coin Hour every hour. Coin Hours will be used in the future to pay for transactions in the Skycoin economy, such as Skywire and Kitty Cash accessories. Once we move onto the mainnet, Skyminers will produce Coin Hours instead.
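The 1-Coin-Hour-per-Skycoin-per-hour rule is simple enough to express directly. A purely illustrative sketch; the function is hypothetical, not part of any Skycoin API:

```python
# Coin Hour accrual as described above: each Skycoin held produces
# 1 Coin Hour per hour. Hypothetical helper for illustration only.

def coin_hours_earned(skycoin_balance: float, hours_held: float) -> float:
    """Coin Hours accrued by holding a balance for a number of hours."""
    return skycoin_balance * hours_held

# Holding 10 SKY for a full week (168 hours):
print(coin_hours_earned(10, 168))  # 1680
```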
How often does the whitelist uptime monitoring refresh?
It refreshes at the beginning of every month.
Do we need to be whitelisted once we move to the mainnet?
No. Anyone is free to join and start earning by contributing valuable resources (bandwidth, storage and computation) once we are on the mainnet.
When I visit 192.168.0.1 on my browser, I see my own home's router instead of the miner's router. How do I fix this?
There is a conflict in the LAN configuration between your home router and the Skyminer's router. You need to change the miner's router to another IP address, or reconfigure the miner's router to act as a switch within your own network.
Is Skycoin node the same as Skywire node?
No. They are completely separate and independent decentralised networks. A Skywire node is responsible for sending, receiving and relaying data; a Skycoin node only validates transactions and the state of the blockchain, much like a Bitcoin node.
Do I need to leave my computer on once I have configured my Skyminer properly?
No, you do not. As long as your home internet modem is still turned on and providing bandwidth to the router and CPU boards of your Skyminer, your own personal computer doesn't need to be left on. That's the idea of having an independent, purpose-built Skyminer.
My Skyminer was turned off because my power went out, will I need to re-register on the whitelist and lose my rewards?
No, you will not. Installing Skywire on a Pi is like installing a program on your personal computer: you don't need to reinstall the program every time you turn off your computer.
Will changing the router affect my node's public key?
No, it will not. The public key is attached to the software installed on the Pi. If you reinstall the program (e.g. by reflashing), you will get a new set of public keys. If you have reflashed after submitting your public keys to the whitelist, you must contact us to update your public keys, otherwise you will not receive your rewards.
I can't log on to the router of my official Skyminer.
Ensure all the networking cables are attached correctly, turn off any WiFi and VPN on your computer, and ping 192.168.0.1 via the Windows or Mac command prompt. If you are able to ping your router, then please wait 2 minutes and try again. For more help, contact our community managers on the Skywire Telegram.
I have set my Skyminer router's address on the WLAN as 192.168.2.104. However, when I try to access 192.168.2.104:8000 from another computer, I am unable to connect. Why is that?
Port 8000 is the address of the Manager node's board; it isn't part of the router's address. If you haven't set up port forwarding correctly, you will not be able to reach the miner when connecting from another LAN (outside the Skyminer's own LAN). If you are uncertain about how to set up port forwarding rules, contact our helpful Skywire community managers for more support.
What cables do I need for the Miner's WAN port?
The same as your home's internet cable. We recommend CAT 5e or better cables.
I successfully used Skywire to access the internet using the bandwidth provided by another peer. However, after a while I was unable to continue using the connection and had to reconnect. Why is that?
This is normal. If there is a period of bandwidth inactivity, the connection will disconnect automatically. Another possibility is that the peer you connected to is providing a sub-par connection.
How do I know what my mining nodes' IP addresses are?
Enter the Skywire Manager interface on the Manager node, open 'Terminal', and run "ifconfig" at the prompt. It will return the IP of the exact node you are accessing.
The green lights on all 8 boards are lit. Why do I only see 7 nodes on my Skywire Manager interface?
First check whether all 8 lights are lit on the miner's router; if not, check that all connections are properly seated. Then use the "ifconfig" method mentioned above to identify exactly which board is not connected. Restart it, reflash that SD card, and try again.
If someone else has connected to my node, can I be hacked by them?
No, they can't. Due to restrictions in the software, peers are not able to visit anywhere outside of the node they are connected to.
Where can we learn more about Skywire and join the discussions?
Hello to all and welcome to my ICO review. For those of you who don't know me yet, let me introduce myself. My name is Funke and I'm an experienced writer who enjoys what he does very much. I'm here to give you all the information you might need or want about the latest smart mobile mining. I plan to make my reviews easy to read and to the point, because what's important to me is that we all comprehend and absorb the information and actually want to be here learning together about something new. Today, let us all learn about MIB Coin. If you don't know anything about them, don't run away; that's why I'm here, to change that within the next few minutes.

Before I begin, I just want to quickly mention that I will be including LINKS so that YOU can do your own research. I feel it's best to have all of the information in one place, in an organised manner, rather than dotted about the page making what you want hard to find. All these links will be put at the BOTTOM of this page to keep things easy.

MIB Coin is a new SmartX blockchain platform that gives everybody the ability to get involved with mining through smartphones, and makes it easier for people to connect with businesses. MIB has focused on maintaining an agile and sustainable blockchain network for the long term, expanding features like a robust platform, and distributing and maintaining nodes at low cost. With 'Smart Mining', it claims to cut power consumption by a factor of roughly 150 compared to existing methods, addressing the high power consumption that is a critical issue in mining. Therefore, it reduces the cost of maintaining the blockchain as well as the social cost.
I am going to get straight to the point of who they are, since this is such an important factor; as I like to put it, "the brains behind MIB Coin". If the team behind a product or service lacks the resources or knowledge, then the product or service delivered will probably not have the quality it should. We need to know two very important things when we look at a new start-up: who are the people behind it, and what experience and knowledge do they possess? Because I've already done all the research needed beforehand and thoroughly reviewed MIB Coin, I can very happily confirm that this company has a fabulous team on board.

MIB Coin aims to grow into a real cryptocurrency used by people around the world as a means of exchange, payment, saving, value evaluation and hard-currency communication. MIB coins use a mobile-based platform, which requires only the minimal amount of power consumed in everyday life; MIB coins only need the power of mobile CPUs, compared to the vast amounts used by Bitcoin. The SmartX blockchain platform is expected to become a globalized cryptocurrency that remarkably reduces blockchain maintenance costs by applying a lightweight mobile-based hash algorithm aimed at convenience, economy and popularity. This will not be a simple platform change but a change of the cryptocurrency ecosystem.

MIB SmartX Blockchain Platform Structure:
It is not a dedicated mining machine (ASIC or GPU); mining happens on the smartphone.
Mining that was exclusive to certain countries and companies is now available to all. Everyone can participate.
An eco-friendly, low-power mining method solves excessive power consumption issues.
It keeps costs minimal compared to existing high-cost blockchain networks.
In addition, a variety of tokens will be created on the SmartX Blockchain Platform.

HOW MIB COIN IS MINED
The MIB Coin (www.mibcoin.io) is designed to be mined specifically on mobile devices and cannot be mined by powerful mining machines. Mining MIB Coin requires 99.24% less processing power than traditional miners, uses very little electricity, and reduces the total cost of the process in a way never seen before. This is simply fantastic because it goes beyond cost reduction by being totally eco-friendly and sustainable. Mining will be accessible to everyone, from anywhere with an internet connection, on a smartphone: just download the app from the app store, register, and start mining. It's fast, with no need to know programming or handle complicated settings; even a 5-year-old could do it. The process is completely safe and involves no risk of damage to devices.

TOKEN AND ICO DETAILS
Token name: MIB
Platform: SmartX Blockchain
Token price: 1 ETH = 1200 MIB
Total supply: 600,000,000 MIB
Public ICO: Jul 20, 2018 ~ Aug 18, 2018

MIB ALLOCATION
50% of MIB Coin is allocated to Smart Mining and is easily available to anyone. 27.51% of MIB Coin is used for the maintenance and management of MIB's network ecosystem: 11.67% for Reserve and 15.84% for Extra Marketing. 22.49% of MIB Coin is distributed to MIB Pre-Sales, the ICO, and stakeholders: 5.83% for Pre-Sales, 8.33% for the ICO, 3.33% for investors, 3.33% for advisors, and 1.67% for the team.

USE OF PROCEEDS
35% will be kept in capital reserves.
35% will be used for technology development of the SmartX Blockchain, Smart Contract, DAPP, token platform, connected platform and security.
10% will be used for the operational costs of marketing, accounting, legal and regulatory purposes.
10% will be used for MIB's global marketing, social media and branding.
10% will be used to establish strategic partners, offices, and business development around the world.

ROADMAP AND DEVELOPMENT PERSPECTIVE

TEAM

In conclusion, anyone can mine with a smartphone: benchmark your smartphone and allocate a hash rate. MIB is designed so that your smartphone can withstand the computational load required for mining, protecting it from overheating and hardware damage. It's possible with one smartphone, with power reduced by 99.24% compared to existing mining machines. This project will be successful and we will all benefit from it.
Blockchain technology is a transparent digital ledger whose records are immune to modification or deletion. Offering increased security, cost reduction, time efficiency and fault tolerance, blockchain grew and fluctuated through 2017. The utility of blockchain technology is broad, triggering a growing list of companies, industries and governments studying its potential adoption.

ABOUT MIB COIN
MIB Coin's overview, Smart Mining power-consumption claims, mining process, token and ICO details, allocation and use of proceeds are as described above. One point worth adding: through Smart Mining, MIB aims to decentralize mining by enabling regions where power generation is limited, such as Central Asia, Africa, South America, and Southeast Asia, to participate in mining.

SmartX BLOCKCHAIN PLATFORM
SmartX will support creating and executing smart contracts, which are changing the way agreements are made: you do not need an intermediary, because the contracts themselves follow specific rules and are automatically executed on their own terms. This allows for a multitude of applications, such as insurance contracts and distribution of profits, with endless possibilities. Greater opportunities to participate in mining are given to more countries and people by using smartphones, which are already available all over the world in the billions. The polarization caused by concentration in specialized mining firms or specific nations can be addressed by distributed mining: mining today is becoming monopolized by large miners with infrastructure and funding, and countries with a seamless power supply are becoming the major markets. By cutting the high cost of ASIC and GPU mining down to a more economical mobile-based method, the power consumption rate is markedly decreased, resulting in an effective P2P network at minimum cost. The MIB mobile cryptocurrency ecosystem can also be proposed to cryptocurrencies that have lost their original functions due to mining and blockchain-based technical difficulties, enabling them to improve smoothly and continuously.

TEAM MEMBERS AND ADVISORY MEMBERS TO THE PROJECT
The team seems to be the strongest part of this project. It consists of innovative and talented people, and I cannot complain about their professional level; these guys are indeed experts in their field. MIB will make everybody a miner as long as you use a smartphone: it cuts excessive power consumption dramatically, and the high cost of mining equipment becomes a thing of the past. The low-power blockchain system MIB has developed can bring a significant reduction in the extremely high power consumption that often causes social problems, resulting in social cost savings for maintaining the blockchain. Therefore, it could well end up being a great success, and investors can also benefit from it.
I've been working on a WPF project that does a lot of DirectX processing, and everything works perfectly on my development laptop or any comparably spec'd system. When running on lower-end systems, specifically anything without a dedicated GPU, my users report crashes that I simply can't reproduce. I'm a digital nomad, so all I have with me is my laptop. Is there a way I can throttle my computer to reproduce these sorts of issues? I've tried running GPU benchmarks or Bitcoin miners in the background, but that makes my computer so slow it's unusable, and I still can't replicate the issues. Thanks and have a great day.
Prices include shipping, taxes, and discounts when available.
Generated by PCPartPicker 2013-05-03 07:58 EDT-0400
(Edit: updated to a Trinity CPU and compatible mobo with USB 3.0, a case with USB 3.0 headers, a dedicated PSU, and a more expensive monitor.)

Hey all, first time posting a thread on BAPC. Love the sub. Anyway, I have a bit of an unusual goal (for this sub): I'm setting up a basic personal secure PC/server. This PC has a specific purpose: checking secure email, potentially uploading/transferring ePHI (patient data), and potentially also secure online shopping/Bitcoin/etc. For the last one, I may prefer to just use boot drives on my current PC. The data doesn't need to be backed up or saved in duplicate via a RAID or anything; I'm just using the PC so I can communicate with other clinicians or work on write-ups from home, etc. Oh, and I'm very interested in potentially doing a health technology project later, which would be helped by my being able to store ePHI at home, including video/voice recordings.

I'm doing this because I don't want to go through the trouble of encrypting (and slowing) my current PC. Setting up an encrypted virtualized partition/OS is not sufficient; it has to be whole-drive encryption. It's institutional/governmental rules, not mine. I've seen a few all-in-one setups at ~$450 and a few basic PC setups for $350, but I can't think of much cheaper than building myself.

Last thing: I'm currently planning on going through my university to purchase Win7Ult or Win8Pro, but I'm also considering a Linux or *BSD distro; I'm not sure that's necessary if I have the complete drive encrypted. I don't think I need to do a double-hidden-partition thing, either. Nobody's gonna try that hard to steal my patient data. One of the main factors will be ease of use and security for transferring files from my SGS3 to my secure PC (from the hospital, without using another computer, unless you know of an easy way to automatically remove files from an encrypted droid and upload them to the computer via USB... could be useful.)
Would love some suggestions, especially as to the OS choice, software for secure upload/download, or whether I'm missing either similar performance for a lower price or higher performance for the same price. 1TB+ storage is necessary. I'd describe my computer proficiency as: can do some programming, have played around with some SSH/FTP, etc., built a PC once a while ago (no issues); not an expert otherwise, by any means. Thanks y'all!

Random extraneous factors: PCPP recommends some online stores that don't price shipping up front, so I removed those stores; not worth my time to try to optimize shipping costs by going to their cart every time. Also, I know I could trim off an extra $40-70 by downgrading to 1GB or 2GB RAM and a smaller HDD, etc., but I think there are diminishing returns on that level of frugality. Parts were chosen mainly by lowest price on PCPP, except with the rule-out of very low performance (Essentially
PSA: Poor performance could be Steam Directory Bitcoin Malware "Steam Reversed"
So the title more or less says it all. I've noticed that my new GTX 980 has been performing poorly; I even made a post in this sub about its poor benchmark and New Dawn demo performance. Well, as it turns out, in the last couple of months some new malware has hit that actually does something interesting: it uses a portion of your GPU's performance for Bitcoin mining. It's actually pretty genius. The malware installs under an appdata/../steam directory, which is smart, as users with Steam are probably more likely to have a dedicated GPU than those who don't. The virus infects a few hundred or thousand systems, and even the relatively weak power of a GPU (compared to a dedicated Bitcoin FPGA) becomes substantial. So rather than causing you immediate trouble or stealing your money/credentials directly, they steal through your system resources and electricity bill.

For me this meant a usable desktop (with no noticeable slowdown), but benchmarks scored substantially lower. It also meant seemingly random spikes in dynamic GPU clocks. I found out I had "Steam Reversed" entirely by accident when giving the Webroot trial another go (hadn't used it in a while; I've been using MBAM and BitDefender mostly as of late). Webroot found it in ~20 seconds with its quick scan. After allowing it to remove the malware (and double-checking for myself as well as running other utilities/full scans), my system is now scoring substantially better and the demos are running much smoother. It's no surprise that malware can affect system performance (obviously), but this one was a bit sneaky, as it wasn't made to be directly malicious (and therefore more noticeable). Anyway, supposedly MBAM will find it now, and Webroot obviously will as well. If you're concerned about low performance and don't believe it's the drivers, it may be worth checking out.
http://steamcommunity.com/discussions/forum/1/35221584425122691/ http://www.reddit.com/Steam/comments/2e7jk8/the_new_appdataroamingsteamreversedsteamexe/
This may be a long post, but I wanted to get a few things off my chest. Some of you may not like what I'm saying, but if that's the case, at least take the time to discuss it rather than downvoting me. I think that's what communities on Reddit lack: discussion is how we progress, and sometimes it feels like all Reddit does is downvote instead of actually speaking up.

Firstly - Bitcoin in the media. It's no big news that Bitcoin was synonymous with Silk Road and the Deep Web for a while; you couldn't have one without the other. This led to a lot of negative press for Bitcoin itself. I think it was handled quite well, and we showed the majority of people that Bitcoin was usable in a mainstream setting, but I don't think we did enough. How often do we see the good features of Bitcoin, like:
Free / automatic fraud prevention. You can be sure that your payment won't be charged back by fraudsters.
Very easy implementation with Coinbase and other merchants. Fees are very low compared to credit card processors.
Accept payments worldwide. Anyone who can download the client can make a payment.
Can be set up usually within 10 minutes.
How often do we see these features? I recently saw a sign in my local shop saying they couldn't accept credit card payments due to high fees. Imagine if they knew about Bitcoin: as long as they have a computer, or even access to one, they could attract a wider audience. I think we can do a lot more to give out a clearer message about Bitcoin. A lot of the fear comes from the fact that people don't understand it; if we spend more time educating, we can get a more positive image out there.

Secondly - Miners, their profitability and other issues.
The next thing I'd like to bring up is the mining community. When people were using CPUs and GPUs to mine, we all knew that there was very little chance of profit. BTC was nowhere near as popular as it is now, and you could end up spending thousands on building a rig with hardly any chance of making a profit - but people still did it for fun. It was an interesting concept: your computing time would make the network stronger and more secure, and in return you would be repaid in miner fees / generated blocks. Now it's all about profit. Yes, it's understandable that if you spend $5000 on a miner you want to make a profit off it, but people see it as a simple money-in, profit-out scheme. That's not how you should see it; you're making an investment. If you look at ROI in terms of Bitcoin, chances are you won't break even any time soon. This was the same with GPUs, and it's the same with ASICs. However, if you pay for something like a USB ASIC miner, you shouldn't see it that way: you're paying a small amount for x GH/s (or MH/s) at a fraction of what it would otherwise cost you. When I bought my USB miner I spent about $20. If I were to buy that power with a GPU it'd cost me way more, and even more to keep it running in terms of power costs. Another way of looking at it is if you were an early adopter: 50 BTC block rewards were nothing back in the day. If I hoarded all my coins and spent 50 BTC buying hardware, it's not that bad, because those 50 BTC cost me nothing. Yes, I could have sold them and gotten way more, but if it means I'm securing hardware to mine even more coins to hoard, then it could be very profitable. If the hardware I bought mines me 25 coins now, I've done a good job, imho.
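The ROI argument above boils down to simple break-even arithmetic. Here's a hedged sketch; the function and all the numbers are made up for illustration, and it deliberately ignores rising mining difficulty, which is exactly why real-world ROI was often worse than this math suggests:

```python
# Rough break-even sketch: how many days until a miner's net revenue
# covers its hardware cost? Inputs are illustrative assumptions, not
# real market data, and difficulty growth is ignored.

def days_to_break_even(hardware_cost: float,
                       coins_per_day: float,
                       coin_price: float,
                       power_cost_per_day: float) -> float:
    """Days until cumulative net revenue covers the hardware cost.
    Returns float('inf') if the rig never profits."""
    net_per_day = coins_per_day * coin_price - power_cost_per_day
    if net_per_day <= 0:
        return float('inf')
    return hardware_cost / net_per_day

# A $20 USB ASIC earning 0.005 BTC/day at $100/BTC with negligible power draw:
print(days_to_break_even(20, 0.005, 100, 0.02))  # ~41.7 days
# A $5000 rig whose power bill exceeds its output never breaks even:
print(days_to_break_even(5000, 0.01, 100, 2))    # inf
```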
Companies themselves have let us down, with the exception of one or two. BFL fucked up. Avalon pretty much cheated the whole community. KNC Miner and ASICMiner are the only ones I know of that kept their promises as closely as they could. As an investor, putting my USD into any of these miner companies wouldn't be very appealing. It's all a shambles, and I would rather invest my USD in buying BTC and holding it than in buying mining hardware. To me it sounds like most of these companies are following the old advice: "During the gold rush, sell shovels."
Thirdly - exchanges and how they work.
A lot of people seem to give MTGox shit because of their processing times. What you have to understand is that whenever fiat is involved, there are regulations. Bitcoin is still a touchy subject in the US, which makes it look dangerous to anyone in government who doesn't understand it. Their system works, though.
MTGox and BTC-e use something called an order book, meaning BTC is only worth what people are willing to pay for it. The highest bid raises the last price - simple as that. It's good because the exchange can't artificially raise the price, but it also means trading bots with the relevant APIs can move the price by making small orders. They take a small fee in BTC / USD, which imho sounds fair.
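The order-book mechanics described above can be sketched in a few lines. This is a toy model of the idea (highest bid crosses lowest ask and sets the last price), not any real exchange's matching engine - real engines also track order sizes and usually clear at the resting order's price:

```python
import heapq

class OrderBook:
    """Toy order book: the highest bid matches the lowest ask, and the
    last trade price is whatever price the match cleared at."""

    def __init__(self):
        self.bids = []        # max-heap, stored as negated prices
        self.asks = []        # min-heap
        self.last_price = None

    def place_bid(self, price):
        heapq.heappush(self.bids, -price)
        self._match()

    def place_ask(self, price):
        heapq.heappush(self.asks, price)
        self._match()

    def _match(self):
        # Trade while the best bid meets or exceeds the best ask.
        while self.bids and self.asks and -self.bids[0] >= self.asks[0]:
            heapq.heappop(self.asks)
            bid = -heapq.heappop(self.bids)
            # In this toy model the trade clears at the bid price,
            # so a higher bid raises the last price.
            self.last_price = bid

book = OrderBook()
book.place_ask(95.0)
book.place_bid(100.0)   # crosses the resting ask
print(book.last_price)  # the bid set the new last price
```

This also shows why small orders from bots can move the "price": the last price is just the most recent match, however tiny.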
LocalBitcoins is failing. People are charging premiums way above even Gox, but only paying below Bitstamp to buy the coins. It's all about being greedy. LocalBitcoins used to be cheaper for a while, but now it's just being ruined by greed. Unless travel costs are included, most people just use UK Bank Transfer, which costs you NOTHING. You could charge a .10p fee each way, keep it at Bitstamp prices, and easily attract more customers - but people choose to be greedy.
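To put hypothetical numbers on that spread (everything below is made up, and treated as one currency for simplicity): a trader who sells 10% over the reference price and buys 5% under it pockets far more per coin than a flat small fee each way ever would.

```python
# Hypothetical illustration of premium-spread profit vs. a flat fee.
# All prices are assumptions, not real market data.
reference_price = 100.00                  # assumed exchange reference price
premium_sell = reference_price * 1.10     # seller asks 10% over reference
discount_buy = reference_price * 0.95     # same trader pays 5% under it

spread_profit = premium_sell - discount_buy   # profit per coin round-trip
flat_fee_total = 0.10 * 2                     # a small flat fee each way

print(spread_profit)    # ~15.0 per coin from the spread
print(flat_fee_total)   # 0.2 total from flat fees
```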
Bitcoin for devs and merchants. I think a lot of work needs to be done for Bitcoin devs. Unlike other documentation, I found Bitcoin very hard to follow when it comes to development. I understand the basics of coding and a few languages - not much, but enough to, say, make a web page or an automated program in C#. Imagine if there were dedicated Bitcoin dev websites, like w3schools is for web languages (I know w3schools is a very bad benchmark, but whatever). If more people understood how development works, more people would attempt to get into it. A lot of work needs to be done within the BTC community itself before we even think about making it mainstream. Even if the last price reaches $1000, it's useless if people don't know as much as they should.
[Build Help] What can I do to optimize my build for what I do?
Hey /r/buildapc! I've posted a few times before, but this time I'm actually ready to buy parts (I finally have enough money). Besides trying to game on the max possible settings + whatever mods when it comes to games like Planetside 2, Skyrim, Bioshock 1-2, Borderlands 1-2, and future games, I'll be doing a few other things. What I'll be doing : what's important for the task
penetration testing : usually stresses the CPU
virtualization : eats up CPU cores, requires a little extra RAM
compiling : also CPU
programming : requires nothing
bitcoin mining : GPU; there's no such thing as too much power
burning a lot of ISOs to disks : optical drive
I'll also be doing video editing at some point. I will be overclocking whatever I can. I have 2 monitors, but will probably be using 1 while gaming, the other on reddit or something. I'll be pen testing with the VMs as well, so I need strong cores. Here's my build, finally: PCPartPicker part list / Price breakdown by merchant / Benchmarks
Prices include shipping, taxes, and discounts when available.
Generated by PCPartPicker 2013-01-09 00:52 EST-0500
CPU: I'm a little concerned that the CPU may not be able to do what I'm looking for, but I've heard that the Phenom II is a great processor. Is it good enough for me? I read a comment somewhere saying there's an upgrade option for only 10 dollars more or something, but I'm confused by how AMD names their processors these days... and I lost the comment.
RAM: Will there be benefits for me with 1600 over 1333? I've heard that faster RAM helps overclocking. True/false?
MOBO: I think it may be overkill for the job. How do I pick a mobo that's going to work great for overclocking and is unlocked? I have a Samsung SSD 840 Pro, so at least 1 SATA III 6Gb/s port would be nice. Some USB 3 is nice, but it doesn't have to be too fancy.
SOUND CARD: Don't totally need one, but I'm tired of shit audio, and the headset I have my eyes on works best with a dedicated card. It's not really all that expensive, anyway.
CASE: It's hard to find a case that isn't bulbous, is painted on the inside, has nice cable management, and isn't expensive... anyone know of any great cases that don't cost as much as my processor? By bulbous I mean the HAF and a lot of the Cooler Master ones.
PRICE RANGE: About the same price, but lower is always accepted. Slightly higher is also okay.
Backups: Luckily, I don't do desktop support. We have another IT group that does that; I'm completely independent from them and only have to take care of servers (and my own desktop). The physical servers are backed up to tape with Bacula. Our virtual servers are backed up with Veeam. My own desktop is backed up to my NAS share using SyncToy (yes, I use Windows on my desktop).
One-off systems: As in physical servers built by hand? 0. I'm pretty much a Fujitsu shop with a few Dells. I definitely don't have time to be piecing servers together. Disk space: only a few TB per server. The better answer would be that we have an Isilon X200 cluster that is 140 TB.
I guess in that case I only manage a handful of physical servers and a few VMs that are made for running one special piece of software or analyzing data from one piece of scientific equipment. We have many other scientific devices attached to PCs that are "community" devices, but I don't have to manage them. And we've got a microscopy group that is separate from me too, with their own machines and devices.
First, just to clarify, we're going to 10G from the 1G we have right now. I'm not our main network guy, so I'm not entirely sure but I doubt we'll change the MTU simply because we don't have a remote site so the majority of our traffic is regular internet traffic.
As for our backend network, I do use jumbo frames on a couple VLANs for our storage.
Intel makes good chips and they do keep pushing technology forward, but they will never do overclockers any favors. They will always do whatever they can to make money, and AMD will do the same thing. Intel seems to think enthusiast solely means "deep pockets". At the same time, there always seems to be a lot of "the sky is falling" reporting by many tech journalists. Intel hasn't completely forgotten about overclockers, and I don't think they will ever let that group disappear entirely. And really, what incentive does Intel have to completely lock out overclockers? Sure, deny us our warranty - we'll go ahead and buy another chip and give you more money. What company could turn that down? As for the shrinking overclocking headroom, one can only hope that means we've got a whole new architecture coming out soon, something like the transition from Pentium 4 to Core.
At home I've got a 1U Dell PowerEdge sitting in a closet, which is my main server. I run bageez.us off it, which was supposed to be my way of giving back to the community by running a Linux torrent site. Other than that, I've got two HTPCs running Debian, a Windows desktop for gaming/reviewing hardware, and a file server with 8 TB running Debian and KVM with a few Debian VMs.
Looks like I let the SSL cert expire. I'll fix that tomorrow. It works on my end but I think I want to recode a few things and possibly get it to work with other trackers. Right now the torrents will only work with my local tracker.
I think what really got me the best knowledge was forcing myself to use a "less polished" distro on my main rig for a few years. Once you are forced to learn, you'll learn quickly. Picking up an RHCSA book will help too, even if you don't plan on taking the exam. Go through it and do the exercises. Install a distro, set it up, then format and do it all over. You can use VirtualBox for the same result without killing your main rig.
No, but I wish I did. I stopped using it because the GPU support in Linux was better on my desktop, and now I work mostly with CentOS, and it would be a lot of work to change 100ish servers over to FreeBSD.
Nice. I've heard that ROCKS becomes a bear at scale, but for now it's pretty simple and quick. My plan is to keep adding another 18 nodes (one full blade cluster) every year, as long as I can get funding, so I'm keeping my eyes open for other provisioning solutions. Bright Cluster Manager is another one on my radar.
First step, I'd remove all nonessential parts from the computer. Leave the CPU and 1 stick of RAM. Pull out the graphics card, and don't connect any hard drives or CD drives. On the back, connect the monitor to the onboard video and connect the keyboard. Does it power on? Do you get any error messages other than it saying there is no OS? Then power down and connect things one by one until you figure out what part is causing the problem. If you think it's the drivers, you can boot into safe mode (I hope Windows 8 still has that - press F8 while booting), then run Driver Sweeper to remove the graphics drivers. I haven't tried this on Windows 8, so I'm not sure if it will run or not. I don't think you need to do a full format and reinstall.
For benchmarking, mainly. The 3770k was our standard platform for reviews when I bought it. The rest is leftovers from various reviews. We don't get paid, so basically we work for hardware when we write reviews, more or less.
Well, it would work just as it does on any other group of computers. I'd have to run one client on each computer and they'd all check back to get their own workloads, so it would really take out the "cluster" usage and turn them just into regular blade servers.
I listed everything I could think of that I've done that was compute/sysadmin related. I had administered several web servers over the years and experimented with many different distributions as my daily driver on my main desktop, so I was very comfortable on the command line and with day-to-day tasks. I was asked a few 'test' questions in the interview, but I think they were more to gauge exactly what I did and didn't have experience with, not so much to make or break me.
I have started playing with configuration management, but haven't gotten anything into production yet. I only provision new VMs every once in a while, and once the compute nodes are up they are pretty stable.
LN2, at the benching party in Philly last year. We definitely need to get one of those on the schedule again. Also, my work has LN2 and dry ice sitting around, but I haven't asked if it's OK for me to play with those yet. One day I'll ask, and it will be awesome if they say yes. Fingers crossed.
If it's not on my computer or benching station, it's in my closet. And my wife doesn't like the amount of computer stuff in my closet, so I'm sure I'll start looking for some way to recycle stuff soon.
Well, you could get yourself an RHCSA prep book (I linked to the one I have and found useful) and go through all of the exercises. The way I learned was basically to set up my own servers, either physical or virtual, at home, and run them. I think FreeBSD, Gentoo, and Slackware were the most beneficial to me in that they don't really make choices for you, so you have to configure things yourself, which forces you to read the documentation and learn. They all have excellent documentation, btw. If you want to go a step further, Linux From Scratch will really teach you about the operating system from the ground up.
From there, come up with little projects for yourself: make a home NAS, set up NFS and Samba shares, install XBMC on an HTPC and hook it up to your TV to stream movies and music, set up a webserver and ownCloud. Stuff like that.
I was 19 when I first made that Half-Life/Counter-Strike server. I didn't even know what SSH was, and it took a good amount of explaining for me to finally understand. The FreeBSD documentation is amazing and will walk you through just about everything step by step. To get NAT configured I had to use another how-to, but setting up that server taught me a ton.
Most likely you will want to stay around 1.6 V. I'm not very familiar with that chip specifically, so I'd check HWBOT to see what other people have posted and go by that. Obviously, remember that not all chips are the same, so you can't expect to get exactly what other people get.
Back when I played CS in the dorms freshman year of college, I used to get killed all the time. So I started calling myself "jack splat", as a play on the nursery rhyme (Jack Sprat), then shortened it to 'splat' on most of the websites I signed up for.
I definitely have a few, and luckily they aren't that bad. In one of my first few months, I decided to connect a wireless AP to the network one morning to test it out. As I was being awesome managing the cables to make them look clean, one of the security guards came into the server room and said they had no internet. I looked at our switches and they were all lit up solid. By hooking up the AP, which had spanning tree turned on, I had taken down the network of the entire building.
Luckily, all I had to do was unplug it and everything went back to normal. I then set up a spare switch at my desk and played with it before figuring out that STP needed to be disabled on the AP. Now it's been running for over a year without incident.