
Alienware Alpha R1 is 2020

Alienware Alpha R1 in 2020*

Mistyped the title...
This is going to be a simple guide to help any R1 owner upgrade and optimize their Alpha.

Upgradable Parts

(In order of importance)
Storage Unit:
HDD OUT
SSD IN
This is by far the easiest upgrade to make and the most effective.
https://www.newegg.com/p/pl?N=100011693%20600038463
Any of those will work; it just needs to be a 2.5-inch SATA drive.
How to Replace Video

WIFI Card:
This is like a $5-15 upgrade. Go find an Intel 7265NGW off eBay and swap it in for your current WIFI card. If you don’t want to buy used, then here.
How to Replace Video

RAM:
RAM prices have tanked because of bitcoin mining, so this has become quite a cheap upgrade as well. I’d recommend 16GB just because why not, but if you’re tight on cash, 8GB is fine.
https://www.newegg.com/p/pl?N=100007609%20601190332%20601342186%20600000401&Order=BESTMATCH
How to Replace Video

CPU:
This required the most research, so I’d recommend you look through this first. The CPU socket only supports processors in the 35W-50W range according to a developer of the Alpha (Source). The socket type is LGA 1150.
If you’re going cheap, the i5-4590t (35w) and i5-4690s (65w) are both great options.
i5-4590t
i5-4690s
The i5-4690t (45w) is also great but is hard to find from a trustworthy source for a reasonable price.
If you’re willing to spend $100+, then easily go with the i7-4790t (45W). That is probably the best processor to put in the Alpha. All 45W will be used, giving you a 3.9 GHz Turbo. The T series apparently runs the best on the R1 according to This Reddit post.
How to Replace Video

GPU:
Coming Soon!

Maxed out Alpha R1 specs: i7-4790t, 1TB Samsung SSD, 16GB DDR3, Nvidia Geforce GTX 860m.
(Upgrading to anything better than that is pointless)

Optimizing the Alpha R1

Peripherals

submitted by Kidd-Valley to AlienwareAlpha [link] [comments]

Mining noob, I have some questions

Hi everyone, a quick intro here: I come from a professional horticulture background. I've been learning about computers, networking, network security and Linux sys admin for the last two years. I built a bunch of gaming computers for my kids and me with a bonus check I got in fall of 2017, right before the 2017 "bitcoin bubble". By luck I grabbed all my parts before the price of GPUs skyrocketed. All I've been doing though is learning about Linux and game development, learning digital art like 3D modeling, and streaming video games.
I'm now learning to mine ZEC with tpruvot/ccminer 2.3.1 in Ubuntu 20.04 with Nvidia proprietary driver vers. 440 & CUDA toolkit 10.1. I'm just learning how to do this and understand I'm not making a profit. It's more a learning experience and a hobby sort of thing for now. I don't really care if the system breaks; I have another computer with an AMD RX560 that I work and game on Linux with. I can't mine with the Polaris GPU because I can't install OpenCL. There is no support for 20.04 from the Catalyst driver as of now.
TL;DR I'm a noob and wondering why my hashrate is what it is. I am only using 1 GPU as of now (Nvidia 1050 Ti 4GB) and mining on a pool. I get an average of 140 Sol/s. Is this essentially the same as H/s, and is that a normal number for my card? Should I add a 2nd GPU I have if it's only a 1050 2GB? Also, I am using the nvtop & htop packages to monitor PC stats; they show it's using 99% of the GPU and 100% of a single core of my CPU (Intel i5 6402P @ 3.2GHz), and fans and temps are good.
But it shows I'm only using 0.6GB / 4GB while mining. Is that right? Shouldn't it be using more memory? Would it be overkill to mine with a CPU miner at the same time as the 2 cards?
Sorry about the essay, and thanks for your time
submitted by starseed-pl to zec [link] [comments]

Is my PC actually good enough...

Hello everyone, I'm quite new here so I hope I get this right.
I have a PC, mostly for gaming, maybe 3 years old. It cost me about 800€ then, and I considered it to be mid-tier back then. It has:
Intel(R) Core(TM) i5-6400 CPU @ 2.70 GHz, 12 GB RAM, 64-bit Windows 10, Nvidia GeForce GT 730
So I actually have several questions but I'm gonna try and be compact. The most pressing is that I bought Red Dead Redemption 2 thinking that it would run at least kind of OK on low settings. It doesn't. Around 10 fps is the best I can get with everything on low, and it's obviously unplayable.
Also, it seems that lowering settings doesn't really decrease the graphics quality, and the same goes for the ping: I get roughly the same ping on low settings or high, and things still look very shiny on low graphics. I really don't know too much about these topics so I hope I don't get ridiculed too badly - my friend told me I probably have a "bitcoin mining thing" that's draining my CPU/GPU. Is this possible/realistic? (Sorry if it's a dumb question)
The Big question I guess is, how would I start upgrading this PC to make it more viable?
Thanks to everyone in advance! Cheers
submitted by Kaliv_oda to techsupport [link] [comments]

AMD Threadripper 3970x - My adventure so far (Will edit as I go)

I own a gaming community called Unknown Skies Gaming (for a little context) and we are planning a server upgrade. Currently we have a Dell R620 with dual E5-2690 V2s running @ 3.3GHz and 196GB of RAM. The game we host on this rig is called Empyrion Galactic Survival.
The reason for the upgrade is that the server just isn't high-end enough to continue supporting our target playerbase of 100 players, and performance has dipped because the game is in alpha and the devs are not doing much to improve server-side efficiency. Back in 7.0 we could host 150 players on this machine; now it struggles with 60... Yeah, big change in 2 years.
After a lot of research... I can clearly see dual Intel Xeon Gold 6154 CPUs do not outperform even the Ryzen 9 3950X according to PassMark. The price of these Intel servers is stunningly high on the used market... Dual 18-core XG 6154 processors with 128GB RAM (Dell R740 I think, feel free to correct me) cost right around $10,000 USD used, give or take $1,000 or so... and get stomped on by an $800 single-socket CPU... that's new and has a warranty... I get that it doesn't support as much memory... but dang...
So I looked into Threadripper... HOLY SHAT!! The 3970x hits 60k+ in PassMark... And the ENTIRE system with a custom cooling loop is less than 6 grand? That's nuts to me! Amazing deal! I'm sad all my servers and rigs are Intel hahahahaha.
What we need, What im aiming for, and my thought process - Feel free to leave thoughts and suggestions..
AMD Threadripper 3970x (because high per-thread throughput matters due to shit code, and more players on a playfield means more CPU usage on that thread. Lots of threads matter too, as every playfield opens a new PlayfieldServer.exe as its own instance, so the more spread out players are, the more cores you need to spread out the workload. 32 cores, 64 threads... NICE)
Corsair Vengeance LPX 256 GB (8 x 32 GB) DDR4-3200 Memory (each player uses around 1-3GB of RAM, 2GB on average; the goal was 100 players)
ASRock TRX40 TAICHI ATX sTRX4 Motherboard (Thought this was a solid selection given the options I saw)
EVGA 1000 W 80+ Gold Certified Semi-modular ATX Power Supply - SHOULD be sufficient with a closed-loop water cooler, all SSDs, and no gaming GPU required (plan on a 1050 Ti / 1050 at most)
The case I'm using is a modified 4U rackmount case with 3x 120mm fans going in the front, radiator behind those. Could NOT find one that I didn't have to modify. Went with the el-cheapo option 'cause I'mma be drilling it and modding it internally - a Rosewill Server Chassis Server Case Rackmount Case for Bitcoin Mining 4U off eBay. I think it will work, and if not I have a file server that would LOVE the home LMAO. So it was well worth the chance.
The water cooling solution I settled on was a Thermaltake Floe TR4 Edition. I thought about using a custom cooling loop, but by the time I priced everything out the way I would do it... it came out to like $500-ish LMAO. The above cooler, from what I can tell, should be sufficient and should fit in the case I chose. Tight fit, but it should fit.
I'm excitedly looking forward to the build :) Anyone have any experience with any of this and care to chime in?
submitted by Chilimeat to Amd [link] [comments]

My own (x-post) If you like virtually building PCs, will you help me with mine again?

Tl;Dr: I have an i3, 8 gb of ram, and a GTX 960--help me build a new pc so i can appropriately game again?
A few years back, I got some help building my current rig--but I went TOO budget and need to upgrade.
That build (which I am currently posting from) is here: https://pcpartpicker.com/useG1ng3rBr3dd/saved/#view=bGybt6
Specs:
- Intel i3-6100 CPU
- GeForce GTX 960 GPU
- G-Skill Single Slot 8gb RAM
- Corsair CX 500 PSU
- Asus VX238H-W 23.0" 1920x1080 Monitor (I think I need a higher fps monitor from what I've been told)
- Cooler Master N200 MicroATX Mini Tower Case
- Gigabyte GA-H110M-A Micro ATX LGA1151 Motherboard
Gaming has been increasingly difficult and I want to actually enjoy gaming again without falling victim to being the lowest frame rate/highest latency on every server I enter.
I gamed on my buddy's pc the other day and I was floored. I'm unsure what his specs are, but it made me realize that mine is just holding me back.
I don't need a ProGaming+Streaming+Bitcoin Farming BEAST of a PC. I just want to enjoy gaming again and be able to for a few years with minimal upgrades.
I really like PCpartpicker.com as I'm ignorant and it's highly user-friendly so if you are bored and like doing this stuff, I'd really appreciate the help picking the best parts for what I'm looking for at a price I can justify to myself.
My top-tier budget is around $1200 but that may be insanely high or insanely low for what I'm asking; I'm not really sure. If I can keep and use some of the parts I already have (tower case, motherboard [maybe], PSU[?]), that'd be awesome, but I understand if I can't.
Thank you for reading and thank you in advance if you decide to venture into this for me. I appreciate y'all.
submitted by G1ng3rBr3dd to pcmasterrace [link] [comments]

Any good reason to buy a mobile Ryzen cpu?

I don't buy new laptops often, and when I do I try and get the most out of my graphics. Before you AMD ass lickers ban me, I like AMD. If I was going to build a PC it would at least have an AMD CPU and maybe a 5700XT. I bought myself an Alienware, 32GB RAM, i7 6th gen and GTX 1070 for £600. Why I N T E L and N V I D I A? Easy answer. A full AMD machine is S H I T. I can still see AMD fanboys saying "OMG FX were still good". NO! You can call me whatever you want but I like the better side. I liked Intel until this year, which is when AMD really took over. Anyways, I got sidetracked there. I only have one PC (a bitcoin miner) with a GTX 1080 and an I N T E L 2 quad (mining is GPU but not CPU intensive), and AMD does a very bad job in mining. So I only use a laptop. I'm going to change my laptop in about 3 years and it will be a 2/3 year old but still capable laptop. SO DOES THAT MEAN IT'S GOING TO AMD? Cause you know AMD is the best. Here I'm going to disappoint you. I have to be on the go and I can't have a PC. I never said I like AMD LAPTOP CPUs. In 3 years I'm getting a laptop with a 6-core CPU. Sadly AMD doesn't offer you 6 cores. BUT WAIT 7NM!!! Nope, 12nm. BUT...... BUT IT'S CHEAPER. I don't care, the laptop is going to be cheap anyway. All higher-end AMD CPUs have 4 cores. Here I want to start a discussion and petition to have 6/8 core mobile Ryzen CPUs. And you know, since AMD likes pushing, make 12-core mobile CPUs.
submitted by X_dimmy69_X to AyyMD [link] [comments]

Why Runelite's GPU renderer is one of the most important improvements to OSRS ever.

In a world of "gameplay versus graphics", a GPU renderer improves both

Not only does this new GPU renderer improve game responsiveness and framerate by a huge amount, but it's going to be so radically more efficient that it can afford to have longer draw distances. Not just this, but these distant map tiles will be clickable! Very exciting - every single task, skill, and activity will be smoother and more enjoyable.
Disclaimer: This language and information have been simplified for average gamers. Go away, sweaty "AKTHUALLY" brainlets.

OSRS currently uses a CPU renderer straight out of 2003

It's really REALLY bad! At least, by modern standards. It could not be more opposite to what modern computers pursue. It's not Jagex's fault, it's just old... Very VERY old! It's a huge undertaking, and Jagex has been too busy knocking mobile absolutely out of the park, and I'd do the same if I were them - so don't think this is some kind of rag on Jagex. Anyways, some may be surprised that this renderer is still managing to hurt computers today. How can software first written in 2003-2004 (FOR COMPUTERS OF THAT ERA) be laggy and stuttery on computers today? The answer is simple: resizable mode, and individual CPU core speed.
Resizable mode takes a game window that used to be 765x503 (the majority of which used to be a fixed GUI canvas, but not with the new mode!) and renders it at resolutions as high as 3840x2160, maybe even higher. Do you know how many pixels that is? Over 8 million. Do you know how many pixels the original renderer was designed to expect? Just under 390,000. That's over 21x the work being thrown at modern CPUs. Cores aren't anywhere near 21x faster than they were at the close of the single-core era, which is why players with 4k monitors need to see therapists after long play sessions.
Surely CPUs have gotten faster since the mid 2000s! They have, but not quite in the way that a single-threaded (single-core) CPU renderer would expect... CPU manufacturers have been focusing on power draw, temperatures, core count, and special architectural improvements like GPU integration and controller integration. Comparatively, improving individual core speed hasn't been as much of a focus as it had been prior to the multi-core era - and no, I'm not talking about the useless gigahertz(TM) meme measurement, I'm talking about actual overall work done by the core. As a result, the CPUs we have today have developed down a much different path than what this CPU renderer would benefit from - per-core speed hasn't grown anywhere near the amount that resizable mode demands, especially considering these CPU cores were designed on the assumption that software wouldn't pile all its work onto just one core.
We're throwing over 21x the work at CPUs that, in most cases, have only been getting 5-15% faster per-core performance every year.
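To put rough numbers on that claim (the resolutions and the 5-15% yearly figure come from the paragraphs above; the ~15-year window is just an illustrative assumption):

```python
# Rough sanity check: how much more pixel work does resizable mode create,
# and how far does per-core speed growth get you?

fixed_pixels = 765 * 503            # original fixed-mode canvas (~385k pixels)
uhd_pixels = 3840 * 2160            # 4K resizable-mode canvas (~8.3M pixels)

print(f"Pixel work ratio: {uhd_pixels / fixed_pixels:.1f}x")  # ~21.6x

# Compound per-core gains at both ends of the 5-15%/year range, over an
# assumed ~15 years since the renderer was written.
years = 15
for yearly_gain in (0.05, 0.15):
    speedup = (1 + yearly_gain) ** years
    print(f"{yearly_gain:.0%}/year for {years} years -> {speedup:.1f}x per core")
# Even the optimistic case (~8x) falls well short of the ~21x extra work.
```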

What is a "frame"?

Think of a frame as a painting. Your GPU renderer (or CPU cough cough) is responsible for using your GPU to paint an empty canvas, and turn it into a beautiful and complete picture. First, it draws the skybox(if there is one, it's gonna just fill with black in the case of OSRS). Then, it draws all the visible geometry from back to front, with all the lighting and effects. Then, it draws the GUI elements over the top. It does everything, one pixel at a time. Its job is to draw these paintings as quickly as possible (ideally, so you perceive movement) and present them to your monitor, one at a time, forever... until you close the game. Think of a GPU renderer as a talented artist with hundreds of arms (GPU cores).
If your GPU is able to paint this picture in 16.6 milliseconds (frame time measurements are always in milliseconds), then you'll have a frame rate of 60 frames per second, as 1000 ms / 16.6 ms is about 60. Sometimes your renderer struggles, though. Sometimes it can only complete a frame in 100 milliseconds (10 FPS). You can't wave a magic wand when this happens. If you want a higher framerate, you need to either update your hardware, or change your software. By change software, I mean either make it more efficient at the work it's told to do, or give it less work. RuneLite has done the former. An example of the latter would be lowering resolution, turning graphical details down, turning off filtering, etc. Games usually call this set of controls the "Graphics settings". Luckily, OSRS is so lightweight it will likely never need a graphics settings menu.
(Think of a CPU renderer as a painter with no artistic ability and, in the case of quad core, four arms...but he's only allowed to paint with one, while the other 3 sit idle. Also, he has to constantly stop painting to return to his normal duties! No fun! The CPU is better off at its own desk, letting the GPU handle the painting.)
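As a quick illustration of the frame time to frame rate conversion described above (a minimal sketch, nothing OSRS-specific):

```python
def fps_from_frame_time(frame_time_ms: float) -> float:
    """Convert a per-frame paint time in milliseconds into frames per second."""
    return 1000.0 / frame_time_ms

print(fps_from_frame_time(16.6))   # ~60 FPS
print(fps_from_frame_time(100.0))  # 10 FPS
print(fps_from_frame_time(4.5))    # ~222 FPS (the GPU frame time quoted further down)
```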

A GPU renderer improves frame rates

Not that this matters currently, as the game is capped at 50FPS anyways... but it's still going to be huge for low-end systems or high-end systems with high-res monitors. There's also the future, though... Once a GPU renderer is out, it could be possible that they could someday uncap the framerate (which, according to Mod Atlas, only really affects the character's camera, as all animations are 2FPS anyways).
I expect that an update like this will make fixed mode a solid 50FPS on literally everything capable of executing the game. Fixed mode was already easy to run on everything except for old netbooks and Windows Vista desktops, so this really wouldn't be a surprise.

A GPU renderer improves frame times

Frame times are just as important as frame rates. Your frame rate is how many frames are drawn over the course of a second. But, as described previously, each "painting" is done individually. Sometimes the painter takes longer to do something! What if there's a glowing projectile flying past the camera, or something else momentary that's intensive? The painter has to take the time to paint that, resulting in a handful of frames over the course of that second taking much more time than the others. When your frame rate is high and frame times are consistent, this is perceived as incredibly smooth motion.
Ideally, all of our frames are completed in the same amount of time, but this isn't the case. Sometimes "distractions" will come up, and cause the painter to devote an extra 10-20ms to it before returning to the rest of the painting. In bad scenarios, this actually becomes visible, and is referred to as micro stutter. Having a dedicated GPU renderer doing the work ensures this is very uncommon. A GPU has hundreds or thousands of cores. If some get distracted, others reach out and pick up the workload. Everything is smooth, distributed, and uninterrupted.
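To see why frame times matter independently of frame rate, consider a second where one frame takes far longer than the rest (the numbers below are made up purely for illustration):

```python
# 59 smooth frames plus one 100 ms spike: the average frame rate still looks
# fine on paper, but that single long frame is what you perceive as micro stutter.
frame_times_ms = [16.6] * 59 + [100.0]

average_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)
worst_frame_ms = max(frame_times_ms)

print(f"Average FPS: {average_fps:.1f}")        # ~55.6 -- looks acceptable
print(f"Worst frame: {worst_frame_ms:.0f} ms")  # 100 ms hitch = visible stutter
```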
You may recall Mod Atlas talking about frame times when he posted about his GPU renderer last year: https://twitter.com/JagexAtlas/status/868131325114552321
Notice the part where he says it takes 25+ms on the CPU, but only takes 4-5ms on the GPU! That's 200-250 frames per second, if the framerate were uncapped! Also, side note: Just because a frame is completed in 1ms doesn't always mean your framerate will be 1000FPS. If your framerate is capped, then the painter will sit and wait after completing and presenting a frame until it's time to start painting again. This is why capping your framerate can be good for power usage, as demonstrated on mobile! Your GPU can't suck up your battery if it's asleep 90% of the time!

A GPU renderer is more efficient

Instead of piling all computational workloads and graphical workloads onto one single CPU core (rest in peace 8+ core users), a GPU renderer takes graphical work off the CPU and does it itself. I'd estimate the majority of all the work was graphical, so this will make a pretty noticeable difference in performance, especially on older systems. Before, having OSRS open while using other software would have a noticeable performance impact on everything. Especially on older computers. Not anymore! CPUs will run cooler, software will run better, and your computer may even use less power overall, since GPUs are much better at efficient graphical work than CPUs are!

All computers are already equipped to run this very VERY well

Most of the computers we have today are designed with two things: a good GPU, and an okay CPU. This isn't 2003 anymore. GPUs have made their way into everything, and they're prioritized over CPUs. They're not used just for games anymore, entire operating systems rely on them not just for animations and graphical effects, but entire computing tasks. GPUs are responsible for everything from facial recognition to Bitcoin mining these days. Not having a good one in your computer will leave you with a pretty frustrating experience - which is why every manufacturer makes sure you have one. Now, thanks to RuneLite, these will no longer be sitting idle while your poor CPU burns itself alive.

This new GPU renderer will make OSRS run much better on low end systems

Low end systems are notorious for having garbage like Intel Atom or Celeron in them. Their GPU is alright, but the CPU is absolutely terrible. Using the GPU will give them a boost from 5-15FPS in fixed mode, to around 50. At least, assuming they were made after the GPGPU revolution around 2010.

This new GPU renderer will make OSRS run much better on high end systems

High end systems tend to have huge GPUs and huge monitors. Right now, your GPU is asleep while your 4k monitor brings the current CPU renderer to its knees, on the verge of committing sudoku. Letting your GPU take on all that work will make your big and beautiful monitor handle OSRS without lag or stutter.

This new GPU renderer will open the possibility of plugins that build on top of it

One that comes to mind is a 2x/3x/4x GUI scaler. Scaling things in a graphics API is much easier than scaling it in some convoluted custom CPU renderer that was first designed to run in Internet Explorer 5.

It's easier to customize graphical variables in a GPU renderer than it is a glitchy old CPU renderer

Want night time? Change the light intensity. Want cel-shaded comic book appearance for some stupid reason? It's easy. Want to hit 60FPS on a Raspberry Pi? Change your render distance to 2 tiles. Now that the graphical work has been offloaded to a graphics API that's been literally designed to easily modify these things, the sky is the limit. See my past posts on this topic:
Big round of applause for the RuneLite team, and Jagex for allowing them to continue development. Without RuneLite, OSRS would be half the game it is today. Here's to their continued success, with or without Jagex integrating their code into the main game!
submitted by Tizaki to 2007scape [link] [comments]

First Build (for the new future)

The PC I've bought some years ago is starting to act up and with all this quarantine going on I thought I'd invest in a good PC build.
The reason why I want a PC is that I'll probably spend more time in VR in the future and I want my PC to handle it just fine.
I'll probably use my machine for more things that I can think of right now but at this moment I want it to be capable of:
Things I'd see as a big bonus but probably not a must-have as of now:
Like everyone and their dog, I'm also a programmer, but afaik any PC can cover the requirements for any of those operations (WebAssembly, anyone?)
I don't need a lot of storage for personal files (hence, only 500GB SSD, maybe even go down to 250GB).
I have no idea about the case nor the power supply. I basically just want to have enough space for good cable management, all the components, and airflow. I'd prefer the case to be white but in the end it doesn't matter. The power supply is a bit over the recommended 450W so that upgrades won't require a new one.
Lastly I'd like my machine to be really quiet. That probably means more fans with special abilities, right?
An important part for me is upgradability. That's why I want a good CPU+motherboard combo so that I hopefully only have to upgrade the GPU and RAM after a few years.

This is my build, although I'd expect the video card and case to be overkill, right?
PCPartPicker Part List
Type Item Price
CPU Intel Core i7-9700K 3.6 GHz 8-Core Processor $379.99 @ B&H
CPU Cooler be quiet! Dark Rock Pro 4 50.5 CFM CPU Cooler $89.90 @ B&H
Motherboard Asus ROG STRIX Z390-F GAMING ATX LGA1151 Motherboard $247.24 @ Amazon
Memory Corsair Vengeance LPX 16 GB (2 x 8 GB) DDR4-3000 Memory $84.99 @ Amazon
Storage Samsung 970 Evo 500 GB M.2-2280 NVME Solid State Drive $99.99 @ Amazon
Video Card NVIDIA TITAN RTX 24 GB Video Card $2489.98 @ Amazon
Case Fractal Design Meshify C ATX Mid Tower Case $99.98 @ Newegg
Power Supply Cooler Master MWE Gold 650 W 80+ Gold Certified Fully Modular ATX Power Supply $99.99 @ Best Buy
Prices include shipping, taxes, rebates, and discounts
Total $3592.06
Generated by PCPartPicker 2020-04-23 11:45 EDT-0400
submitted by noctmod to buildapc [link] [comments]

Reality Distortion Field

Ok fantards, I'm sure your egos are way too fragile to actually break from the fanboy narrative here, but we are definitely going lower. But, look at all the upgrades! Yeah, and we're going lower; I figure $25.00 is a nice re-entry point. There are a lot of reasons for this, but look at the macro. This is the top for now, not just in AMD but also the general market. Everyone is on pins and needles waiting for another rate cut. Why? Because the economy is propped up to a ridiculous level. No rate cut? It crashes. Rate cut without promise of future cuts? It crashes. Fed injecting massive amounts of liquidity? Check. Manufacturing jobs disappearing? Check. All-time high after all-time high? Check. The global economy not doing so hot and we're ignoring it completely? Check. Smart money is exiting and securing their short positions as we speak. They're buoying the market just long enough to set themselves up for the retracement. I was around for 2008. Everything was wonderful, totally rock solid until it wasn't. And I'll admit that we aren't looking at another 2008, but we are looking at a helluva correction. Ironically, AMD will be OK, but not until next year.
On that front, what do we have? Highest revs since 2007. Yay. See the irony? The rollout is too slow, the PE is too high (yes, it matters, especially when you're dealing with a manufacturer), there's too much hype around the CEO, and too many new shares are being dumped into the market. Yes, they are diluting; look at the numbers. I remember when they sought authorization for share issuance and all the usual fantards here started dumb-shaming anybody on here who dared question the notion that they might dilute. They were doing it "just so that they have the option of doing it", or some kind of nonsense like that is what they said with indignant sanctimony. Well, they have been diluting and still are. It's not that bad compared to what it could be, but it's still there and it affects the share price. Intel has been buying back shares and so has Nvidia.
https://ycharts.com/companies/AMD/stock_buyback
But don't worry, help is on the way. I think that they actually guided too conservatively for next quarter. I think that in the end AMD's superior tech will win decisive battles for market share. And, even though most of the "very smart" people on here throw a tantrum whenever I mention this: crypto will be resurgent and AMD will benefit directly from it. The Ethereum mining hardware pool is diverse; they don't want just ASICs, for a host of reasons. You can dismiss this out of hand at the behest of your own arrogance, but it's the truth. It will make a difference in the bottom line. You're looking at massive crypto gains in 2020. I'm not going to explain why because he does it a lot better:
https://www.tradingview.comfilbfilb/
I picked up GBTC when Bitcoin was at 7500 and just sold it at 9400. I'm waiting for the 8250 range to re-enter. But when it blows up it will take GPU-mineable alts with it. And if you think that these miners aren't already anticipating this and aren't accumulating cards right now to avoid paying profiteering prices when Bitcoin breaks to the upside, then you're delusional. Why do you think Radeons are "selling like hot cakes"? The tunnel vision here is amazing. Whatever the mainstream narrative is, you guys eat it up. Stop being such a bunch of fanboys.
submitted by rantus to AMD_Stock [link] [comments]

Selling off my bitcoin mining rig

Selling off my bitcoin miner rigs. Decided I can't afford it with electricity being $0.45/kWh here.
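For context on why $0.45/kWh kills the economics, here is a rough cost estimate (the ~1 kW draw for an 8-GPU rig is an assumption for illustration, not a figure from this listing):

```python
# Back-of-the-envelope electricity cost for one rig at the quoted rate.
electricity_rate = 0.45      # $/kWh, from the post
assumed_rig_draw_kw = 1.0    # assumed average draw for an 8-GPU rig (not from the listing)

daily_cost = assumed_rig_draw_kw * 24 * electricity_rate
print(f"${daily_cost:.2f}/day, ~${daily_cost * 30:.0f}/month per rig")
# ~$10.80/day, ~$324/month -- hard to cover with GPU mining revenue.
```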



I'm selling:



2 Asus B250 Mining Expert motherboards: 1 new, 1 used.

1 H110 Pro BTC motherboard (used)

***Note: I had an issue with this motherboard so it may be bad. POSSIBLY NOT, but it might be. Take this into consideration when bidding.***

1 Asus 990FX Sabertooth motherboard with 32GB RAM (see image). It has a CPU included, however I don't remember which one it was. At the time it was for a gaming PC, so it should be decent.

2 8-GPU all-aluminum open air frames

2 6-GPU all-aluminum open air frames

2 Intel G3930 2.9GHz CPUs: 1 new, 1 used

14 120mm black fans, including 2 fan hubs (1 for each machine)

20 PCI risers (multiple manufacturers)





I have it listed for $450 OBO. GOOD DEAL I THINK. Message me with questions.

https://www.ebay.com/itm/133302526663
submitted by Heiridum to BitcoinMining [link] [comments]

Y'all helped me last time, and I'm now looking to upgrade. Help me build a new rig?

Tl;Dr: I have an i3, 8 gb of ram, and a GTX 960--help me build a new pc so i can appropriately game again?

A few years back, I got some help building my current rig--but I went TOO budget and need to upgrade.
That build (which I am currently posting from) is here: https://pcpartpicker.com/useG1ng3rBr3dd/saved/#view=bGybt6
Specs:
- Intel i3-6100 CPU
- GeForce GTX 960 GPU
- G-Skill Single Slot 8gb RAM
- Corsair CX 500 PSU
- Asus VX238H-W 23.0" 1920x1080 Monitor (I think I need a higher fps monitor from what I've been told)
- Cooler Master N200 MicroATX Mini Tower Case
- Gigabyte GA-H110M-A Micro ATX LGA1151 Motherboard
Gaming has been increasingly difficult and I want to actually enjoy gaming again without falling victim to being the lowest frame rate/highest latency on every server I enter.
I gamed on my buddy's pc the other day and I was floored. I'm unsure what his specs are, but it made me realize that mine is just holding me back.
I don't need a ProGaming+Streaming+Bitcoin Farming BEAST of a PC. I just want to enjoy gaming again and be able to for a few years with minimal upgrades.
I really like PCpartpicker.com as I'm ignorant and it's highly user-friendly so if you are bored and like doing this stuff, I'd really appreciate the help picking the best parts for what I'm looking for at a price I can justify to myself.
My top-tier budget is around $1200 but that may be insanely high or insanely low for what I'm asking; I'm not really sure. If I can keep and use some of the parts I already have (tower case, motherboard [maybe], PSU[?]), that'd be awesome, but I understand if I can't.
Thank you for reading and thank you in advance if you decide to venture into this for me. I appreciate y'all.
submitted by G1ng3rBr3dd to PcMasterRaceBuilds [link] [comments]

GPU Mining Crash Course - START HERE!

Welcome All to the GPUMining Crash Course!
With the increase in prices in cryptocurrency, a lot of people are getting back into mining and a lot of people are brand new to the concept overall. So, I quickly wrote this crash course to help you understand what to expect and how to successfully mine your first cryptocurrency. This crash course isn't gonna have all of the fluff you'd see in a normal publication. This is just everything you need to know to get up and running on your first cryptocurrency mining rig.

What is cryptocurrency mining?

One of the main things about cryptocurrencies is that they are "decentralized". Sounds great, but WTF does that even mean? Well, the easiest way to explain it is...
You know how if you want to send your friend/family money digitally, you can do so through your bank. Your bank likely takes a transaction fee and in a few days they will transfer the money. Since cryptocurrencies are decentralized, they don't have a bank or organization to fulfill the transfer of money. Instead, they outsource the computing power of their cryptocurrency network to miners (soon to be you). These miners are verifying transactions, securing the blockchain, and powering the cryptocurrency's specific network among other things. As an incentive, the miners collect transaction fees on the transactions that they verify and collect block rewards while new currency is still being introduced into the ecosystem.
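The "work" part of mining boils down to repeatedly hashing a block of data until the hash meets a difficulty target. Here's a toy sketch of that idea (real coins use different hash functions, difficulty encodings, and block formats; this is only to show the concept):

```python
import hashlib

def mine(block_data: str, difficulty: int) -> int:
    """Find a nonce whose SHA-256 hash starts with `difficulty` hex zeros (toy proof-of-work)."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

# Low difficulty so it finishes instantly; real networks tune difficulty so the
# whole network of miners only finds a block every couple of minutes.
print("Found nonce:", mine("example transactions", difficulty=4))
```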

What kind of rig should I build?

You can mine cryptocurrencies using your CPU, GPU, FPGA, or ASIC, but this is a GPU Mining subreddit, so I will cater this to GPUs.
For building a great all-around GPU rig, there are two models of GPUs that I'd recommend:
Both of these GPUs have solid hashrates across most mining algorithms and for a decent price! You should be able to find both of these kinds of GPUs used for around $200-$250 each, which is a great price if you know what happened during the last mining craze! ($200 GPUs were out of stock everywhere and people were reselling them for $600+ each)
There are also plenty of great AMD GPUs for mining, but I've worked mostly with Nvidia so that's why both of my recommendations are Nvidia and not AMD.
Other parts to your rig that you'll need are listed below. Most of these can be pieces of crap and are just needed to make the rig actually run, but the one spot you DON'T want to cheap out on is the power supply unit. A decent power supply unit will keep your home from burning down while also keeping your rigs up and running smoothly. Here are my recommendations:

She's built, now what?

Now you need to do a few things. I am a Windows miner, so I will be speaking to Windows here:
  1. Update Windows - Do all of the updates. Just do it.
  2. Update Drivers - Go to the Nvidia website and download GeForce Experience. It will keep your GPU drivers up to date.
  3. Go to Windows Device Manager and make sure all of your GPUs show up under "Display Adapters". If it is there, but it isn't showing the Name/Model of the GPU as the name, right click it and select "Update Driver". This should fix it.
Assuming you've done all of this, you're ready to download a mining application.

Mining Software

There are tons to choose from! Claymore, Phoenix, EWBF, LolMiner, etc... It can be overwhelming pretty quickly since they all have different algorithm support, speeds, efficiencies, and a whole lot more. On top of that, in order to get them running you need to set up batch files to call the proper exe, point you to the correct pool, and a whole bunch of other stuff that can be confusing to a new user. Not to mention, you will probably need a separate miner, config file, batch file, etc. for each different algorithm that you're interested in mining on.
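For a sense of what those batch files boil down to: they mostly just launch the miner executable with an algorithm, a pool address, and your wallet. The sketch below uses Python's subprocess purely for illustration; the flag names follow the common ccminer-style convention, but the pool URL, wallet, algorithm, and exact flags are placeholders you'd replace per your miner's documentation.

```python
import subprocess

# Illustrative only: a typical batch file wraps a single command like this one.
miner_exe = "ccminer.exe"                          # path to the miner executable
algorithm = "x16r"                                 # algorithm used by the coin you chose
pool_url = "stratum+tcp://pool.example.com:3333"   # placeholder pool address
wallet = "YOUR_WALLET_ADDRESS.rig1"                # placeholder wallet.worker

subprocess.run([
    miner_exe,
    "-a", algorithm,   # --algo
    "-o", pool_url,    # --url
    "-u", wallet,      # --user
    "-p", "x",         # --pass (most pools ignore this)
])
```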
Instead, I recommend that you download a miner management software that will take care of most of this tedious work for you. There are a few in the sidebar, but the /GPUMining favorite is AIOMiner. It was developed by our very own community member, xixspiderxix with the intention of making mining as easy as possible to do and without any fees. It supports over 100 different algorithms, so you'll be able to mine nearly ANY cryptocurrency you'd like. Just download it from their website and it will take you through a quick tutorial to help you get set up! You can also connect your rig to their website for remote monitoring and control. You've probably seen a few of their posts around this subreddit.
Other Windows mining softwares include:
Note: Many mining softwares have fees built into them. Most are around 1%, but they can go as high as 5% or greater! You want a mining software with little or no fees at all so that you get to keep as much cryptocurrency as possible. These fees aren't something you actively pay; the software will automatically take its cut by mining on the developer's behalf for a given amount of time and then switching back to mining on your own behalf. So, please be diligent in the software that you evaluate and make sure it is reputable.

I keep hearing about NiceHash. What is that?

The asshole of the mining industry. Jk, but not really.
NiceHash is a software program that allows you to sell your rig's hashing power to someone on their marketplace. They market themselves as profitable mining, but you're not really mining. You're selling your power in exchange for Bitcoin.
They did a great job telling people that with them, you're always mining the most profitable coin, but that's just not true. Since it is a mining marketplace, they make you mine whatever their most expensive contract is. If their contracts are below market prices, then you're not operating as efficiently and profitably as you could be.
NiceHash also has a sketchy history, which continues to this day. In 2017, they were hacked and lost $65M worth of Bitcoin. No one got paid out for MONTHS and many of their executives conveniently resigned. Their platform is also used to destroy cryptocurrencies. Since people are able to purchase mining power on their platform, people have used it to buy enough hashrate to control individual cryptocurrencies and double-spend coins, which increased the malicious user's wealth while completely destroying the integrity of the coin's blockchain. Horizen (formerly ZenCash), Ethereum Classic, and many other great cryptocurrencies have been the victims of NiceHash's platform.
For this and many other reasons, we highly recommend that you stay AWAY from Nicehash. We understand that it is extremely easy to use and you get paid in bitcoin, but they are destroying the industry with their greed and lack of motivation to change their platform for the protection of cryptocurrencies.

Concluding Thoughts

This is pretty much everything you need to know to get started. We covered the hardware, setting up the software, which software to use, and AIOMiner's tutorial will get you up to speed on how to actually mine the cryptocurrency that you want better than I can explain it, so I'll leave that part to them.
If you have any questions on this crash course, please leave a comment below where myself and other community members will be able to help you out.
submitted by The_Brutally_Honest to gpumining [link] [comments]

Transcript of discussion between an ASIC designer and several proof-of-work designers from #monero-pow channel on Freenode this morning

[08:07:01] lukminer contains precompiled cn/r math sequences for some blocks: https://lukminer.org/2019/03/09/oh-kay-v4r-here-we-come/
[08:07:11] try that with RandomX :P
[08:09:00] tevador: are you ready for some RandomX feedback? it looks like the CNv4 is slowly stabilizing, hashrate comes down...
[08:09:07] how does it even make sense to precompile it?
[08:09:14] mine 1% faster for 2 minutes?
[08:09:35] naturally we think the entire asic-resistance strategy is doomed to fail :) but that's a high-level thing, who knows. people may think it's great.
[08:09:49] about RandomX: looks like the cache size was chosen to make it GPU-hard
[08:09:56] looking forward to more docs
[08:11:38] after initial skimming, I would think it's possible to make a 10x asic for RandomX. But at least for us, we will only make an ASIC if there is not a total ASIC hostility there in the first place. That's better for the secret miners then.
[08:13:12] What I propose is this: we are working on an Ethash ASIC right now, and once we have that working, we would invite tevador or whoever wants to come to HK/Shenzhen and we walk you guys through how we would make a RandomX ASIC. You can then process this input in any way you like. Something like that.
[08:13:49] unless asics (or other accelerators) re-emerge on XMR faster than expected, it looks like there is a little bit of time before RandomX rollout
[08:14:22] 10x in what measure? $/hash or watt/hash?
[08:14:46] watt/hash
[08:15:19] so you can make 10 times more efficient double precision FPU?
[08:16:02] like I said let's try to be productive. You are having me here, let's work together!
[08:16:15] continue with RandomX, publish more docs. that's always helpful.
[08:16:37] I'm trying to understand how it's possible at all. Why AMD/Intel are so inefficient at running FP calculations?
[08:18:05] midipoet ([email protected]/web/irccloud.com/x-vszshqqxwybvtsjm) has joined #monero-pow
[08:18:17] hardware development works the other way round. We start with 1) math then 2) optimization priority 3) hw/sw boundary 4) IP selection 5) physical implementation
[08:22:32] This still doesn't explain at which point you get 10x
[08:23:07] Weren't you the ones claiming "We can accelerate ProgPoW by a factor of 3x to 8x." ? I find it hard to believe too.
[08:30:20] sure
[08:30:26] so my idea: first we finish our current chip
[08:30:35] from simulation to silicon :)
[08:30:40] we love this stuff... we do it anyway
[08:30:59] now we have a communication channel, and we don't call each other names immediately anymore: big progress!
[08:31:06] you know, we russians have a saying "it was smooth on paper, but they forgot about ravines"
[08:31:12] So I need a bit more details
[08:31:16] ha ha. good!
[08:31:31] that's why I want to avoid to just make claims
[08:31:34] let's work
[08:31:40] RandomX comes in Sep/Oct, right?
[08:31:45] Maybe
[08:32:20] We need to audit it first
[08:32:31] ok
[08:32:59] we don't make chips to prove sw devs that their assumptions about hardware are wrong. especially not if these guys then promptly hardfork and move to the next wrong assumption :)
[08:33:10] from the outside, this only means that hw & sw are devaluing each other
[08:33:24] neither of us should do this
[08:33:47] we are making chips that can hopefully accelerate more crypto ops in the future
[08:33:52] signing, verifying, proving, etc.
[08:34:02] PoW is just a feature like others
[08:34:18] sech1: is it easy for you to come to Hong Kong? (visa-wise)
[08:34:20] or difficult?
[08:34:33] or are you there sometimes?
[08:34:41] It's kind of far away
[08:35:13] we are looking forward to more RandomX docs. that's the first step.
[08:35:31] I want to avoid that we have some meme "Linzhi says they can accelerate XYZ by factor x" .... "ha ha ha"
[08:35:37] right? we don't want that :)
[08:35:39] doc is almost finished
[08:35:40] What docs do you need? It's described pretty good
[08:35:41] so I better say nothing now
[08:35:50] we focus on our Ethash chip
[08:36:05] then based on that, we are happy to walk interested people through the design and what else it can do
[08:36:22] that's a better approach from my view than making claims that are laughed away (rightfully so, because no silicon...)
[08:36:37] ethash ASIC is basically a glorified memory controller
[08:36:39] sech1: tevador said something more is coming (he just did it again)
[08:37:03] yes, some parts of RandomX are not described well
[08:37:10] like dataset access logic
[08:37:37] RandomX looks like progpow for CPU
[08:37:54] yes
[08:38:03] it is designed to reflect CPU
[08:38:34] so any ASIC for it = CPU in essence
[08:39:04] of course there are still some things in regular CPU that can be thrown away for RandomX
[08:40:20] uncore parts are not used, but those will use very little power
[08:40:37] except for memory controller
[08:41:09] I'm just surprised sometimes, ok? let me ask: have you designed or taped out an asic before? isn't it risky to make assumptions about things that are largely unknown?
[08:41:23] I would worry
[08:41:31] that I get something wrong...
[08:41:44] but I also worry like crazy that CNv4 will blow up, where you guys seem to be relaxed
[08:42:06] I didn't want to bring up anything RandomX because CNv4 is such a nailbiter... :)
[08:42:15] how do you guys know you don't have asics in a week or two?
[08:42:38] we don't have experience with ASIC design, but RandomX is simply designed to exactly fit CPU capabilities, which is the best you can do anyways
[08:43:09] similar as ProgPoW did with GPUs
[08:43:14] some people say they want to do asic-resistance only until the vast majority of coins has been issued
[08:43:21] that's at least reasonable
[08:43:43] yeah but progpow totally will not work as advertised :)
[08:44:08] yeah, I've seen that comment about progpow a few times already
[08:44:11] which is no surprise if you know it's just a random sales story to sell a few more GPUs
[08:44:13] RandomX is not permanent, we are expecting to switch to ASIC friendly in a few years if possible
[08:44:18] yes
[08:44:21] that makes sense
[08:44:40] linzhi-sonia: how so? will it break or will it be asic-able with decent performance gains?
[08:44:41] are you happy with CNv4 so far?
[08:45:10] ah, long story. progpow is a masterpiece of deception, let's not get into it here.
[08:45:21] if you know chip marketing it makes more sense
[08:45:24] linzhi-sonia: So far? lol! a bit early to tell, don't you think?
[08:45:35] the diff is coming down
[08:45:41] first few hours looked scary
[08:45:43] I remain skeptical: I only see ASICs being reasonable if they are already as ubiquitous as smartphones
[08:45:46] yes, so far so good
[08:46:01] we knew the diff would not come down until after block 75
[08:46:10] yes
[08:46:22] but first few hours it looks like only 5% hashrate left
[08:46:27] looked
[08:46:29] now it's better
[08:46:51] the next worry is: when will "unexplainable" hashrate come back?
[08:47:00] you hope 2-3 months? more?
[08:47:05] so give it another couple of days. will probably overshoot to the downside, and then rise a bit as miners get updated and return
[08:47:22] 3 months minimum turnaround, yes
[08:47:28] nah
[08:47:36] don't underestimate asicmakers :)
[08:47:54] you guys don't get #1 priority on chip fabs
[08:47:56] 3 months = 90 days. do you know what is happening in those 90 days exactly? I'm pretty sure you don't. same thing as before.
[08:48:13] we don't do any secret chips btw
[08:48:21] 3 months assumes they had a complete design ready to go, and added the last minute change in 1 day
[08:48:24] do you know who is behind the hashrate that is now bricked?
[08:48:27] innosilicon?
[08:48:34] hyc: no no, and no. :)
[08:48:44] hyc: have you designed or taped out a chip before?
[08:48:51] yes, many years ago
[08:49:10] then you should know that 90 days is not a fixed number
[08:49:35] sure, but like I said, other makers have greater demand
[08:49:35] especially not if you can prepare, if you just have to modify something, or you have more programmability in the chip than some people assume
[08:50:07] we are chipmakers, we would never dare to do what you guys are doing with CNv4 :) but maybe that just means you are cooler!
[08:50:07] and yes, programmability makes some aspect of turnaround easier
[08:50:10] all fine
[08:50:10] I hope it works!
[08:50:28] do you know who is behind the hashrate that is now bricked?
[08:50:29] inno?
[08:50:41] we suspect so, but have no evidence
[08:50:44] maybe we can try to find them, but we cannot spend too much time on this
[08:50:53] it's probably not so much of a secret
[08:51:01] why should it be, right?
[08:51:10] devs want this cat-and-mouse game? devs get it...
[08:51:35] there was one leak saying it's innosilicon
[08:51:36] so you think 3 months, ok
[08:51:43] inno is cool
[08:51:46] good team
[08:51:49] IP design house
[08:51:54] in Wuhan
[08:52:06] they send their people to conferences with fake biz cards :)
[08:52:19] pretending to be other companies?
[08:52:26] sure
[08:52:28] ha ha
[08:52:39] so when we see them, we look at whatever card they carry and laugh :)
[08:52:52] they are perfectly suited for secret mining games
[08:52:59] they made at most $6 million in 2 months of mining, so I wonder if it was worth it
[08:53:10] yeah. no way to know
[08:53:15] but it's good that you calculate!
[08:53:24] this is all about cost/benefit
[08:53:25] then you also understand - imagine the value of XMR goes up 5x, 10x
[08:53:34] that whole "asic resistance" thing will come down like a house of cards
[08:53:41] I would imagine they sell immediately
[08:53:53] the investor may fully understand the risk
[08:53:57] the buyer
[08:54:13] it's not healthy, but that's another discussion
[08:54:23] so mid-June
[08:54:27] let's see
[08:54:49] I would be susprised if CNv4 ASICs show up at all
[08:54:56] surprised*
[08:54:56] why?
[08:55:05] is only an economic question
[08:55:12] yeah should be interesting. FPGAs will be near their limits as well
[08:55:16] unless XMR goes up a lot
[08:55:19] no, not *only*. it's also a technology question
[08:55:44] you believe CNv4 is "asic resistant"? which feature?
[08:55:53] it's not
[08:55:59] cnv4 = RandomX ?
[08:56:03] no
[08:56:07] cnv4=cryptinight/r
[08:56:11] ah
[08:56:18] CNv4 is the one we have now, I think
[08:56:21] since yesterday
[08:56:30] it's plenty enough resistant for current XMR price
[08:56:45] that may be, yes!
[08:56:55] I look at daily payouts. XMR = ca. 100k USD / day
[08:57:03] it can hold until October, but it's not asic resistant
[08:57:23] well, last 24h only 22,442 USD :)
[08:57:32] I think 80 h/s per watt ASICs are possible for CNv4
[08:57:38] linzhi-sonia where do you produce your chips? TSMC?
[08:57:44] I'm cruious how you would expect to build a randomX ASIC that outperforms ARM cores for efficiency, or Intel cores for raw speed
[08:57:48] curious
[08:58:01] yes, tsmc
[08:58:21] Our team did the world's first bitcoin asic, Avalon
[08:58:25] and upcoming 2nd gen Ryzens (64-core EPYC) will be a blast at RandomX
[08:58:28] designed and manufactured
[08:58:53] still being marketed?
[08:59:03] linzhi-sonia: do you understand what xmr wants to achieve, community-wise?
[08:59:14] Avalon? as part of Canaan Creative, yes I think so.
[08:59:25] there's not much interesting oing on in SHA256
[08:59:29] Inge-: I would think so, but please speak
[08:59:32] hyc: yes
[09:00:28] linzhi-sonia: i am curious to hear your thoughts. I am fairly new to this space myself...
[09:00:51] oh
[09:00:56] we are grandpas, and grandmas
[09:01:36] yet I have no problem understanding why ASICS are currently reviled.
[09:01:48] xmr's main differentiators to, let's say btc, are anonymity and fungibility
[09:01:58] I find the client terribly slow btw
[09:02:21] and I think the asic-forking since last may is wrong, doesn't create value and doesn't help with the project objectives
[09:02:25] which "the client" ?
[09:02:52] Monero GUI client maybe
[09:03:12] MacOS, yes
[09:03:28] What exactly is slow?
[09:03:30] linzhi-sonia: I run my own node, and use the CLI and Monerujo. Have not had issues.
[09:03:49] staying in sync
[09:03:49] linzhi-sonia: decentralization is also a key principle
[09:03:56] one that Bitcoin has failed to maintain
[09:04:39] hmm
[09:05:00] looks fairly decentralized to me. decentralization is the result of 3 goals imo: resilient, trustless, permissionless
[09:05:28] don't ask a hardware maker about physical decentralization. that's too ideological. we focus on logical decentralization.
[09:06:11] physical decentralization is important. with bulk of bitnoin mining centered on Chinese hydroelectric dams
[09:06:19] have you thought about including block data in the PoW?
[09:06:41] yes, of course.
[09:07:39] is that already in an algo?
[09:08:10] hyc: about "centered on chinese hydro" - what is your source? the best paper I know is this: https://coinshares.co.uk/wp-content/uploads/2018/11/Mining-Whitepaper-Final.pdf
[09:09:01] linzhi-sonia: do you mine on your ASICs before you sell them?
[09:09:13] besides testing of course
[09:09:45] that paper puts Chinese btc miners at 60% max
[09:10:05] tevador: I think everybody learned that that is not healthy long-term!
[09:10:16] because it gives the chipmaker a cost advantage over its own customers
[09:10:33] and cost advantage leads to centralization (physical and logical)
[09:10:51] you guys should know who finances progpow and why :)
[09:11:05] but let's not get into this, ha ha. want to keep the channel civilized. right OhGodAGirl ? :)
[09:11:34] tevador: so the answer is no! 100% and definitely no
[09:11:54] that "self-mining" disease was one of the problems we have now with asics, and their bad reputation (rightfully so)
[09:13:08] I plan to write a nice short 2-page paper or so on our chip design process. maybe it's interesting to some people here.
[09:13:15] basically the 5 steps I mentioned before, from math to physical
[09:13:32] linzhi-sonia: the paper you linked puts 48% of bitcoin mining in Sichuan. the total in China is much more than 60%
[09:13:38] need to run it by a few people to fix bugs, will post it here when published
[09:14:06] hyc: ok! I am just sharing the "best" document I know today. it definitely may be wrong and there may be a better one now.
[09:14:18] hyc: if you see some reports, please share
[09:14:51] hey I am really curious about this: where is a PoW algo that puts block data into the PoW?
[09:15:02] the previous paper I read is from here http://hackingdistributed.com/2018/01/15/decentralization-bitcoin-ethereum/
[09:15:38] hyc: you said that already exists? (block data in PoW)
[09:15:45] it would make verification harder
[09:15:49] linzhi-sonia: https://the-eye.eu/public/Books/campdivision.com/PDF/Computers%20General/Privacy/bitcoin/meh/hashimoto.pdf
[09:15:51] but for chips it would be interesting
[09:15:52] we discussed the possibility about a year ago https://www.reddit.com/Monero/comments/8bshrx/what_we_need_to_know_about_proof_of_work_pow/
[09:16:05] oh good links! thanks! need to read...
[09:16:06] I think that paper by dryja was original
[09:17:53] since we have a nice flow - second question I'm very curious about: has anyone thought about in-protocol rewards for other functions?
[09:18:55] we've discussed micropayments for wallets to use remote nodes
[09:18:55] you know there is a lot of work in other coins about STARK provers, zero-knowledge, etc. many of those things very compute intense, or need to be outsourced to a service (zether). For chipmakers, in-protocol rewards create an economic incentive to accelerate those things.
[09:19:50] whenever there is an in-protocol reward, you may get the power of ASICs doing something you actually want to happen
[09:19:52] it would be nice if there was some economic reward for running a fullnode, but no one has come up with much more than that afaik
[09:19:54] instead of fighting them off
[09:20:29] you need to use asics, not fight them. that's an obvious thing to say for an asicmaker...
[09:20:41] in-protocol rewards can be very powerful
[09:20:50] like I said before - unless the ASICs are so useful they're embedded in every smartphone, I dont see them being a positive for decentralization
[09:21:17] if they're a separate product, the average consumer is not going to buy them
[09:21:20] now I was talking about speedup of verifying, signing, proving, etc.
[09:21:23] they won't even know what they are
[09:22:07] if anybody wants to talk about or design in-protocol rewards, please come talk to us
[09:22:08] the average consumer also doesn't use general purpose hardware to secure blockchains either
[09:22:14] not just for PoW, in fact *NOT* for PoW
[09:22:32] it requires sw/hw co-design
[09:23:10] we are in long-term discussions/collaboration over this with Ethereum, Bitcoin Cash. just talk right now.
[09:23:16] this was recently published though suggesting more uptake though I guess https://btcmanager.com/college-students-are-the-second-biggest-miners-of-cryptocurrency/
[09:23:29] I find it pretty hard to believe their numbers
[09:24:03] well
[09:24:09] sorry, original article: https://www.pcmag.com/news/366952/college-kids-are-using-campus-electricity-to-mine-crypto
[09:24:11] just talk, no? rumors
[09:24:18] college students are already more educated than the average consumer
[09:24:29] we are not seeing many such customers anymore
[09:24:30] it's data from cisco monitoring network traffic
[09:24:33] and they're always looking for free money
[09:24:48] of course anyone with "free" electricity is inclined to do it
[09:24:57] but look at the rates, cannot make much money
[09:26:06] Ethereum is a bloated collection of bugs wrapped in a UI. I suppose they need all the help they can get
[09:26:29] Bitcoin Cash ... just another get rich quick scheme
[09:26:38] hmm :)
[09:26:51] I'll give it back to you, ok? ha ha. arrogance comes before the fall...
[09:27:17] maye we should have a little fun with CNv4 mining :)
[09:27:25] ;)
[09:27:38] come on. anyone who has watched their track record... $75M lost in ETH at DAO hack
[09:27:50] every smart contract that comes along is just waiting for another hack
[09:27:58] I just wanted to throw out the "in-protocol reward" thing, maybe someone sees the idea and wants to cowork. maybe not. maybe it's a stupid idea.
[09:29:18] linzhi-sonia: any thoughts on CN-GPU?
[09:29:55] CN-GPU has one positive aspect - it wastes chip area to implement all 18 hash algorithms
[09:30:19] you will always hear roughly the same feedback from me:
[09:30:52] "This algorithm very different, it heavy use floating point operations to hurt FPGAs and general purpose CPUs"
[09:30:56] the problem is, if it's profitable for people to buy ASIC miners and mine, it's always more profitable for the manufacturer to not sell and mine themselves
[09:31:02] "hurt"
[09:31:07] what is the point of this?
[09:31:15] it totally doesn't work
[09:31:24] you are hurting noone, just demonstrating lack of ability to think
[09:31:41] what is better: algo designed for chip, or chip designed for algo?
[09:31:43] fireice does it on daily basis, CN-GPU is a joke
[09:31:53] tevador: that's not really true, especially in a market with such large price fluctuations as cryptocurrency
[09:32:12] it's far less risky to sell miners than mine with them and pray that price doesn't crash for next six months
[09:32:14] I think it's great that crypto has a nice group of asicmakers now, hw & sw will cowork well
[09:32:36] jwinterm yes, that's why they premine them and sell after
[09:32:41] PoW is about being thermodynamically and cryptographically provable
[09:32:45] premining with them is taking on that risk
[09:32:49] not "fork when we think there are asics"
[09:32:51] business is about risk minimization
[09:32:54] that's just fear-driven
[09:33:05] Inge-: that's roughly the feedback
[09:33:24] I'm not saying it hasn't happened, but I think it's not so simple as saying "it always happens"
[09:34:00] jwinterm: it has certainly happened on BTC. and also on XMR.
[09:34:19] ironically, please think about it: these kinds of algos indeed prove the limits of the chips they were designed for. but they don't prove that you cannot implement the same algo differently! cannot!
[09:34:26] Risk minimization is not starting a business at all.
[09:34:34] proof-of-gpu-limit. proof-of-cpu-limit.
[09:34:37] imagine you have a money printing machine, would you sell it?
[09:34:39] proves nothing for an ASIC :)
[09:35:05] linzhi-sonia: thanks. I dont think anyone believes you can't make a more efficient cn-gpu asic than a gpu - but that it would not be orders of magnitude faster...
[09:35:24] ok
[09:35:44] like I say. these algos are, that's really ironic, designed to prove the limitations of a particular chip in the mind of the designer
[09:35:50] exactly the wrong way round :)
[09:36:16] like the cache size in RandomX :)
[09:36:18] beautiful
[09:36:29] someone looked at GPU designs
[09:37:31] linzhi-sonia can you elaborate? Cache size in RandomX was selected to fit CPU cache
[09:37:52] yes
[09:38:03] too large for GPU
[09:38:11] as I said, we are designing the algorithm to exactly fit CPU capabilities, I do not claim an ASIC cannot be more efficient
[09:38:16] ok!
[09:38:29] when will you do the audit?
[09:38:35] will the results be published in a document or so?
[09:38:37] I claim that single-chip ASIC is not viable, though
[09:39:06] you guys are brave, noone disputes that. 3 anti-asic hardforks now!
[09:39:18] 4th one coming
[09:39:31] 3 forks were done not only for this
[09:39:38] they had scheduled updates in the first place
[09:48:10] Monero is the #1 anti-asic fighter
[09:48:25] Monero is #1 for a lot of reasons ;)
[09:48:40] It's the coin with the most hycs.
[09:48:55] mooooo
[09:59:06] sneaky integer overflow, bug squished
[10:38:00] p0nziph0ne ([email protected]/vpn/privateinternetaccess/p0nziph0ne) has joined #monero-pow
[11:10:53] The convo here is wild
[11:12:29] it's like geo-politics at the intersection of software and hardware manufacturing for thermoeconomic value.
[11:13:05] ..and on a Sunday.
[11:15:43] midipoet: hw and sw should work together and stop silly games to devalue each other. to outsiders this is totally not attractive.
[11:16:07] I appreciate the positive energy here to try to listen, learn, understand.
[11:16:10] that's a start
[11:16:48] <-- p0nziph0ne ([email protected]/vpn/privateinternetaccess/p0nziph0ne) has quit (Quit: Leaving)
[11:16:54] we won't do silly mining against xmr "community" wishes, but not because we couldn't do it, but because it's the wrong direction in the long run, for both sides
[11:18:57] linzhi-sonia: I agree to some extent. Though, in reality, there will always be divergence between social worlds. Not every body has the same vision of the future. Reaching societal consensus on reality tomorrow is not always easy
[11:20:25] absolutely. especially at a time when there is so much profit to be made from divisiveness.
[11:20:37] someone will want to make that profit, for sure
[11:24:32] Yes. Money distorts.
[11:24:47] Or wealth...one of the two
[11:26:35] Too much physical money will distort rays of light passing close to it indeed.
submitted by jwinterm to Monero [link] [comments]

Wondering if it’s worth it to upgrade my current PC or start over?

So I built my computer when bitcoin mining was still pretty popular and I paid more than I should have for some of the things in the build which meant I couldn’t get exactly what I wanted for the budget I had.
Now I am having issues with my CPU and Drive constantly running at 100% and my computer being slow in general sometimes.
I want to be able to run modern and upcoming AAA titles without huge issues. The thing is, I'm not experienced enough to know which parts I need to replace, or whether replacing one part means I'd have to replace others along with it.
So here’s my build:
OS - Windows 10
Motherboard - MSI B250M Bazooka (MS-7A70)
CPU/Processor - Intel Core i5 7500 @ 3.40ghz
GPU/Graphics Card - Zotac GeForce GTX 1050-Ti 4gb
Ram - Crucial Ballistix Sport DDR4 2400 C16 2x4gb
Hard Drive - WD Blue 1 TB
Case - DeepCool Tesseract Mid-Tower Case (Blue/Black)
Keyboard - Razer Cynosa Chroma
Mouse - Steel series Rival 300 Mouse
Mouse Pad - Razer Goliathus Chroma
Speakers/Subwoofer - Logitech Z313 2-1 channel 3 piece speaker system with subwoofer
Monitor: Lenovo LI2264D wide 21.5 inch
So if you guys could help me out and let me know what I could replace to make this bad boy run better and do what I want it to do, or if you think I should just start over, that would be great. Thanks a bunch
submitted by no312 to buildapc [link] [comments]

I literally have tens of thousands of dollars in top-shelf hardware, looking to repurpose some before selling on eBay to build a NAS system, possibly a dedicated firewall device as well. o_O

Q1) What will you be doing with this PC? Be as specific as possible, and include specific games or programs you will be using.**

A1) This will be a dedicated NAS system for my home network. As such, I'm looking to have it:

- Host ##TB's of 720p, 1080p & up resolution Movies and TV Shows I'm about to begin ripping from a MASSIVE DVD & Blu-ray collection I have.

- My kids are big on Minecraft. I understand it's possible to host your own "worlds" (or whatever they call the maps you can build) on your own "server". I think it would be pretty neat to offer them (& their friends - if it can be done 'safely/securely') their own partition on one of my NAS HDD's. (A minimal server-launcher sketch follows this list.)

- I also have accounts with a couple diff VPN companies... I understand it's possible (?) to sync said VPN's with a NAS, this might be a more relative topic on the next point/purpose...

- I'd like to be able to remotely link to this NAS for when I travel overseas and want to stream at my temp location from my house/this NAS.
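Since the second bullet above asks about hosting Minecraft worlds from the NAS, here is a minimal sketch of one way that could look: a tiny Python launcher that starts a vanilla Java server with its world directory on a NAS mount. The paths, memory flags and the assumption that server.jar has already been downloaded are all illustrative, not a recommendation for this specific build.

```python
# Minimal sketch: launch a vanilla Minecraft Java server whose world data lives
# on a NAS share. Paths, RAM limits and the presence of server.jar are assumptions.
import subprocess
from pathlib import Path

WORLD_DIR = Path("/mnt/nas/minecraft/kids_world")   # hypothetical NAS mount point
SERVER_JAR = WORLD_DIR / "server.jar"               # official server jar, downloaded separately

def start_server(min_ram="1G", max_ram="2G"):
    """Run the server inside the world directory so saves land on the NAS."""
    cmd = ["java", f"-Xms{min_ram}", f"-Xmx{max_ram}", "-jar", str(SERVER_JAR), "nogui"]
    # server.properties, whitelist.json, world/ etc. are read from and written
    # to the working directory, which is why cwd points at the NAS share.
    return subprocess.Popen(cmd, cwd=WORLD_DIR)

if __name__ == "__main__":
    start_server().wait()
```

Enabling the whitelist in server.properties and only opening the port to known players would go a long way toward the 'safely/securely' part for the kids' friends.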
______________________
Q2) What is your maximum budget before rebates/shipping/taxes?**

* A2) Here's where I make matters more complicated than most others would... I've been an advocate for Bitcoin and crypto-currencies in general since 2013. I invested in a small mining outfit back in 2014 (strictly Bitcoin/ASIC's). One of my buddies is the President of a large-scale mining operation (foreign and domestic) and he convinced me to dabble in the GPU mining-space. I made my first hardware purchase in Q4, 2017 and launched a small-scale GPU-Farm in my house since then. I had the rigs mining up until Q3 of 2018 (not cost-efficient to keep on, especially living in SoFlo) and since then, the hardware's been collecting dust (& pissing off my family members since they lost access to 3X rooms in the house - I won't let anyone go near my gear). One of my New Years Resolutions for 2019 was to clear out the house of all my mining equipment so that's all about to go up on eBay. So "budget" is relative to whatever I "MUST" spend if I can't repurpose any of the parts I already have on hand for this build... (Anyone having something I "need" and is looking to barter for one of the items I'll list later on in here, LMK).
______________________
Q3) When do you plan on building/buying the PC? Note: beyond a week or two from today means any build you receive will be out of date when you want to buy.**

A3) IMMEDIATELY! :)
______________________
Q4) What, exactly, do you need included in the budget? (Tower/OS/monitor/keyboard/mouse/etc)**

A4) Well I had a half-assed idea approximately 1 year ago that it might be wise to build a bunch of 'gaming rigs' to sell on eBay with my intended repurposed mining hardware so I went on a shopping spree for like 6 months. That said; I've got a plethora of various other components that aren't even unboxed yet. 90% of the items I've purchased for this additional project were items that were marked down via MIR (mail-in-rebates) & what-not...
AFAIK, there are only 3X items I absolutely do not have which I 'MUST' find. Those would be - 1) Motherboard which accepts "ECC RAM". 2) CPU for said MOBO. 3) Said "ECC RAM".
______________________
Q5) Which country (and state/province) will you be purchasing the parts in? If you're in US, do you have access to a Microcenter location?**

A5) I'm located in Southwest Florida. No Microcenter's here. Best Buy is pretty much my only option although I am a member of Newegg, Amazon & Costco if that makes any difference?
______________________
Q6) If reusing any parts (including monitor(s)/keyboard/mouse/etc), what parts will you be reusing? Brands and models are appreciated.**

A6) In an attempt to better clean up this Q&A, I'm going to list the items I have on-hand at the end of this questionnaire in-case passers-by feel like this might be a TLDR.* (Scroll to the bottom & you'll see what I mean).
______________________
Q7) Will you be overclocking? If yes, are you interested in overclocking right away, or down the line? CPU and/or GPU?**

A7) I don't think that's necessary for my intended purpose although - I'm not against it if that helps & FWIW, I'm pretty skilled @ this task already (it's not rocket science).
______________________
Q8) Are there any specific features or items you want/need in the build? (ex: SSD, large amount of storage or a RAID setup, CUDA or OpenCL support, etc)**

A8) As stated in A4; ECC RAM is non-negotiable... RAID seems like a logical application here as well.

- This will predominantly be receiving commands from macOS computers. I don't think that matters really but figured it couldn't hurt to let you guys know.

- I'd also be quite fond of implementing "PFSENSE" (or something of that caliber) applied to this system so I could give my Netgear Nighthawks less stress in that arena, plus my limited understanding of PFSENSE is that its ability to act as a firewall runs circles around anything that comes with consumer-grade Wi-Fi routers (like my Nighthawks). Just the same, I'm open to building a second rig just for the firewall.

- Another desirable feature would be that it draws as little electricity from the wall as possible. (I'm EXTREMELY skilled in this arena. I have "Kill-A-Watts" to test/gauge on, as well as an intimate understanding of the differences between Silver, Gold, Platinum and Titanium rated PSU's. As well as having already measured each of the PSU's I have on-hand and taken note of the 'target TDP draw' ("Peak Power Efficiency Draw") each one offers when primed with X amount of GPU's when I used them for their original purpose.)

- Last, but not least, sound (as in noise created from the rig). I'd like to prop this device up on my entertainment center in the living room. I've (almost) all of the top-shelf consumer grade products one could dream of regarding fans and other thermal-related artifacts.

- Almost forgot; this will be hosting to devices on the KODI platform (unless you guys have better alternative suggestions?)
______________________
Q9) Do you have any specific case preferences (Size like ITX/microATX/mid-tower/full-tower, styles, colors, window or not, LED lighting, etc), or a particular color theme preference for the components?**

A9) Definitely! Desired theme would be WHITE. If that doesn't work for whatever reason, black or gray would suffice. Regarding "Case Size". Nah, that's not too important although I don't foresee a mini-ITX build making sense if I'm going to be cramming double digit amounts of TB in the system, Internal HDD's sounds better than a bunch of externals plugged in all the USB ports.
______________________
Q10) Do you need a copy of Windows included in the budget? If you do need one included, do you have a preference?**

A10) I don't know. If I do need a copy of Windows, I don't have one so that's something I'll have to consider I guess. I doubt that's a necessity though.
______________________
______________________
______________________
**Extra info or particulars:*\*

AND NOW TO THE FUN-STUFF... Here's a list of everything (PARTS PARTS PARTS) I have on-hand and ready to deploy into the wild &/or negotiate a trade/barter with:

CASES -
Corsair Carbide Series Air 540 Arctic White (Model# CC-9011048-WW) - (Probably my top pick for this build).
Cooler Master HAF XB EVO (This is probably my top 1st or 2nd pick for this build, the thing is a monster!).
Cooler Master Elite 130 - Mini ITX - Black
Cooler Master MasterBox 5 MID-Tower - Black & White
Raidmax Sigma-TWS - ATX - White
MasterBox Lite 5 - ATX - Black w/ diff. Colored accent attachments (included with purchase)
NZXT S340 Elite Matte White Steel/Tempered Glass Edition
EVGA DG-76 Alpine White - Mid Tower w/ window
EVGA DG-73 Black - Mid Tower w/ window (I have like 3 of these)

______________________
CPU's -
*** 7TH GEN OR BELOW INTEL's (Code Name/Class mentioned next to each one) ***
Pentium G4400 (Skylake @54W TDP) - Intel ARK states is "ECC CAPABLE"
Celeron G3930 (Kaby Lake @ 51W TDP) - Intel ARK states is "ECC CAPABLE" :)
i5 6402P (Skylake @65W TDP) - Intel ARK states is "NOT ECC CAPABLE" :(
i5 6600k (Skylake @ 91W TDP) - Intel ARK states is "NOT ECC CAPABLE" :(
i7 6700 (Skylake @ 65W TDP) - Intel ARK states is "NOT ECC CAPABLE" :(
i7 7700k (Kaby Lake @ 95W TDP) - Intel ARK states is "NOT ECC CAPABLE" :(


*** 8TH GEN INTEL's ***
i3-8350K (Coffee Lake @91W TDP) - Intel ARK states is "ECC FRIENDLY" :)
I5-8600K (Coffee Lake @95W TDP) - Intel ARK states is "NOT ECC CAPABLE" :(


*** AMD RYZEN's ***
Ryzen 3 2200G
Ryzen 5 1600
Ryzen 7 1700X

______________________
MOTHERBOARDS -

*** 7TH GEN AND BELOW INTEL BASED MOBO'S ***
MSI Z170A-SLI
ASUS PRIME Z270-A
ASUS PRIME Z270-P
ASUS PRIME Z270-K
EVGA Z270 Stinger
GIGABYTE GA-Z270XP-SLI
MSI B150M ARCTIC
MSI B250M MICRO ATX (PRO OPT. BOOST EDITION)

*** 8TH GEN INTEL BASED MOBO'S ***
EVGA Z370 FTW
GIGABYTE Z370XP SLI (Rev. 1.0)
MSI Z370 SLI PLUS


*** AMD RYZEN BASED MOBO'S ***
ASUS ROG STRIX B350-F GAMING
MSI B350 TOMAHAWK
MSI X370 GAMING PRO
ASROCK AB350M PRO4
______________________


RAM -

Way too many to list, nothing but 4 & 8GB DDR4 sticks and unfortunately, none are ECC so it's not even worth mentioning/listing these unless someone reading this is willing to barter. At which time I'd be obliged to send an itemized list or see if I have what they're/you're specifically looking for.
______________________
THERMAL APPLICATIONS/FANS -
JUST FANS -
BeQuiet -
Pure Wings 2 (80mm)
Pure Wings 2 (120mm)
Pure Wings 2 (140mm)
Silent Wings 3 PWM (120mm)

NOCTUA -
PoopBrown - NF-A20 PWM (200mm) Specifically for the BIG "CoolerMaster HAF XB EVO" Case
GREY - NF-P12 Redux - 1700RPM (120mm) PWM
Corsair -
Air Series AF120LED (120mm)

CPU COOLING SYSTEMS -
NOCTUA -
NT-HH 1.4ml Thermal Compound
NH-D15 6 Heatpipe system (this thing is the tits)

EVGA (Extremely crappy coding in the software here; I'm like 99.99% sure these will be problematic if I were to try and use them in any OS outside of Windows, because they barely ever work in the intended Windows as it is).
CLC 240 (240mm water-cooled system)
CRYORIG -
Cryorig C7 Cu (Low-Profile Copper Edition*)

A few other oversized CPU cooling systems I forget off the top of my head but a CPU cooler is a CPU cooler after comparing to the previous 3 models I mentioned.
I almost exclusively am using these amazing "Innovation Cooling Graphite Thermal Pads" as an alternative to thermal paste for my CPU's. They're not cheap but they literally last forever.

NZXT - Sentry Mesh Fan Controller
______________________
POWER SUPPLIES (PSU's) -
BeQuiet 550W Straight Power 11 (GOLD)

EVGA -
750P2 (750W, Platinum)
850P2 (850W, Platinum)
750T2 (750W, TITANIUM - yeah baby, yeah)

ROSEWILL -
Quark 750W Platinum
Quark 650W Platinum

SEASONIC -
Focus 750W Platinum
______________________
STORAGE -
HGST Ultrastar 3TB - 64mb Cache - 7200RPM Sata III (3.5)
4X Samsung 860 EVO 500GB SSD's
2X Team Group L5 LITE 3D 2.5" SSD's 480GB
2X WD 10TB Essential EXT (I'm cool with shucking)
+ 6X various other external HDD's (from 4-8TB) - (Seagate, WD & G-Drives)
______________________

Other accessories worth mentioning -
PCI-E to 4X USB hub-adapter (I have a dozen or so of these - might not be sufficient enough &/or needed but again, 'worth mentioning' in case I somehow ever run out of SATA & USB ports and have extra external USB HDD's. Although, I'm sure there would be better suited components if I get to that point that probably won't cost all that much).
______________________
______________________
______________________
Needless to say, I have at least 1X of everything mentioned above. In most all cases, I have multiples of these items but obviously won't be needing 2X CPU's, Cases, etc...

Naturally, I have GPU's. Specifically;

At least 1X of every. Single. NVIDIA GTX 1070 TI (Yes, I have every variation of the 1070 ti made by MSI, EVGA and Zotac. The only brand I don't have is the Gigabyte line. My partners have terrible experience with those so I didn't even bother. I'm clearly not going to be needing a GPU for this build but again, I'm cool with discussing the idea of a barter if anyone reading this is in the market for one.

I also have some GTX 1080 TI's but those are already spoken for, sorry.

It's my understanding that select CPU's I have on this list are ECC Friendly and AFAIK, only 1 of my MOBO's claims to be ECC Friendly (The ASROCK AB350M PRO4), but for the life of me, I can't find any corresponding forums that confirm this and/or direct me to a listing where I can buy compatible RAM. Just the same, if I go w/ the ASROCK MOBO, that means I'd be using one of the Ryzens. Those are DEF. power hungry little buggers. Not a deal-breaker, just hoping to find something a little more conservative in terms of TDP.


In closing, I don't really need someone to hold my hand with the build part as much as figuring out which motherboard, CPU and RAM to get. Then I'm DEFINITELY going to need some guidance on what OS is best for my desired purpose. If building 2X Rigs makes sense, I'm totally open to that as well...
Rig 1 = EPIC NAS SYSTEM
Rig 2 = EPIC PFSENSE (or the like) DEDICATED FIREWALL

Oh, I almost forgot... The current routers I'm using are...
1X Netgear Nighthawk 6900P (Modem + Router)
1X Netgear Nighthawk X6S (AC 4000 I believe - Router dedicated towards my personal devices - no IoT &/or Guests allowed on this one)
1X TP-Link Archer C5 (Router). Total overkill after implementing the Nighthawks but this old beast somehow has the best range, plus it has 2X USB ports so for now, it's dedicated towards my IoT devices.
---- I also have a few other Wi-Fi routers (Apple Airport Extreme & some inferior Netgear's but I can only allocate so many WiFi Routers to so many WiFi channels w/out pissing off my neighbors) On that note, I have managed to convince my neighbors to let me in their house/WiFi configuration so we all have our hardware locked on specific, non-competing frequencies/channels so everyone's happy. :)


Please spare me the insults as I insulted myself throughout this entire venture. Part of why I did this was because when I was a kid, I used to fantasize about building a 'DREAM PC' but could never afford such. To compensate for this deficiency, I would actually print out the latest and greatest hardware components on a word document, print the lists up & tape to wall (for motivation). I was C++ certified at the age of 14 and built my first PC when I was 7. At the age of 15 I abandoned all hope in the sector and moved on to other aspirations. This entire ordeal was largely based off me finally fulfilling a childhood fantasy. On that note = mission accomplished. Now if I'm actually able to fulfill my desires on this post, I'm definitely going to feel less shitty about blowing so much money on all this stuff over the last couple years.

TIA for assisting in any way possible. Gotta love the internets!


THE END.
:)

EDIT/UPDATE (5 hours after OP) - My inbox is being inundated with various people asking for prices and other reasonable questions about my hardware being up for sale. Not to be redundant but rather to expound on my previous remarks about 'being interested in a bartetrade' with any of you here...

I did say I was going to sell my gear on eBay in the near future, I also said I wanted to trade/barter for anything relative to helping me accomplish my OP's mission(s). I'm not desperate for the $$$ but I'm also not one of those people that likes to rip other people off. That said; I value my time and money invested in this hardware and I'm only willing to unload it all once I've established I have ZERO need for any of it here in my home first. Hence my writing this lengthy thread in an attempt to repurpose at least a grand or two I've already spent.

One of the most commonly asked questions I anticipate receiving from interested bodies is going to be "How hard were you on your hardware?" Contrary to what anyone else would have probably done in my scenario which is say they were light on it whether they were or weren't, I documented my handling of the hardware, and have no problem sharing such documentation with verified, interested buyers (WHEN THE TIME COMES) to offer you guys peace of mind.

I have photo's and video's of the venture from A-Z. I am also obliged to provide (redacted) electricity bill statements where you can correlate my photo's (power draw on each rig), and also accurately deduct the excess power my house consumed with our other household appliances. Even taking into consideration how much (more) I spent in electricity from keeping my house at a constant, cool 70-72F year-round (via my Nest thermostat). Even without the rigs, I keep my AC @ 70 when I'm home and for the last 1.5-2 years, I just so happened to spend 85% of my time here at my house. When I would travel, I'd keep it at 72 for my wife & kids.
Additionally, I had each GPU 'custom' over/underclocked (MSI Afterburner for all GPU's but the EVGA's).
I doubt everyone reading this is aware so this is for those that don't.... EVGA had the brilliant idea of implementing what they call "ICX technology" in their latest NVIDIA GTX GPU's. The short(est) explanation of this "feature" goes as follows:

EVGA GPU's w/ "ICX 9 & above" have EXTRA HEAT/THERMAL SENSORS. Unlike every other GTX 1070 ti on the market, the one's with this feature actually have each of 2/2 on-board fans connected to individual thermal sensors. Which means - if you were to use the MSI Afterburner program on one of these EVGA's and create a custom fan curve for it, you'd only be able to get 1/2 of the fans to function the way intended. The other fan simply would not engage as the MSI Afterburner software wasn't designed/coded to recognize/ communicate with an added sensor (let alone sensor'S). This, in-turn, would likely result in whoever's using it the unintended way having a GPU defect on them within the first few months I'd imagine... Perhaps if they had the TDP power settings dumbed down as much as I did (60-63%), they might get a year or two out of it since it wouldn't run as near as hot, but I doubt any longer than that since cutting off 50% of the cooling system on one of these can't be ignored too long, surely capacitors would start to blow and who knows what else...
(Warning = RANT) Another interesting side-note about the EVGA's and their "Precision-X" Over/Underclocking software is that it's designed to only recognize 4X GPU's on a single system. For miners, that's just not cool. My favorite builds had 8X and for the motherboards that weren't capable of maintaining stable sessions on 8, I set up with 6X. Only my EVGA Rigs had 3 or 4X GPU's dedicated to a single motherboard. Furthermore, and as stated in an earlier paragraph, (& this is just my opinion) = EVGA SOFTWARE SUCKS! Precision X wasn't friendly with every motherboard/CPU I threw at it and their extension software for the CLC Close-Loop-Cooling/ CPU water-coolers simply didn't work on anything, even integrating into their own Precision-X software. The amount of time it took me to finally find compatible matches with that stuff was beyond maddening. (END RANT).
Which leads me to my other comments on the matter. That's what I had every single 1070 ti set at for TDP = 60-63%. Dropping the power load that much allowed me to bring down (on average) each 1070 ti to a constant 110-115W (mind you, this is only possible w/ "Titanium" rated PSU's, Platinum comes pretty damn close to the Titanium though) while mining Ethereum and was still able to maintain a bottom of 30 MH/s and a ceiling of 32 MH/s. Increasing the TDP to 80, 90, 100% or more only increased my hashrates (yields) negligibly, like 35-36 MH/s TOPS, which also meant each one was not only pulling 160-180W+ (Vs. the aforementioned 115'ish range), it also meant my rigs were creating a significantly greater amount of heat! Fortunately for the GPU's and my own personal habits, I live in South Florida where it's hot as balls typically, last winter was nothing like this one. Increasing my yields by 10-15% didn't justify increasing the heat production in my house by >30%, nor the added electricity costs from subjecting my AC handlers to that much of an extra work-load. For anyone reading this that doesn't know/understand what I'm talking about - after spending no less than 2-3 hours with each. and. every. one. I didn't play with the settings on just one and universally apply the settings to the rest. I found the 'prime' settings and documented them with a label-maker and notepad. Here's the math in a more transparent manner:

*** I NEVER LET MY GPU's BREACH 61C, EVER. Only my 8X GPU rigs saw 60-61 & it was the ones I had in the center of the build (naturally). I have REALLY high power fans (used on BTC ASIC MINERS) that were sucking air from those GPU's which was the only way I was able to obtain such stellar results while mining with them. ***
Mining at "acceptable" heat temps (not acceptable to me, but most of the internet would disagree = 70C) and overclocking accordingly brings in X amount of yields per unit. =
'Tweaking' (underclocking) the GPU's to my parameters reduced my yield per unit by 10-15%, but it SAVED me well over 30-35% in direct electricity consumption, and an unknown amount of passive electricity consumption via creating approximately 20%+ less heat for my AC handler to combat.
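For anyone who wants to reproduce that kind of cap outside of MSI Afterburner, here is a small sketch that applies a board power limit to every detected NVIDIA card by shelling out to nvidia-smi. The 115 W figure just mirrors the numbers above; whether a particular card accepts a given value depends on its vBIOS limits, so treat it as an assumption, not a recommendation.

```python
# Sketch: cap the board power of every detected NVIDIA GPU via nvidia-smi.
# Needs the NVIDIA driver installed and root/admin rights. The 115 W target
# comes from the post above and is an assumption, not a recommendation.
import subprocess

TARGET_WATTS = 115

def gpu_indices():
    """Return the index of every NVIDIA GPU nvidia-smi can see."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=index", "--format=csv,noheader"],
        text=True,
    )
    return [int(line) for line in out.splitlines() if line.strip()]

def set_power_limit(index, watts):
    # -pm 1 enables persistence mode (Linux) so the setting sticks between jobs;
    # -pl sets the board power limit in watts for that GPU.
    subprocess.run(["nvidia-smi", "-i", str(index), "-pm", "1"], check=True)
    subprocess.run(["nvidia-smi", "-i", str(index), "-pl", str(watts)], check=True)

if __name__ == "__main__":
    for idx in gpu_indices():
        set_power_limit(idx, TARGET_WATTS)
        print(f"GPU {idx}: power limit set to {TARGET_WATTS} W")
```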

I say all this extra stuff not just for anyone interested in mining with their GPU's, but really to answer (in-depth) the apparent questions you people are asking me in PM's. Something else that should help justify my claims of being so conservative is the fact that I only have/used "Platinum and Titanium" rated PSU's. Heat production, power efficiency and longevity of the hardware were ALWAYS my top priority. I truly thought Crypto would continue to gain and/or recover and bounce back faster than it did. If this project had maintained positive income for 12 months+, I'd have expanded one of our sites to also cater to GPU mining on a gnarly scale.

Once I have my NAS (& possibly 2nd rig for the firewall) successfully built, I'll be willing/able to entertain selling you guys some/all of the remaining hardware prior to launching on eBay. If there's something you're specifically looking for that I listed having, feel free to PM me with that/those specific item(s). Don't count on an immediate response but what you can count on is me honoring my word in offering whoever asks first right of refusal when the time comes for me to sell this stuff. Fortunately for me, PM's are time-stamped so that's how I'll gauge everyone's place in line. I hope this extra edit answers most of the questions you guys wanted to have answered and if not, sorry I guess. I'll do my best to bring light to anything I've missed out on after I realize whatever that error was/is. The only way anyone is getting first dibs on my hardware otherwise is if they either offer compelling insight into my original questions, or have something I need to trade w/.

THE END (Round#2)


submitted by Im-Ne-wHere to buildapcforme [link] [comments]

Looking to upgrade/redo current PC build

I built my PC a couple years ago when bitcoin mining was popular and paid more than I should have for the build I got. Now I’m looking to either start fresh and reuse parts that are viable in a new build or to see if it’s just possible to upgrade some things to make my PC run better. Right now it’s not great for modern AAA games and the like.
What will you be doing with this PC? Be as specific as possible, and include specific games or programs you will be using.
What is your maximum budget before rebates/shipping/taxes?
  • I would prefer to keep it around 800 but can go up to 1000 if absolutely needed
When do you plan on building/buying the PC? Note: beyond a week or two from today means any build you receive will be out of date when you want to buy.
  • Around Black Friday/Cyber Monday most likely
What, exactly, do you need included in the budget? (ToweOS/monitokeyboard/mouse/etc)
  • I don’t necessarily need a monitokeyboard/mouse as I already have those but they are barebones so if I can include them in my budget I’d be willing to upgrade
Which country (and state/province) will you be purchasing the parts in? If you're in US, do you have access to a Microcenter location?
  • Michigan, USA. Yes I have a micro center near me
If reusing any parts (including monitor(s)/keyboard/mouse/etc), what parts will you be reusing? Brands and models are appreciated.
  • The parts listed below are all the ones I have in my build currently. I’m not sure if it would be worth it upgrading separate parts or building a whole new one and reusing what I can, so I included a full part list.
  • OS - Windows 10
  • Motherboard - MSI B250M Bazooka (MS-7A70)
  • CPU/Processor - Intel Core i5 7500 @ 3.40ghz
  • GPU/Graphics Card - Zotac GeForce GTX 1050-Ti 4gb
  • Ram - Crucial Ballistix Sport DDR4 2400 C16 2x4gb
  • Hard Drive - WD Blue 1 TB
  • Case - DeepCool Tesseract Mid-Tower Case (Blue/Black)
  • Keyboard - Razer Cynosa Chroma
  • Mouse - Steel series Rival 300 Mouse
  • Mouse Pad - Razer Goliathus Chroma
  • Speakers/Subwoofer - Logitech Z313 2-1 channel 3 piece speaker system with subwoofer
  • Monitor: Lenovo LI2264D wide 21.5 inch
Will you be overclocking? If yes, are you interested in overclocking right away, or down the line? CPU and/or GPU?
  • Not 100% sure, as I have no experience overclocking
Are there any specific features or items you want/need in the build? (ex: SSD, large amount of storage or a RAID setup, CUDA or OpenCL support, etc)
  • I would like an SSD but that’s all I can think of
Do you have any specific case preferences (Size like ITX/microATX/mid-tower/full-tower, styles, colors, window or not, LED lighting, etc), or a particular color theme preference for the components?
  • I would prefer a mid tower size just because I don’t have all the space in the world for a huge tower. See through window and rgb/blue theme if possible
Do you need a copy of Windows included in the budget? If you do need one included, do you have a preference?
  • No
Extra info or particulars:
  • I think I covered it all, thank you to anyone who helps me out!
submitted by no312 to buildapcforme [link] [comments]

Andreas Antonopoulos gets "Satoshi's Vision" completely wrong and shows his misunderstanding of the system. He thinks 1 cpu 1 vote means 1 user 1 vote, a common mistake from people on the Core side.

In this video at the 6m20s mark, Andreas Antonopoulos speaks about Satoshi's vision. He talks about "1 cpu 1 vote", saying that Satoshi designed the system to be as decentralized as possible, but Andreas completely misunderstands the meaning of 1 cpu 1 vote. He is falling into the common trap of conflating 1 cpu 1 vote with 1 user 1 vote.
Andreas, haven't you even read nChain's paper about PoW and the Theory of the Firm? A CPU is an economic resource:
One of the little-known aspects of bitcoin is the nature of the proof of work system. There are many people, especially those who support a UASF or PoW change that believe a distributed system should be completed as a mesh. In this, they confuse centralised systems with centrality. The truth of the matter, no matter which proof of work system is implemented, they all follow a maximal growth curve that reflects the nature of the firm as detailed in 1937 by Ronald Coase (1937).
The bitcoin White Paper was very specific. users of the system "vote with their CPU power" [1]. What this means, is that the system was never generated to give one vote per person. It is designed purely around economic incentives individuals with more hash power will have provided more investment into the system. These individuals who invest more in the system gain more say in the system. At the same time, no one or even two individuals can gain complete control of the system. We'll explore the nature of cartels in a separately, but these always fail without government intervention. The reason for cartels failing comes down to the simple incentivisation of the most efficient member. The strongest cartel member always ends up propping up the weakest. This leads to a strategy of defection.
No proof of work-based solution ever allows for a scenario where you have one vote per person. The anti-Sybil functions of bitcoin and all other related systems based on proof of work or similar derivatives are derived from an investment based strategy. Solutions to the implementation of ASIC based systems are constantly proposed as a methodology of limiting what is termed the centralisation of proof of work systems. The truth of the matter is that the mining function within any proof of work system naturally aligns to business interests. This leads to corporations running machines within data centres. In the way that democracies and republics have migrated away from small groups of people individually voting for an outcome towards a vote for a party, the transactional costs associated with individual choice naturally lead to corporate solutions. In this, the corporation mirrors a political party.
In this paper, we address the issues of using alternate proof of work systems with regard to either incorporating alternate functions (as an extension of simply securing the network) or the use of proof of work systems to create a one person one vote scenario in place of economic incentivisation. We will demonstrate conclusively that all systems migrate to a state of economic efficiency. The consequence of this is that systems form into groups designed to maximise returns. The effect is that bitcoin is not only incentive compatible but is optimal. No system can efficiently collapse into an order of one vote one individual and remain secure. In the firm-based nature of bitcoin, we demonstrate that the inherent nature of the firm is reflected within mining pools. Multiple aggregation strategies exist. The strategies range from the creation of collective firms where members can easily join or leave (mining pools) through to more standard corporate structures
that are successful within any proof of work system. The system was determined to be based on one vote per CPU (Satoshi, 2008) and not one vote per person or one vote per IP address. The reason for this is simple: there is no methodology available that can solve byzantine consensus on an individual basis. The solution developed within bitcoin solves this economically using investment. The parties signal their intent to remain bound to the protocol through a significant investment. Those parties that follow the protocol are rewarded. The alternative strategy takes us back to the former and failed systems such as e-cash that could not adequately solve Sybil attacks and decentralise the network. Bitcoin manages to maintain the decentralised nature of the network through a requirement that no individual party can ever achieve more than 50% of the network hash rate.
In all proof of work systems, there are requirements to inject a costly signal into the network that is designed as the security control. To many people, they believe that the cryptographic element, namely the hashing process is the security feature of bitcoin. This is a fallacy, it is the economic cost that is relevant to the overall system and not the individual element.
The benefits of a hash function are that they are difficult to solve in the nature of the proof of work algorithm but are easy to verify. This economic asymmetry is one of the key features of bitcoin. Once a user has found a solution, they know it can be quickly broadcast and verified by others. Additionally, the hash algorithm provides a fair distribution system based on the amount of invested hash rate. The distinction from proof of stake solution as has been proposed comes in the requirement to constantly reinvest. A proof of stake system requires a single investment. Once this investment is created, the system is incentivised towards the protection of the earlier investment. This leads to a scenario known as a strategic oligopoly game.
The solution using a proof of work algorithm is the introduction of an ongoing investment. This is different to an oligopoly game in that sunk cost cannot make up for continued investment. In a proof of stake system, prior investment is crystallised allowing continued control with little further investment. Proof of work differs in that it requires continuous investment. More than this, it requires innovation. As with all capitalist systems, they are subject to Schumpeterian dynamical change (Shumpeter, 1994). The system of creative destruction allows for cycles of innovation. Each innovation leads to waves of creation over the destruction of the old order.
This process creates continued growth. Proof of work-based systems continue to grow and continue to update and change. Any incumbent corporation or other entity needs to continue to invest knowing that their continued dominance is not assured. In bitcoin, we have seen innovative leaps as people moved from CPU-based mining into GPU-based systems. This initial innovation altered the software structure associated with the mining process in bitcoin. That change significantly altered the playing field leading to novel techniques associated with FPGAs and later ASICs dedicated to a specific part of the mining process.
The error held by many people is that this move from a CPU-based solution into more costly implementations could have been averted. A consequence of this has been the introduction of alternative proof of work systems into many of the alt-coins (in Litecoin and Dogecoin, for example).
These systems have been implemented without the understanding that it is not the use of ASICs that is an issue; it is the belief that individual users individually mining in a mesh system can be implemented as a successful proof of work. In the unlikely event that a specialised algorithm was implemented that could only run once on any one machine CPU, it would still lead to the eventual creation of corporate data centres for mining. In the section above, we showed using Arrow's theorem how only a single use proof of work system can be effective.
If we extend this and look at the Theory of the Firm (Coase, 1937), we note that in a system of prices, production could be carried out without any organisation. One issue against this arises from the cost of information. Interestingly, as we move into a world of increasingly more information, it becomes scarce information that is important. As the amount of information becomes more voluminous, the ability to uncover accurate and timely information becomes scarcer. The ability to specialise in the coordination of the various factors of production and the distribution of information leads towards vertical integration within firms. We see this first voiced in Adam Smith's (Smith, 1776) postulation on the firm:
Everyone can choose to either seek further information or act on the information that they already have. This information can be in the form of market knowledge, product knowledge, or expertise, but at some point, the individual needs to decide to act. There is a cost to obtaining information. The returns on obtaining more information hit a maximum level and start to decrease at a certain point. The entrepreneur acts as a guiding influence managing the risk associated with incomplete information compared to the risk of not acting but rather waiting to obtain more information.
In the instance of bitcoin mining, the firm can increase in size through the integration of multiple specialist roles. Even given the assumption that any one process can run on but a single CPU, we come to the scenario of high-end datacentre servers. The Intel Xeon Phi 7290f implements 72 Atom CPU Cores. Each core runs two threads. Even taking the control system into account, this leaves 142 processes able to run per system. With four cards per RU this allows for datacentre implementations of 5,964 mining processes to run on a pure CPU-based proof of work implementation. One person can manage a small number of mining server implementations within a home or small business environment. In large data centre-based organisations such as Facebook, a single administrator can run 20,000 servers
The effect of this would be one individual managing 2,840,000 individual CPU-based mining processes. This alone is outside the scaling capabilities of any individual. This can be further enhanced as cost savings through the creation of large data centres, management savings and integrating multiple network and systems administrators is considered. As we start to add additional layers we come to a maximum where it is no longer profitable to grow the firm in size. Right up until that point, the firm will grow.
submitted by cryptorebel to btc [link] [comments]

GPU acceleration of full nodes like Bitcoin Core? (NOT mining)

Hi all – Do you think Bitcoin Core and other full node implementations might benefit from GPU acceleration?
Note that I don't mean mining – I mean pure node operations, validation, etc.
By GPU acceleration I mostly mean the use of APIs like OpenCL, DirectX / DirectMath, CUDA, and Apple's Metal. Some of the work currently done by the CPU would be offloaded to the GPU, probably using one of those APIs.
I've read different accounts of how much a full node taxes a CPU, from barely at all to major CPU usage. I guess GPU acceleration would make the most sense if CPU usage were sometimes a problem, and if some part of a full node's computational work could actually be performed by a GPU. And that's where I just don't know enough to know the answer, hence my question today.
Because it's not mining, I think even integrated GPUs like Intel's or the ones that come with Raspberry Pi Arm SOCs could be helpful, if GPUs would be helpful at all. Intel has some great open source drivers for things like OpenCL, and it might be fun to mess around with them.
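Not an answer to whether validation itself would benefit, but as a cheap first experiment: the pyopencl sketch below just lists the OpenCL platforms and devices a node box exposes (including Intel integrated GPUs, if the runtime is installed), which is the first thing to check before trying to offload anything. pyopencl and a vendor OpenCL runtime are assumed to be installed; none of this is part of Bitcoin Core.

```python
# Sketch: enumerate OpenCL platforms/devices, e.g. to check whether an
# integrated GPU is even visible before attempting any offload experiments.
# Assumes `pip install pyopencl` plus a vendor OpenCL runtime; unrelated to
# any actual Bitcoin Core code path.
import pyopencl as cl

def list_devices():
    for platform in cl.get_platforms():
        print(f"Platform: {platform.name} ({platform.version})")
        for dev in platform.get_devices():
            kind = cl.device_type.to_string(dev.type)
            print(f"  {kind:>4}  {dev.name}")
            print(f"        compute units: {dev.max_compute_units}, "
                  f"global mem: {dev.global_mem_size // (1024 ** 2)} MiB")

if __name__ == "__main__":
    list_devices()
```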
Thanks for your thoughts.
submitted by Solar111 to Bitcoin [link] [comments]

Mining ERC-918 Tokens (0xBitcoin)

GENERAL INFORMATION

0xBitcoin (0xBTC) is the first mineable ERC20 token on Ethereum. It uses mining for distribution, unlike all previous ERC20 tokens which were assigned to the contract deployer upon creation. 0xBTC is the first implementation of the EIP918 mineable token standard (https://eips.ethereum.org/EIPS/eip-918), which opened up the possibility of a whole new class of mineable assets on Ethereum. Without any ICO, airdrop, pre-mine, or founder’s reward, 0xBitcoin is arguably the most decentralized asset in the Ethereum ecosystem, including even Ether (ETH), which had a large ICO.
The goal of 0xBitcoin is to be looked at as a currency and store of value asset on Ethereum. Its 21 million token hard cap and predictable issuance give it scarcity and transparency in terms of monetary policy, both things that Ether lacks. 0xBitcoin has certain advantages over PoW based currencies, such as compatibility with smart contracts and decentralized exchanges. In addition, 0xBTC cannot be 51% attacked (without attacking Ethereum), is immune from the “death spiral”, and will receive the benefits of scaling and other improvements to the Ethereum network.

GETTING 0xBITCOIN TOKENS

0xBitcoin can be mined using typical PC hardware, traded on exchanges (either decentralized or centralized) or purchased from specific sites/contracts.

-Mined using PC hardware

-Traded on exchanges such as


MINING IN A NUTSHELL

0xBitcoin is a Smart Contract on the Ethereum network, and the concept of Token Mining is patterned after Bitcoin's distribution. Rather than solving 'blocks', work is issued by the contract, which also maintains a Difficulty which goes up or down depending on how often a Reward is issued. Miners can put their hardware to work to claim these rewards, in concert with specialized software, working either by themselves or together as a Pool. The total lifetime supply of 0xBitcoin is 21,000,000 tokens and rewards will repeatedly halve over time.
The 0xBitcoin contract was deployed by Infernal_Toast at Ethereum address: 0xb6ed7644c69416d67b522e20bc294a9a9b405b31
0xBitcoin's smart contract, running on the Ethereum network, maintains a changing "Challenge" (that is generated from the previous Ethereum block hash) and an adjusting Difficulty Target. Like traditional mining, the miners use the SoliditySHA3 algorithm to solve for a Nonce value that, when hashed alongside the current Challenge and their Minting Ethereum Address, is less-than-or-equal-to the current Difficulty Target. Once a miner finds a solution that satisfies the requirements, they can submit it into the contract (calling the Mint() function). This is most often done through a mining pool. The Ethereum address that submits a valid solution first is sent the 50 0xBTC Reward.
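A bare-bones CPU illustration of that search is below: hash the current challenge, the minting address and a candidate nonce with the Solidity-style keccak, and keep trying until the digest is at or below the target. It uses web3.py's solidityKeccak helper (renamed solidity_keccak in newer releases), and the challenge, address and target values are placeholders; a real miner reads them from the contract (getChallengeNumber() / getMiningTarget()) and does the searching on a GPU, not in Python.

```python
# Toy CPU version of the 0xBitcoin / EIP-918 search:
# find a nonce such that keccak256(challenge, minter_address, nonce) <= target.
# Challenge/address/target below are placeholders; a real miner reads them from
# the contract (getChallengeNumber / getMiningTarget) and submits via mint().
import os
from web3 import Web3

MINTER = "0x0000000000000000000000000000000000000001"  # placeholder ETH address
CHALLENGE = bytes(32)                                   # placeholder bytes32 challenge
TARGET = 2 ** 248                                       # deliberately easy toy target

def find_solution(challenge: bytes, minter: str, target: int):
    """Brute-force random nonces until the solidity-keccak digest meets the target."""
    while True:
        nonce = int.from_bytes(os.urandom(32), "big")
        digest = Web3.solidityKeccak(
            ["bytes32", "address", "uint256"],
            [challenge, minter, nonce],
        )
        if int.from_bytes(digest, "big") <= target:
            return nonce, digest

if __name__ == "__main__":
    nonce, digest = find_solution(CHALLENGE, MINTER, TARGET)
    print("nonce :", hex(nonce))
    print("digest:", digest.hex())
```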
(In the case of Pools, valid solutions that do not satisfy the full difficulty specified by the 0xBitcoin contract, but that DO satisfy the Pool's specified Minimum Share Difficulty, get a 'share'. When one of the Miners on that Pool finds a "Full" solution, the number of shares each miner's address has submitted is used to calculate how much of the 50 0xBTC reward they will get. After a Reward is issued, the Challenge changes.)
A Retarget happens every 1024 rewards. In short, the Contract tries to target an Average Reward Time of about 60 times the Ethereum block time. So (at the time of this writing):
~13.9 seconds × 60 ≈ 13.9 minutes
If the average Reward Time is longer than that, the difficulty will decrease. If it's shorter, it will increase. How much longer or shorter it was affects the magnitude with which the difficulty will rise/drop, to a maximum of 50%. * Click Here to visit the stats page~ (https://0x1d00ffff.github.io/0xBTC-Stats) to see recent stats and block times, feel free to ask questions about it if you need help understanding it.
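To make the adjustment rule concrete, here is an illustrative (not contract-exact) sketch of a retarget step: compare the observed average reward time against the roughly 13.9-minute goal and scale difficulty by the ratio, clamped so no single retarget moves more than 50%.

```python
# Illustrative retarget step; the real 0xBitcoin contract does this in Solidity
# with its own fixed-point bookkeeping, so the numbers here are just examples.
ETH_BLOCK_TIME = 13.9                      # seconds, approximate and variable
TARGET_REWARD_TIME = 60 * ETH_BLOCK_TIME   # ~13.9 minutes per reward
MAX_ADJUST = 0.5                           # clamp any single retarget to +/-50%

def retarget(difficulty: float, avg_reward_time: float) -> float:
    """Raise difficulty if the last 1024 rewards came too fast, lower it if too slow."""
    ratio = TARGET_REWARD_TIME / avg_reward_time
    ratio = max(1 - MAX_ADJUST, min(1 + MAX_ADJUST, ratio))
    return difficulty * ratio

# Example: rewards averaged 10 minutes instead of ~13.9, so difficulty rises.
print(retarget(1_000_000.0, 10 * 60))
```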

MINING HARDWARE

Presently, 0xBitcoin and "Alt Tokens" can be mined on GPUs, CPUs, IGPs (on-CPU graphics) and certain FPGAs. The most recommended hardware is nVidia graphics cards for their efficiency, ubiquity and relatively low cost. As general rules, the more cores and the higher core frequency (clock) you can get, the more Tokens you will earn!
Mining on nVidia cards:
Mining on AMD cards:
Mining on IGPs (e.g. AMD Radeon and Intel HD Graphics):
Clocks and Power Levels:

MINING SOFTWARE AND DESCRIPTIONS

For the most up-to-date version info, download links, thread links and author contact information, please see this thread: https://www.reddit.com/0xbitcoin/comments/8o06dk/links_to_the_newestbest_miners_for_nvidia_amd/ Keep up to date for the latest speed, stability and feature enhancements!
COSMiC Miner by LtTofu:
SoliditySha3Miner by Amano7:
AIOMiner All-In-One GPU Miner:
TokenMiner by MVis (Mining-Visualizer):
"Nabiki"/2.10.4 by Azlehria:
~Older Miners: Older and possibly-unsupported miner versions can be found at the above link for historical purposes and specific applications- including the original NodeJS CPU miner by Infernal Toast/Zegordo, the '1000x' NodeJS/C++ hybrid version of 0xBitcoin-Miner and Mikers' enhanced CUDA builds.

FOR MORE INFORMATION...

If you have any trouble, the friendly and helpful 0xBitcoin community will be happy to help you out. Discord has kind of become 0xBTC's community hub, you can get answers the fastest from devs and helpful community members. Or message one of the community members on reddit listed below.
Links
submitted by GeoffedUP to gpumining [link] [comments]

First home server; will my plan accomplish my goals?

I'm planning to build my first home server, and I'd love some feedback on my plans before I buy all the hardware. Can you folks help me with some feedback?
 
What I Want to Do with My Hardware
 
Constraints
 
Current Plan
 
Currently Planned Hardware
Type Item Price
CPU Intel - Xeon E5-2660 V2 2.2 GHz 10-Core Processor $192.97 @ PCM
CPU Intel - Xeon E5-2660 V2 2.2 GHz 10-Core Processor $192.97 @ PCM
CPU Cooler Noctua - NH-D14 SE2011 CPU Cooler $89.99 @ Amazon
CPU Cooler Noctua - NH-D14 SE2011 CPU Cooler $89.99 @ Amazon
Thermal Compound Thermal Grizzly - Aeronaut 3.9 g Thermal Paste $11.59 @ Amazon
Motherboard ASRock - EP2C602-4L/D16 SSI EEB Dual-CPU LGA2011 Motherboard $481.98 @ Newegg
Memory Crucial - 32 GB (2 x 16 GB) Registered DDR3-1866 Memory $159.99 @ Amazon
Storage Western Digital - Blue 1 TB 2.5" Solid State Drive $114.89 @ OutletPC
Storage Western Digital - Blue 1 TB 2.5" Solid State Drive $114.89 @ OutletPC
Storage Western Digital - Red Pro 8 TB 3.5" 7200RPM Internal Hard Drive $140.00
Storage Western Digital - Red Pro 8 TB 3.5" 7200RPM Internal Hard Drive $140.00
Storage Western Digital - Red Pro 8 TB 3.5" 7200RPM Internal Hard Drive $140.00
Storage Western Digital - Red Pro 8 TB 3.5" 7200RPM Internal Hard Drive $140.00
Storage Western Digital - Red Pro 8 TB 3.5" 7200RPM Internal Hard Drive $140.00
Storage Western Digital - Red Pro 8 TB 3.5" 7200RPM Internal Hard Drive $140.00
Video Card Asus - GeForce GTX 1060 6GB 6 GB Strix Video Card $359.98 @ B&H
Case Phanteks - Enthoo Pro Tempered Glass ATX Full Tower Case $122.00 @ Amazon
Power Supply Corsair - HX Platinum 750 W 80+ Platinum Certified Fully Modular ATX Power Supply $99.99 @ Newegg
Sound Card Creative Labs - Sound Blaster Z 30SB150200000 OEM 24-bit 192 kHz Sound Card $90.77 @ OutletPC
Prices include shipping, taxes, rebates, and discounts
Total (before mail-in rebates) $3002.00
Mail-in rebates -$40.00
Total $2962.00
 
Hardware Notes
 
Budget
 
Other Notes
 
My main question is: will this hardware and software setup accomplish my goals?
My secondary question is: is any of my hardware unnecessary for my goals? are there better ways to eat this Reese's?
Thanks so much for all the help in advance, I've learned so much from this subreddit (and DataHoarding) already!
submitted by therightrook to homelab [link] [comments]

Cryptocurrency Mining History : Journey to PoC

Cryptocurrency, just like any other technological development, has given birth to many side industries and trends, like ICOs, white paper writing, mining and so on. Just as cryptocurrency itself rises, falls and changes to adapt to real-life conditions, so do its side industries and trends. Today we are going to focus on mining: how it has risen, fallen and adapted through the journey of cryptocurrency to date.
Without going into details, crypto mining is the process by which new blocks are validated and added to the blockchain. It first hit the mainstream in January 2009, when the mysterious Satoshi Nakamoto launched the Bitcoin network based on the white paper in which he/she/they proposed the first mining consensus mechanism, called proof of work (PoW).
The PoW consensus mechanism requires that one spend a certain amount of computational power to solve a cryptographic puzzle (finding a nonce) in order to earn the right to pack/verify the next block on the blockchain. In this mechanism, the more computational power one possesses, the more rights one has over the packing of the next block. The quest for faster hardware has seen significant changes in the types of hardware dominating the PoW mining community.
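To make "spending computational power to find a nonce" concrete, here is a toy proof-of-work loop in Python. It expresses difficulty as the number of leading zero bits a double-SHA256 digest must have; real networks compare against a full 256-bit target, and Bitcoin hashes an 80-byte block header rather than an arbitrary byte string, so this is only the shape of the idea.

```python
# Toy proof-of-work: find a nonce so that SHA256(SHA256(data + nonce)) has
# `difficulty_bits` leading zero bits. Purely illustrative.
import hashlib

def mine(block_data: bytes, difficulty_bits: int = 16):
    target = 1 << (256 - difficulty_bits)   # smaller target = harder puzzle
    nonce = 0
    while True:
        digest = hashlib.sha256(
            hashlib.sha256(block_data + nonce.to_bytes(8, "big")).digest()
        ).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce, digest            # "won the right to pack the block"
        nonce += 1                          # otherwise keep spending CPU work

if __name__ == "__main__":
    nonce, digest = mine(b"example block header", difficulty_bits=16)
    print(f"nonce={nonce}, hash={digest.hex()}")
```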
Back in 2009 when Bitcoin first started, a normal PC and its processing power worked just fine. In fact, a PC with an Intel i7 processor could mine up to 50 BTC per day, but back then that was worth almost nothing since BTC traded at only a few cents. When the difficulty of the network became significantly high, simple computer processing units could no longer keep up, and so miners settled for something more powerful: high-end graphics processors (GPUs). This is when the era of rigs began, in 2010. People would combine GPUs together in mining rigs on a motherboard, usually six per rig, and some miners operated farms containing many of these rigs. Of course, with greater power came greater network difficulty, and so the search for faster hardware led to the implementation of Field Programmable Gate Arrays (FPGAs) in June 2012. A further search for faster, cheaper and less power-hungry hardware led us to where we are today: in 2013, Application Specific Integrated Circuit (ASIC) miners were introduced. One ASIC miner processes 1500 H/s, which is 100 times the processing power of a CPU or GPU. But all these speed and efficiency achievements brought about another problem, one which touches the core of cryptocurrency itself. The idea of decentralization was gradually fading away, as the wealthy and big companies are the ones who can afford to buy and build the miners, thereby centralizing mining around the rich, and so there were calls for ASIC-resistant consensus mechanisms.
A movement for ASIC-resistant PoW algorithms began. The idea is to make ASIC mining impossible, or at least to make it so that using an ASIC doesn't give a miner any additional advantage over using a CPU. In 2013 Monero, the famous privacy coin, adopted CryptoNight, an ASIC-resistant PoW, or at least that is how they intended it to be. Things have proven much more difficult in practice than they had anticipated, as ASIC producers keep matching every barrier the PoW designers put in place, at a rate faster than it takes to build those barriers. Monero, for example, has had to fork every now and then in order to keep CryptoNight ASIC resistant, a trick which is still not working, as one of their core developers reported: "We [also] saw that this was very unsustainable. … It takes a lot to keep [hard forking] again and again for one. For two, it may decentralize mining but it centralizes in another area. It centralizes on the developers because now there's a lot of trust in developers to keep hard forking." Another ASIC-resistant PoW algorithm is RandomX, and there are many others, but one could quickly imagine that the barriers to ASIC mining in these ASIC-resistant algorithms will eventually be broken by the ASIC producers as well, and so a total shift from PoW mining to other consensus mechanisms that are ASIC resistant at their core has been proposed, some of which are in use today.
Enter the Proof of Stake (PoS) consensus mechanism. PoS was first introduced in 2012 by the PeerCoin team. Here, a validator's right to mine is proportional to his/her economic stake in the network; simply put, the more coins you have, the more mining rights you get. Apart from PeerCoin, NEO and LISK also use PoS, and Ethereum plans to follow. There are different variations of PoS, including but not limited to delegated proof of stake (DPoS) and masternode proof of stake (MPoS), each of which seeks to improve on something in the basic PoS. This is a very good ASIC-resistant consensus mechanism, but it still doesn't solve the centralization problem, as the rich always have the power to buy more coins and thus gain more mining rights, plus it is also expensive to get started. And then we have seen many other proposals to combat this, among which are Proof of Weight (PoWeight) and Proof of Capacity (PoC). We take more interest in PoC; it is the latest and, as of now, gives the best solution to our mining challenges.
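A toy sketch of that "more coins, more rights" selection rule, just to make the idea concrete: pick the next block producer at random, weighted by stake. Real PoS designs layer randomness beacons, slashing and lock-ups on top of this, so treat the names and numbers below as made-up illustrations.

```python
# Toy stake-weighted validator selection: the chance of producing the next
# block is proportional to coins staked. Names and balances are made up.
import random

stakes = {"alice": 600_000, "bob": 300_000, "carol": 100_000}

def pick_validator(stakes: dict) -> str:
    names = list(stakes)
    weights = [stakes[name] for name in names]
    return random.choices(names, weights=weights, k=1)[0]

# Over many rounds alice wins ~60% of the time, bob ~30%, carol ~10%.
counts = {name: 0 for name in stakes}
for _ in range(10_000):
    counts[pick_validator(stakes)] += 1
print(counts)
```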
Proof of Capacity was first described in the 2013 paper "Proofs of Space" by Dziembowski, Faust, Kolmogorov and Pietrzak, and it is now used by Burst. The main factor separating all these mining mechanisms is the resource used. The resource miners spend in order to gain mining rights is a way of ensuring that a non-trivial amount of effort has been expended in making a statement. The resource spent in PoC is disk space. This is far less expensive, since many people already have unused space lying around and storage is a cheap resource in tech; it also does not discriminate by geography. It solves many of the centralization problems present in almost all other consensus mechanisms. If the future is now, then one could say the future of crypto mining is PoC.
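In PoC as used by Burst, miners precompute large "plot" files of hashes once and store them on disk; when a new block arrives, competing for it is mostly a cheap scan over that stored data rather than fresh hashing. The sketch below is a toy version of that idea, with the plot size, the hash construction, and the deadline rule all invented for illustration rather than taken from Burst's actual plotting format.

```python
import hashlib

def build_plot(account_id: str, num_nonces: int) -> list[bytes]:
    """One-off 'plotting' step: spend CPU time once, keep the results on disk.
    Here the plot is just a list in memory; real plots span many gigabytes."""
    return [hashlib.sha256(f"{account_id}:{n}".encode()).digest()
            for n in range(num_nonces)]

def best_deadline(plot: list[bytes], block_seed: bytes) -> int:
    """Per-block 'mining' step: derive a deadline from each stored hash and the
    new block's seed. The smallest deadline wins, so more disk space (a bigger
    plot) means better odds, at almost no extra CPU or power cost."""
    deadlines = (
        int.from_bytes(hashlib.sha256(block_seed + h).digest()[:4], "big")
        for h in plot
    )
    return min(deadlines)

plot = build_plot("miner-1", num_nonces=100_000)       # done once, kept on disk
print(best_deadline(plot, block_seed=b"latest block hash"))
```

Because the expensive step happens once at plotting time, ongoing mining consumes little electricity, which is why disk space is attractive as the scarce resource.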
submitted by seekchain to u/seekchain [link] [comments]

Legit Bitcoin Mining sites 2020 | New Bitcoin Mining site 2020 | Bitcoin Mining site | Tech Alibhai
Intel Wants In On Bitcoin Mining?
How To Mining Bitcoin From CPU And GPU By Nicehash.com
BitTube Review [BIG GPU Mining Potential!] Will It Mine?!
Intel Made A GPU

The State of Bitcoin Mining. Cryptocurrency mining is the process of solving blocks by using the capacities of special equipment. For each solution found, the user (miner) receives a reward in the form of coins. Specialized equipment includes ASICs, GPUs, and CPUs (the latter being the least profitable).

Bitcoin mining rigs and systems have come a long way since the beginning. The first Bitcoin miners made do with the tools they had at their disposal and set up various software to control the mining hardware in their rigs, or restart a GPU if it had frozen. Quick tip: mining is not the fastest way to get bitcoins; buying bitcoin is.

Best mining GPU 2020: the best graphics cards for mining Bitcoin, Ethereum and more. By Matt Hanson and Michelle Rae Uy, 28 April 2020. Join the cryptocurrency craze with the best mining GPUs.


Legit Bitcoin Mining sites 2020 | New Bitcoin Mining site 2020 | Bitcoin Mining site | Tech Alibhai

In this video I show you how to start Bitcoin mining using a GPU or CPU in 2020. NiceHash Miner is the easiest way to start mining, and you don't need a lot of experience. Bitcoin Mining Software is a bitcoin miner that can mine for bitcoins with your CPU. Yes, not with a GPU but with a CPU. Why? This Bitcoin Mining Software can mine with your computer or laptop CPU at ... Next, log in, go to the bottom, and select the CPU/GPU mining option. In the next window, choose to download the NiceHash crypto miner. ... Noob's Guide To Bitcoin Mining - Super Easy & Simple ... Discover not only my TOP 5 CRYPTO MINING COINS to mine for 2020, but my SLEEPER PICK as well! Interested? Click and watch NOW. Subscribe for more awesome videos and a chance at free Bitcoin! http ... Intel BX80662G4400 Pentium Processor G4400 3.3 GHz FCLGA1151: ... bitcoin, cryptocurrency, BitTube profitability, BitTube price, bitcoin fridays, speculative mining, GPU mining, gpu mining rig ...
