Today I will summarize a few things about blockchain technology! (submitted by mkristen024 to ICOAnalysis)
Blockchain, originally written as "block chain", is a distributed database that stores records in blocks of data which are linked by cryptography and grow over time. Each block contains a timestamp marking when it was created and is linked to the previous block, along with its transaction data. Blockchain is designed to resist data modification: once information is accepted by the network, there is no way to change it.
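The linking described above can be sketched in a few lines of Python. This is a hypothetical minimal chain, not any real implementation: each block stores a timestamp, its data, and the hash of its predecessor, so tampering with an earlier block is detectable.

```python
import hashlib
import json
import time

def make_block(data, prev_hash, timestamp=None):
    """Create a block that records its creation time and the hash of the previous block."""
    block = {
        "timestamp": timestamp if timestamp is not None else time.time(),
        "data": data,
        "prev_hash": prev_hash,
    }
    # The block's own hash covers all of its fields, so changing any of them
    # (or any earlier block) breaks the chain.
    block["hash"] = hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()
    return block

genesis = make_block("genesis", prev_hash="0" * 64, timestamp=0)
second = make_block("payment: A -> B", prev_hash=genesis["hash"], timestamp=1)

# Tampering with the first block invalidates the link stored in the second.
tampered = dict(genesis, data="payment: A -> C")
recomputed = hashlib.sha256(json.dumps(
    {k: tampered[k] for k in ("timestamp", "data", "prev_hash")}, sort_keys=True
).encode()).hexdigest()
print(recomputed == second["prev_hash"])  # False: the tampering is detectable
```

Real blockchains add proof-of-work and network-wide validation on top of this basic hash linking, but the tamper-evidence comes from exactly this structure.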
Blockchain's security comes from a decentralized design with high tolerance of Byzantine faults, which allows the network to reach decentralized consensus. This makes blockchain suitable for recording events, medical records, transaction processing, notarization, identity management, and proof of provenance. It has the potential to help eliminate tampering when data is shared in the context of global trade.
The first blockchain was designed by Satoshi Nakamoto in 2008 and realized the following year as a core component of Bitcoin, where it serves as the public ledger for all transactions.
Managed automatically through a peer-to-peer network and a distributed timestamping scheme, the Bitcoin blockchain made Bitcoin the first digital currency to solve the double-spending problem, in which a single unit of money is spent twice, without relying on a trusted authority. Bitcoin's design has since inspired a wide variety of other applications.
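The double-spending problem can be made concrete with a toy ledger that rejects a second spend of the same coin. This is an illustrative simplification, not Bitcoin's actual mechanism (Bitcoin tracks unspent transaction outputs across a whole network):

```python
# Toy illustration of the double-spending problem: a shared ledger only
# accepts a transaction if the coin it spends has not been spent before.
class ToyLedger:
    def __init__(self):
        self.spent = set()  # ids of coins that have already been spent

    def submit(self, coin_id, payer, payee):
        if coin_id in self.spent:
            return False  # second spend of the same coin is rejected
        self.spent.add(coin_id)
        return True

ledger = ToyLedger()
print(ledger.submit("coin-1", "Alice", "Bob"))    # True: first spend accepted
print(ledger.submit("coin-1", "Alice", "Carol"))  # False: double spend rejected
```

Bitcoin's innovation was getting thousands of mutually distrusting nodes to agree on one such spent-set without a central ledger keeper.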
Blockchain and cryptocurrency: blockchains show their strength above all in managing and trading online assets, of which Bitcoin is currently the most popular. Since the supply is capped at 21 million bitcoins, demand keeps growing to invest in bitcoin as an alternative store of value to gold, foreign currencies, and shares.
The price of bitcoin has risen dramatically. Uncontrolled by governments, resistant to inflation, and requiring mining effort much like gold, this currency has become a lifeline in hyper-inflationary Venezuela, where people can hardly buy anything with the local currency but which hosts some of the largest bitcoin mining operations in the world. However, bitcoin also has scaling problems that the community is still debating how to solve.
In the past few years, blockchain technology has been used as a tool to record the history of cryptocurrency transactions. Bitcoin succeeded and grew rapidly once it convinced even the most demanding users. Bitcoin is genuinely valuable, and it is traded and stored securely online. Many believe Bitcoin will help open a new era, the fourth industrial revolution, in which blockchain technology is widely used.
The future of blockchain technology: the emergence of blockchain is a milestone comparable to the birth of the personal computer or the Internet; this system will change the way we understand and organize society. Its biggest potential lies in smart contracts: agreements and transactions can be verified without disclosing confidential information between the parties and without an intermediary, while keeping everything transparent and certain.
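The smart-contract idea above can be sketched as a small state machine: an agreement that enforces its own terms, with no intermediary deciding the outcome. This is a hypothetical Python illustration only; real smart contracts run as code on a blockchain virtual machine (for example, written in Solidity on Ethereum).

```python
# Hypothetical escrow sketch: funds are released automatically once both
# parties have confirmed, with no middleman. (Illustrative model only.)
class EscrowContract:
    def __init__(self, buyer, seller, amount):
        self.buyer, self.seller, self.amount = buyer, seller, amount
        self.confirmations = set()
        self.released = False

    def confirm(self, party):
        if party in (self.buyer, self.seller):
            self.confirmations.add(party)
        # The contract enforces its own terms: funds move only when
        # both named parties have confirmed.
        if self.confirmations == {self.buyer, self.seller}:
            self.released = True
        return self.released

deal = EscrowContract("alice", "bob", 100)
deal.confirm("alice")
print(deal.released)        # False: one confirmation is not enough
print(deal.confirm("bob"))  # True: both confirmed, funds released
```

The point of putting such logic on a blockchain is that neither party (nor anyone else) can alter the rules after the agreement is recorded.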
Information in a blockchain cannot be counterfeited (though attempts may still leave traces): every change needs the consensus of the participating nodes in the system. It is also a system that does not easily break down, because even if one part of the network goes offline, the remaining nodes continue operating and protecting the data. Blockchain technology opens a new trend for sectors such as banking and finance, logistics, electronics and telecommunications, and accounting and auditing.
Not only that, blockchain may also become the backbone of the Internet of Things (IoT): electronic devices could communicate safely and transparently, and bad actors on the Internet would find it much harder to operate. Many large organizations and companies are already building their own blockchain networks, so we may soon see this create a wave for the future.
GOLD: Gold has been widely used around the world as a basis for currency, whether by issuing and recognizing gold coins and other metal money, or through paper money convertible into gold under a gold standard, in which the full value of the money issued is backed by a gold reserve.
Do you believe gold can be digitized? I am talking about a blockchain project for gold digitization. The DIGITAL GOLD project aims to turn physical gold into digital gold. It is an ambitious and creative idea unlike any I have seen before, and a promising one: the project helps bridge the gap between gold and digital ecosystems.
DIGITAL GOLD: securely store and make payments in gold. DIGITAL GOLD is a project built on blockchain technology. Its goal is to digitize part of the financial market and, at the same time, digitize its investment instruments. The Digital Gold project (https://gold.storage/) will launch the GOLD token, based on the Ethereum ERC-20 standard and backed by physical gold. Users can buy GOLD directly without obstacles: 1 GOLD token is equivalent to 1 gram of 99.99% pure physical gold.
What users can especially trust is that the physical gold is stored in the company's vault, which builds real confidence in the project's mission. Customers who buy tokens can trade them quickly (transactions are fast because the project runs on blockchain technology and does not require the complex procedures of ordinary gold transactions), or simply hold them as savings.
Another notable point: when a customer buys GOLD tokens, the value of each token is fixed to the physical gold price at that time. GOLD is therefore similar to a stablecoin, which helps protect customers against volatility in the cryptocurrency market while letting them benefit from gold's long-term price increases.
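The peg described above (1 token = 1 gram of gold) implies a simple price calculation, since gold is quoted per troy ounce. The gold price used below is a made-up example figure, not a real quote or anything from the project:

```python
# Pricing sketch for a gold-backed token pegged 1 token = 1 gram of gold.
GRAMS_PER_TROY_OUNCE = 31.1035

def token_price_usd(gold_usd_per_ounce):
    """Price of one 1-gram token implied by the spot gold price."""
    return gold_usd_per_ounce / GRAMS_PER_TROY_OUNCE

# With gold at a hypothetical $1,500 per troy ounce, one token is worth
# about $48.23.
print(round(token_price_usd(1500.0), 2))  # 48.23
```

This is why such a token tracks the gold market rather than the cryptocurrency market: its value moves with the metal backing it, not with trading sentiment about the token itself.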
The application of blockchain technology to payments is becoming more popular, and gold digitization will certainly increase the appeal of this rare metal. This creates liquidity for the GOLD token: at any time, users can easily see how many GOLD tokens are circulating in the market and how much gold is held in stock to back them. Liquidity is the lifeblood of the project.
Applying blockchain technology to the Digital Gold project improves transaction capacity: transactions are made quickly and transaction fees are very low, since everything is implemented on the Ethereum blockchain.
Customer data is also protected from network attacks through smart-contract applications, and the GOLD token is supported by all ERC-20 wallets and by partner trading platforms. With the development of blockchain technology, gold can now act as a means of payment for all kinds of transactions: gold is no longer only stored but also traded as an investment tool.
Official website: https://gold.storage/ White paper: https://gold.storage/wp.pdf Telegram: https://t.me/digitalgoldcoin Twitter: https://twitter.com/golderc20
Author: Cryptobae10 https://bitcointalk.org/index.php?action=profile;u=2023123
Technology Breakthroughs 2019 (submitted by crewartproduction to u/crewartproduction)
What is technology:
Technology is the knowledge of techniques and processes, combined with the skills and methods that enable humans to apply that knowledge. Technology began with prehistoric humans, who used very simple tools to secure food and protection. From there it evolved from a mere survival tool into the backbone of living, touching every detail of daily life, from the toothpick to space travel, a dream now coming true: rich tourists have already made space trips, and such trips are expected to become common and affordable in the coming years. So let's talk about the most exciting technology breakthroughs of 2019.
2019 will see the release of the fifth generation of cellular mobile communications (5G), featuring reduced latency, energy savings, cost reduction, higher system capacity, and massive device connectivity. It is the natural evolution of 4G, which launched in 2009 in Sweden and was itself a great breakthrough in the history of mobile internet, with speeds reaching 100 megabytes per second for high-mobility communication. 5G speed is expected to reach 2,500 megabytes per second at peak performance. 5G devices bring major technical improvements: each cell communicates by radio waves with a local antenna array and an automated low-power transceiver, and the antennas are connected to the telephone network through optical fiber cables or a wireless backhaul connection. Devices equipped with 5G will keep 4G LTE capability, since 5G coverage is not yet available everywhere. 5G will also be used in autonomous vehicles, supplying real-time data about the surrounding environment and letting nearby vehicles exchange their locations and intentions, while the roadway itself can report traffic conditions immediately ahead, easing the task of driving. 5G will likewise come to laptop computers to improve their Internet connectivity.
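The throughput figures quoted above (roughly 100 MB/s for 4G versus a 2,500 MB/s peak for 5G) can be put in perspective with a back-of-the-envelope calculation; the file size is a made-up example:

```python
# Download-time comparison using the article's quoted throughput figures.
def download_seconds(size_mb, speed_mb_per_s):
    """Seconds needed to transfer size_mb at a constant speed_mb_per_s."""
    return size_mb / speed_mb_per_s

movie_mb = 5000  # a hypothetical 5 GB file
print(download_seconds(movie_mb, 100))   # 4G: 50.0 seconds
print(download_seconds(movie_mb, 2500))  # 5G peak: 2.0 seconds
```

Real-world speeds are far below peak figures, but the 25x ratio between the two generations is what enables the real-time vehicle and roadway data exchange described above.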
Virtual Reality (VR):
Virtual reality (VR) technology dates back to the early 1950s, when Morton Heilig wrote about an "experience theatre"; he built a prototype of his Sensorama vision in 1962, a device that presented immersive multi-sensory scenes, and later developed the "Telesphere Mask". In 1968, Ivan Sutherland and his students created the first head-mounted display (HMD) system for simulation applications. The 1970s saw a great leap in virtual reality as it entered fields such as medicine, the military, and even space science; David Em was the first to produce navigable virtual worlds, working in a NASA laboratory. In 1979, Eric Howlett developed the LEEP (Large Expanse, Extra Perspective) optical system, which underlies the optics of the VR helmets available today. In 1991 the first cubic immersive room was developed, allowing people to see their own bodies in relation to others in the room, and between 1982 and 1992 Nicole Stenger created the first real-time interactive immersive film, viewed using a dataglove and high-resolution goggles. The new millennium brought further progress: by 2001 the first PC-based cubic room had been built; in 2007 Google released Street View panoramic imagery using a 3D stereoscopic mode; and 2013 saw VR arrive on smartphones through headsets used for virtual and augmented reality, while the Oculus VR project came to light with collaboration from Valve Corporation. In 2014 Facebook bought Oculus VR for 2 billion dollars, and Sony announced the PlayStation VR project. In 2015 Google announced Cardboard, a do-it-yourself stereoscopic viewer for smartphones. 2016 brought a great leap in VR development by many companies, especially HTC, which shipped the Vive SteamVR headset as the major commercial release of the year, while Sony continued developing its headset and released PlayStation VR. 2019 is expected to bring further advances: Sony is working on a new 3D motion controller for PlayStation VR, and Somniacs is working on its Birdly VR flying simulator, and this will certainly not be the last update in VR technology.
Artificial intelligence (AI) technology
The concept of artificial intelligence took shape in the 1940s with the invention of the programmable digital computer, a machine based on the abstract essence of mathematical reasoning. The field of AI research was founded at a Dartmouth College workshop in 1956. By 1973, AI faced a new challenge as the US and British governments stopped funding undirected AI research; private investors stepped in, but by the 1980s they too began to withdraw because the needed computing hardware was still missing. All of this changed at the start of the 21st century, when machine learning began successfully solving many academic and industrial problems. AI development in the new millennium started with interactive "smart" toys around 2000. In 2004, NASA navigated the surface of Mars with autonomous robotic exploration rovers. In 2005, Honda released a humanoid robot able to walk as fast as a human, which was later put to work serving food in a restaurant; the same year also brought a new direction for AI with the launch of the Blue Brain Project, which aims to map the structure of the human brain and understand its function in health and disease. Between 2010 and 2014, humans became able to communicate and even speak with machines: Apple launched Siri in 2011, Google launched Google Now in 2012, and Microsoft followed with a smartphone assistant that uses natural language to answer questions, make recommendations, and even perform actions. Artificial intelligence (AI) technology is set for a great leap in 2019, as this year will see the release of:
1- Automated CCTV security cameras: a new concept to be released in 2019 that will make the camera's AI smarter. For example, it could distinguish between a real image and a realistic image created by AI.
2- Chatbots: a chatbot, also known as a smartbot, talkbot, chatterbot, IM bot, interactive agent, conversational interface, or artificial conversational entity, is an artificial intelligence that conducts a conversation via auditory or textual methods. The technology started around 1960 with simple conversational programs, but it has since turned from simple conversation into a powerful promotional tool for media brands and e-commerce businesses; 80% of companies are expected to be using chatbots by 2020.
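The textual-conversation idea above can be shown with a minimal rule-based chatbot. This is a toy illustration with made-up rules and replies; modern commercial chatbots use machine learning rather than keyword tables:

```python
# Minimal keyword-matching chatbot: it scans the user's text for known
# keywords and replies from a canned table, falling back when nothing matches.
RULES = [
    ("price",    "Our products start at $10."),
    ("shipping", "We ship worldwide within 5 days."),
    ("hello",    "Hi! How can I help you today?"),
]

def reply(message):
    text = message.lower()
    for keyword, answer in RULES:
        if keyword in text:
            return answer
    return "Sorry, I didn't understand that."

print(reply("Hello there!"))          # Hi! How can I help you today?
print(reply("What about shipping?"))  # We ship worldwide within 5 days.
```

Even this crude pattern explains the e-commerce appeal: most customer questions cluster around a handful of topics, so a small rule table can handle a large share of traffic.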
3D printing covers a number of computer-controlled processes in which a three-dimensional object is made by joining and solidifying material. 3D printing entered production in the early 1980s: in 1981 the Japanese scientist Hideo Kodama invented two methods for fabricating three-dimensional plastic models using a photo-hardening material. The use of 3D printing in production started with functional or aesthetic prototypes, then grew into additive manufacturing capable of producing very complex shapes and geometries pre-designed as a 3D model or CAD file. 2019 will bring a leap in 3D printing, especially metal 3D printing, which builds up a digital model layer by layer to give the metal full density and high precision; metal 3D printing will be used in aerospace, oil and gas, automotive, and marine applications. 3D printing will also adopt new materials and enter the phase of mass production with high-technology machines. The main types of 3D printing in 2019 are:
Fused Deposition Modelling (FDM): also known as Fused Filament Fabrication, the most commonly available and cheapest type of 3D printing.
Stereolithography (SLA): known as the world's first 3D printing technology, with a history going back to 1986. SLA uses a laser steered by mirrors, but it has the disadvantage that tracing the cross-section of an object takes much longer than DLP.
Digital Light Processing (DLP): this type uses a digital light projector to flash one image per layer at once, or multiple flashes for larger parts. Because the projected image is composed of square pixels, each layer is built from small rectangular blocks called voxels.
Selective Laser Sintering (SLS): creates objects using powder-bed-fusion technology and polymer powder, scanning the object's cross-section with a CO2 laser beam. Its relatively low price should make this type increasingly common.
Material Jetting (MJ): works on the same principle as an ordinary inkjet printer, but builds up many layers of material upon each other until they form a solid part. Material jetting can produce multi-material and full-color objects.
Drop On Demand (DOD): uses a pair of jets, one depositing a wax-like build material and the other a dissolvable support material. A fly-cutter skims the build area after each layer is created to ensure a flat surface before the next layer begins. This type of printing is used for lost-wax casting patterns and is suitable for other mold-making applications.
Sand binder jetting: this process mixes PMMA powder with a binding liquid to produce parts, and colors can be added to the mixture through another nozzle. Binder jetting is useful for producing sand-casting molds and cores, which are generally made of artificial sand (silica). This type of 3D printing is low-cost and quite easily integrated into an existing foundry manufacturing process without disruption.
Metal binder jetting: used to fabricate metal objects by bonding metal powder together with a polymer binding agent. It can produce objects with complex geometries beyond conventional manufacturing techniques. The main steps are printing, infiltration (in which bronze is added to the object), and then sintering. The output can suffer from non-uniform shrinkage, which is compensated for in the design stage.
Direct Metal Laser Sintering (DMLS): similar to SLS, but applied to metal objects: a laser fuses the metal powder point by point. DMLS prints need support structures, as the output object is vulnerable to distortion and warping due to residual stress.
Electron Beam Melting (EBM): a high-energy electron beam is used to fuse the metal. Thanks to its high energy density, EBM offers superior build speed compared to other metal 3D printing types.
A blockchain is a growing list of records, called blocks, linked using cryptography, which makes the recorded data resistant to modification. The idea was first described in 1991, but the first practical blockchain was created as the public transaction ledger of the cryptocurrency Bitcoin by Satoshi Nakamoto in 2008. It is managed collectively by a peer-to-peer network adhering to a protocol for inter-node communication and for validating new blocks once recorded. Blockchains come in several types:
Public blockchains have absolutely no access restrictions: anyone with an internet connection can send transactions and become a validator. Bitcoin and Ethereum are the best-known public blockchain applications.
Private blockchains offer high privacy: no one can join the network without an invitation from the network administrators, and even participant and validator access is restricted. This type is the most appropriate for companies interested in blockchain technology because of the high level of control; its main use is in accounting and record-keeping for business.
Consortium blockchains are semi-decentralized: control is shared among several companies, each of which may operate a node in the network, rather than resting with a single organization.
Blockchain technology will see a great leap in 2019 as these trends rise:
Blockchain as a Service (BaaS):
BaaS is a cloud-based service that allows customers to build their own blockchain-powered products, including applications, smart contracts, and other blockchain features, without the need to set up or operate their own blockchain infrastructure.
Hybrid blockchains combine the best features and functionality of both private and public blockchains. Hybrid blockchains aren't widely used yet, but they are considered particularly appropriate for banks.
Federated blockchains are a natural evolution of the basic blockchain: they resemble private blockchains but with a more customizable setup, and they are mainly used in financial services.
Ricardian contracts mark the start of reliance on legal contracts that are cryptographically signed and verified. A Ricardian contract is readable both by humans and by computers, so its terms can be understood and enforced without a mediator or third-party service.
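The core Ricardian idea, one human-readable legal text that is hashed and signed so software can verify exactly which wording was agreed, can be sketched briefly. Here HMAC with a shared key stands in for a real public-key signature scheme, and the contract text and key are hypothetical:

```python
import hashlib
import hmac

# A human-readable legal text is the single source of truth.
contract_text = "Seller delivers 10 units to Buyer by 2019-12-01 for $500."
signing_key = b"hypothetical-shared-key"  # stand-in for a real signing key

# Machine-readable fingerprint of the exact wording, then a signature over it.
digest = hashlib.sha256(contract_text.encode()).hexdigest()
signature = hmac.new(signing_key, digest.encode(), hashlib.sha256).hexdigest()

def verify(text, sig, key):
    """Check that `text` is exactly the wording that was signed."""
    d = hashlib.sha256(text.encode()).hexdigest()
    expected = hmac.new(key, d.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, sig)

print(verify(contract_text, signature, signing_key))                  # True
print(verify(contract_text + " (amended)", signature, signing_key))   # False
```

Because both people and programs can check the same fingerprint, any later dispute about "which version did we sign?" is settled mechanically, which is exactly the mediator-free property the paragraph describes.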
Interoperability between Blockchains:
Blockchain interoperability aims to move information across several networks or blockchains; such cross-chain services improve the everyday usefulness of blockchains. 2019 will see progress in interoperability technology, with projects such as Blocknet, Aion, Wanchain, and others.
Stablecoins are a side product of the cryptocurrency world, designed so that their value stays stable regardless of market conditions. Most stablecoins are fiat-backed, but some are backed by commodities. Their main applications are everyday currency transactions and P2P payments.
Security tokens are replacing ICOs because they are more secure, protect investors' rights, and redefine the whole investment process; in 2019, investors will increasingly prefer security token offerings (STOs) over ICOs.
Financial regulation is a form of supervision that subjects financial institutions to certain requirements, guidelines, and restrictions aimed at maintaining the integrity of the financial system; it can be handled by government or non-government organizations. Financial regulation was initiated by the Dutch authorities in the early modern period: in 1610 they restricted short selling, in which the seller does not own the asset but borrows it and returns it after a short time, hoping to profit. Regulation then developed over time until it took the form of the banking supervision we know today. The financial crisis affected regulation in a lasting way, as regulators put forth a substantial number of new, strengthened rules and expanded requirements. 2018 saw a strong legislative focus on protecting consumers and investors while encouraging financial technology innovation. 2019 should bring a big leap in financial regulation: Asian regulators will continue their 2018 trajectory of embedding global post-crisis reforms, with the Asia-Pacific outlook serving as a guide to trends across the region, while in Europe 2019 is expected to be a year of regulatory continuity, with the first half of the year finalizing legislative initiatives to complete the Banking Union, strengthen the EMU, and advance the Capital Markets Union.
largest technology construction projects in 2019
London Crossrail: Crossrail extends London's railway system with 73 miles (117 kilometers) of route linking Berkshire and Buckinghamshire to the capital. The new route carries the name Elizabeth Line and divides into two branches. This huge project is expected to cost around £15 billion; approval to start work was given in 2008, and the line is expected to open in autumn 2019.
Benban Solar Park:
Benban Solar Park will be one of the world's largest photovoltaic power stations, expected to generate 1,650 MW (megawatts). It is located in Upper Egypt, near Aswan. Benban is part of Egypt's Nubian Suns program, which aims to generate 20% of Egypt's total power needs from renewables; this national project is expected to start working by the end of 2019.
“The root problem with conventional currency is all the trust that's required to make it work. The central bank must be trusted not to debase the currency, but the history of fiat currencies is full of breaches of that trust.” – Satoshi Nakamoto, 02/11/2009

As we’ve written previously, the genesis block is often cited as a criticism of the 2008 bailout. However, the article it references makes clear that the bailout had already occurred, and reveals that the government was poised to go a step further by buying up the toxic bank assets as part of a nationalization effort. In this scenario, according to the Times, "a 'bad bank' would be created to dispose of bad debts. The Treasury would take bad loans off the hands of troubled banks, perhaps swapping them for government bonds. The toxic assets, blamed for poisoning the financial system, would be parked in a state vehicle or 'bad bank' that would manage them and attempt to dispose of them while 'detoxifying' the main-stream banking system." The article outlines a much more nightmarish scenario than bank bailouts, one that would effectively remove any element of private enterprise from banking and use the State to seize the banks' assets.
“It would have been nice to get this attention in any other context. WikiLeaks has kicked the hornet's nest, and the swarm is headed towards us.” – Satoshi Nakamoto, 12/11/2010

Why anyone would want to put our opportunity for sound monetary policy in jeopardy to enable illegal trading at the base protocol level is beyond me. I respect anyone who holds an anarcho-capitalist ideology. But please don't debase my currency by putting it at risk of legal intervention just because you want to impose that ideology on the world.
“It is strictly necessary that the longest chain is always considered the valid one. Nodes that were present may remember that one branch was there first and got replaced by another, but there would be no way for them to convince those who were not present of this. We can't have subfactions of nodes that cling to one branch that they think was first, others that saw another branch first, and others that joined later and never saw what happened. The CPU power proof-of-work vote must have the final say. The only way for everyone to stay on the same page is to believe that the longest chain is always the valid one, no matter what.” – Satoshi Nakamoto, 11/09/2008
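The longest-chain rule in the quote above can be sketched as a tiny selection function. This is an illustrative simplification: real nodes compare accumulated proof-of-work across valid chains, not just block count, and validate every block first.

```python
# Sketch of the rule "the longest chain is always the valid one":
# given competing branches, every node deterministically picks the same winner.
def select_valid_chain(branches):
    """Pick the longest of the competing chains (lists of block ids).
    Ties here go to the first branch seen; real nodes use accumulated work."""
    return max(branches, key=len)

branch_a = ["genesis", "a1", "a2"]
branch_b = ["genesis", "b1", "b2", "b3"]  # longer branch wins, even if seen later
print(select_valid_chain([branch_a, branch_b]))
```

The point Satoshi makes is that this rule needs no memory of which branch appeared first: any node, joining at any time, reaches the same conclusion from the chains alone.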
Ferrum Network – The First High-Speed Interoperability Network for Real-World Financial Applications (submitted by aregata5 to BountyICO)
Who would have thought, 10, 20, or 30 years ago, how our world would develop? Previously we had no idea that technologies such as the Internet, mobile and cellular communications, phones and smartphones, and blockchain with its numerous derivative coins would become a normal part of everyday life. When blockchain technology was in its infancy, I don't think even Satoshi Nakamoto could have guessed what chain reaction of coins he would launch all over the world. Ten years ago there was only one cryptocurrency, Bitcoin; now there are more than 2,000. But despite this dynamic growth of cryptocurrencies, people still face a number of difficulties in using them, in part due to the lack of practical tools. Of course, many specialists are developing various subsystems aimed at solving these problems, but as practice shows, the field is still at an early stage; slowly but surely, however, humanity will get there. The project we will talk about today is called FERRUM NETWORK.
About the project: Ferrum Network's main task is to interconnect all issued coins with one another, carrying out these functions in the best possible way by offering the fastest transfers and minimal fees. The project has its own decentralized foundation, designed specifically for the platform's tasks, on which exchanges between all cryptocurrencies existing around the world will take place. In simple terms, FERRUM NETWORK offers high-speed, low-risk cryptocurrency trading, supported by the friendly environment of the Ferrum blockchain and a built-in AI assistant.
Advantages and features: As you will have understood, FERRUM NETWORK will act as a kind of exchange on which users can quickly manage their assets, buying and selling coins regardless of which blockchain they were created on. All Ferrum payment gateways are designed to provide the highest speed at the lowest possible cost, which in turn should lead to more efficient decentralized exchanges throughout the crypto world. The exchange will be complemented by the built-in mobile wallet UniFyre and its offshoot, a cold hardware wallet whose technical characteristics are considered among the most reliable of modern hardware wallets. To increase the reliability and security of all exchange processes, the FERRUM NETWORK developers intend to use their own internal token on the new blockchain, through which all functions of exchanging one cryptocurrency for another will be carried out. The whole process is aimed at eliminating any fraudulent activity by participants, and all operations will proceed as quickly as possible, without difficulties or hitches for users.
Token: As I wrote earlier, all kinds of cryptocurrency transactions on FERRUM NETWORK will be carried out with its internal token, FRM, which makes every transaction fast, low-cost, and highly reliable. Importantly, FRM conversion will be available for both digital and fiat currencies, making use of FRM convenient and accessible. The distribution of tokens is as follows:
Conclusion: Of course, to appreciate all the advantages of FERRUM NETWORK you need to try it personally, since the whole value of such projects lies in their practical application. Generally speaking, I find the concept very interesting and well engineered, because it contains the technological functions and tools needed in the daily life of millions of users of the blockchain market. For a deeper analysis of FERRUM NETWORK I suggest you study the technical document presented in the White Paper and put your questions to the development team; all the necessary resources are listed just below this article.
Official resources of the Ferrum Network project:
📷 WEBSITE: https://ferrum.network
📷 TELEGRAM: http://t.me/ferrum_network
📷 WHITEPAPER: https://drive.google.com/file/d/1chjmvP_Gmj6n9IeV4mGV_BBjY0hCSV9V/view
📷 ANN THREAD: https://bitcointalk.org/index.php?topic=5134952
📷 FACEBOOK: http://facebook.ferrum.network/
📷 TWITTER: https://twitter.com/FerrumNetwork
📷 MEDIUM: https://medium.com/ferrumnetwork
📷 REDDIT: https://www.reddit.com/FerrumNetwork/
📷 LINKEDIN: http://www.linkedin.com/company/ferrumnet/
📷 INSTAGRAM: http://instagram.ferrum.network/
📷 YOUTUBE: http://www.youtube.com/channel/UCN658dMRTaH4C4dP32VHi6Q

AUTHOR
BTT username: Mr.Noda
BTT profile link : https://bitcointalk.org/index.php?action=profile;u=1297923
So I'm Steve Shadders, and at nChain I am the director of solutions in engineering, and specifically for Bitcoin SV I am the technical director of the project, which means that I'm a bit less hands-on than Daniel, but I handle a lot of the liaison with the miners that commissioned the project.Daniel:
Hi, I’m Daniel. I’m the lead developer for Bitcoin SV. As the team's grown, that means that I do less actual coding myself but more organizing the team and organizing what we’re working on.Connor: 0:03:23.07,0:04:15.98
I mean yes, well, we’ve been in touch with the developer teams for quite some time - I think a bi-weekly meeting of Bitcoin Cash developers across all implementations started around November last year. I myself joined those in January or February of this year, and Daniel a few months later. So we communicate with all of those teams and I think, you know, it's not been without its challenges. It's well known that there's a lot of disagreement around it, but what I do look forward to in the near future is a day when the consensus issues themselves are all rather settled, and if we get to that point then there's not going to be much reason for the different developer teams to disagree on stuff. They might disagree on non-consensus related stuff, but that's not the end of the world because, you know, Bitcoin Unlimited is free to go and implement whatever they want in the backend of Bitcoin Unlimited, and Bitcoin SV is free to do whatever they want in their backend, and if they interoperate on a non-consensus level, great. If they don't, it's not such a big problem - there will obviously be bridges between the two. So, yeah, I think going forward the complications of having so many personalities with wildly different ideas are going to get less and less.Cory: 0:06:00.59,0:06:19.59
Sure, yeah, so our release was concentrated on stability, right, with the first release of Bitcoin SV, and that involved doing a large amount of additional testing - particularly not so much at the unit test level but at the system test level: setting up test networks, performing tests, and making sure that the software behaved as we expected, right. Confirming the changes we made, making sure that there aren’t any other side effects. Because, you know, it was quite a rush to release the first version, we've got our test results documented, but not in a way that we can really release them. We're thinking about doing that but we’re not there yet.Steve: 0:07:50.25,0:09:50.87
Just to tidy that up - we've spent a lot of our time developing really robust test processes, and the reporting is something that we can read on our internal systems easily, but we need to tidy that up to give it out for public release. The priority for us was making sure that the software was safe to use. We've established a test framework that involves a progression of code changes through multiple test environments - I think it's five different test environments before it gets the QA stamp of approval - and as for the question about the testnet, yeah, we've got four of them. We've got Testnet One and Testnet Two. A slightly different numbering scheme to the Testnet Three that everyone's probably used to – that’s just how we reference them internally. They're [1 and 2] both forks of Testnet Three. [Testnet] One we used for activation testing, so we would test things before and after activation - that one’s set to reset every couple of days. The other one [Testnet Two] was set to post-activation so that we can test all of the consensus changes. The third one was a performance test network which I think most people have probably heard us refer to before as the Gigablock Testnet. I get my tongue tied every time I try to say that word, so I've started calling it the Performance test network, and I think we're planning on having two of those: one that we can just do our own stuff with and experiment on without having to worry about external unknown factors - other people joining it and doing stuff that we don't know about that affects our ability to baseline performance tests - but the other one (which I think might still be a work in progress, so Daniel might be able to answer that one) is one where basically everyone will be able to join and they can try and mess stuff up as bad as they want.Daniel: 0:09:45.02,0:10:20.93
Yeah, so we recently shared the details of Testnet One and Two with the other BCH developer groups. The Gigablock test network we've shared with one group so far, but yeah, we're building it, as Steve pointed out, to be publicly accessible.Connor: 0:10:18.88,0:10:44.00
That's what the Gigablock test network is. So the Gigablock test network was first set up by Bitcoin Unlimited with nChain’s help and they did some great work on that, and we wanted to revive it. So we wanted to bring it back and do some large-scale testing on it. It's a flexible network - at one point we had eight different large nodes spread across the globe, sort of mirroring the old one. Right now we've scaled back because we're not using it at the moment, so there are, I think, three. We have produced some large blocks there and it's helped us a lot in our research into the scaling capabilities of Bitcoin SV, so it's guided the work that the team’s been doing for the last month or two on the improvements that we need for scalability.Steve: 0:11:56.48,0:13:34.25
I think that's actually a good point to kind of frame where our priorities have been, in kind of two separate stages. I think, as Daniel mentioned before, because of the time constraints we kept the change set for the October 15 release as minimal as possible - it was just the consensus changes. We didn't do any work on performance at all, and we put all our focus and energy into establishing the QA process and making sure that that change was safe, and that was a good process for us to go through. It highlighted what we were missing in our team – we got our recruiters very busy recruiting a Test Manager and more QA people. The second stage after that is performance-related work which, as Daniel mentioned, the results of our performance testing fed into - what tasks we were gonna start working on for the performance-related stuff. Now that work is still in progress - for some of the items that we identified, the code is done and is going through the QA process, but it’s not quite there yet. That's basically the two-stage process that we've been through so far. We have a roadmap that goes further into the future that outlines more stuff, but primarily it’s been QA first, performance second. The performance enhancements are close and on the horizon, but some of that work should be ongoing for quite some time.Daniel: 0:13:37.49,0:14:35.14
Some of the changes we need for the performance are really quite large and really get down into the base level of the software. There are kind of two groups of them, mainly: ones that are internal to the software – to Bitcoin SV itself - improving the way it works inside, and then there are other ones that interface it with the outside world. One of those in particular we're working closely with another group on, to make a compatible change - it's not consensus-changing or anything like that - but having the same interface on multiple different implementations will be very helpful, right, so we're working closely with them to make improvements for scalability.Connor: 0:14:32.60,0:15:26.45
I'm often quoted on Twitter and Reddit - I've said before the infinite block attack is bullshit. Now, that's a statement that I suppose is easy to take out of context, but I think the 128MB limit is something there are probably two schools of thought about. There are some people who think that you shouldn't increase the limit to 128MB until the software can handle it, and there are others who think that it's fine to do it now, so that the limit is already increased when the software improves and can handle it, and you don’t run into the limit. Obviously we’re from the latter school of thought. As I said before, we've got a bunch of performance increases - performance enhancements - in the pipeline. If we wait till May to increase the block size limit to 128MB then those performance enhancements will go in, but we won't be able to actually demonstrate them on mainnet. As for the infinite block attack itself, I mean there are a number of mitigations that you can put in place. Firstly, you know, going down into a bit of the tech detail - when you send a block message, or send any peer-to-peer message, there's a header which has the size of the message. If someone says they're sending you a 30MB message and you're receiving it and it gets to 33MB, then obviously you know something's wrong, so you can drop the connection. If someone sends you a message that's 129MB and you know the block size limit is 128MB, you know it’s kind of pointless to download that message. So, I mean, these are just some of the mitigations that you can put in place. When I say the attack is bullshit, I mean it is bullshit in the sense that it's really quite trivial to prevent it from happening. I think there is a bit of a school of thought in the Bitcoin world that if it's not in the software right now then it kind of doesn't exist. I disagree with that, because there are small changes that can be made to work around problems like this.
One other aspect of the infinite block attack - and let’s not call it the infinite block attack, let's just call it the large block attack - is that it takes a lot of time to validate. That we've gotten around by having parallel pipelines for blocks to come in, so you've got a block that's coming in that's stuck for two hours or whatever downloading and validating. At some point another block is going to get mined by someone else, and as long as those two blocks aren't stuck in a serial pipeline then, you know, the problem kind of goes away.Cory: 0:18:26.55,0:18:48.27
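The header-size mitigation Steve describes can be sketched in a few lines. This is an illustrative sketch, not nChain's code: the function names and the 128 MB constant are our own assumptions about how such a check might look.

```python
# Illustrative sketch: abort a peer-to-peer message download as soon as the
# byte stream exceeds either the length declared in the message header or
# the node's block size limit, instead of downloading the whole thing.

MAX_BLOCK_SIZE = 128 * 1024 * 1024  # assumed 128 MB limit, as discussed above

def receive_message(declared_len, chunks):
    """chunks is an iterable of byte strings arriving from the socket.
    Returns the full payload, or raises to signal 'drop the connection'."""
    if declared_len > MAX_BLOCK_SIZE:
        # Pointless to download a message larger than any valid block.
        raise ConnectionError("declared size exceeds block size limit")
    received = bytearray()
    for chunk in chunks:
        received += chunk
        if len(received) > declared_len:
            # Peer is sending more bytes than its header promised: drop it.
            raise ConnectionError("peer exceeded declared message size")
    return bytes(received)
```

The point being made in the interview is exactly this: both checks are cheap and local, which is why the "infinite block" scenario is trivial to defend against.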
Yes, there have been concerns raised about it. I think what people forget is that compact blocks and xThin exist, so a 32MB block does not send 32MB of data in most cases - almost all cases. The concern here that I think I do find legitimate is the Great Firewall of China. Very early on in Bitcoin SV we started talking with miners on the other side of the firewall, and that was one of their primary concerns. We had anecdotal reports of people who were having trouble getting a stable connection any faster than 200 kilobits per second, and even with compact blocks you still need to get the transactions across the firewall. So we've done a lot of research into that - we tested our own links across the firewall, or rather CoinGeek's links across the firewall, as they’ve given us access to some of their servers so that we can play around, and we were able to get sustained rates of 50 to 90 megabits per second, which pushes that problem quite a long way down the road into the future. I don't know the maths off the top of my head, but the size of the blocks that that can sustain is pretty large. So we're looking at a couple of options - it may well be that the chattiness of the peer-to-peer protocol causes some of these issues with the Great Firewall, so we have someone building a bridge concept/tool where you basically just have one kind of TX vacuum on either side of the firewall that collects them all up and sends them off every one or two seconds as a single big chunk, to eliminate some of that chattiness. The other is we're looking at building a multiplexer that will sit and send stuff up to the peer-to-peer network on one side, send it over splitters - over multiple links - and reassemble it on the other side, so we can sort of transit the Great Firewall without too much trouble. But, I mean, getting back to the core of your question - yes, there is a theoretical limit to block size from propagation time, and that's kind of where Moore's Law comes in.
Put in faster links and you kick that can further down the road, and you just keep on putting in faster links. I don't think 128MB blocks are going to be an issue, though, with the speed of the internet that we have nowadays.Connor: 0:21:34.99,0:22:17.84
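The "TX vacuum" idea Steve describes - buffer transactions on one side of the firewall and flush them as one big chunk every second or two instead of sending each one as its own chatty message - can be sketched roughly like this. This is a hypothetical illustration; the class name, interval, and transport callback are our assumptions, not the actual bridge tool.

```python
# Hypothetical sketch of a transaction "vacuum": collect transactions and
# ship them across a costly link as one batched message per interval.
import time

class TxVacuum:
    def __init__(self, send, interval=2.0):
        self.send = send              # callable that ships one big chunk
        self.interval = interval      # seconds between flushes
        self.buffer = []
        self.last_flush = time.monotonic()

    def add(self, tx: bytes):
        self.buffer.append(tx)
        if time.monotonic() - self.last_flush >= self.interval:
            self.flush()

    def flush(self):
        if self.buffer:
            # One message carrying many transactions, instead of many
            # small messages - this is the chattiness reduction.
            self.send(b"".join(self.buffer))
        self.buffer = []
        self.last_flush = time.monotonic()
```

A real bridge would also need framing to split the chunk back into individual transactions on the far side; the batching logic is the part relevant to the discussion above.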
It's interesting, the decision - we were initially planning on removing that cap altogether, and the next cap that comes into play after that (the next effective cap) is a 10,000 byte limit on the size of the script. We took a more conservative route and decided to wind that back to 500 - it's interesting that we got some criticism for that, when the primary criticism I think that was leveled against us was that it’s dangerous to increase that limit to unlimited. We did that because we’re being conservative. We did some research into these n-squared bugs - sorry, attacks - that people have referred to. We identified a few of them and we had a hard think about it and thought - look, if we can find this many in a short time we can fix them all (the whack-a-mole approach), but it does suggest that there may well be more unknown ones. So we thought about taking the whack-a-mole approach, but that doesn't really give us any certainty. We will fix all of those individually, but a more global approach is to make sure that if anyone does discover one of these scripts it doesn't bring the node to a screaming halt. The problem here is that because the Bitcoin node is essentially single-threaded, if you get one of these scripts that locks up the script engine for a long time, everything that's behind it in the queue has to stop and wait. So what we wanted to do - and this is something we've got an engineer actively working on right now - is, once that script validation code path is properly parallelized (parts of it already are), we’ll basically assign a few threads for well-known transaction templates, and a few threads for any type of script. So if you get a few scripts that are nasty and lock up a thread for a while, that's not going to stop the node from working, because you've got these other kind of lanes of the highway that are exclusively reserved for well-known script templates and they'll just keep on passing through.
Once you've got that in place, I think we're in a much better position to get rid of that limit entirely, because the worst that's going to happen is your non-standard script pipelines get clogged up but everything else will keep ticking along. There are other mitigations for this as well - I mean, you could always put a time limit on script execution if you wanted to, and that would be something that would be up to individual miners. Bitcoin SV's job, I think, is to provide the tools for the miners, and the miners can then choose, you know, how to make use of them - if they want to set time limits on script execution then that's a choice for them.Daniel: 0:25:34.82,0:26:15.85
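The "lanes of the highway" idea - reserving some validation threads for well-known script templates so a slow non-standard script can only clog the generic lane - can be sketched as follows. A minimal sketch, assuming names and a template check of our own; the real work described here is inside the node's C++ validation path, not Python.

```python
# Minimal sketch: route script validation to separate thread pools so
# hostile or slow scripts cannot stall well-known payment templates.
from concurrent.futures import ThreadPoolExecutor

standard_lane = ThreadPoolExecutor(max_workers=4)  # reserved for known templates
generic_lane = ThreadPoolExecutor(max_workers=2)   # any other script goes here

def is_well_known(script: bytes) -> bool:
    # Stand-in check: real code would match full templates such as P2PKH.
    # 0x76 0xa9 is the OP_DUP OP_HASH160 prefix of a P2PKH script.
    return script.startswith(b"\x76\xa9")

def submit_for_validation(script: bytes, validate):
    """Pick a lane based on the script shape and run `validate` there."""
    lane = standard_lane if is_well_known(script) else generic_lane
    return lane.submit(validate, script)
```

With this split, a nasty script can at worst tie up the generic lane's workers; the standard lane keeps passing P2PKH-style payments through.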
Yeah, I'd like to point out that a node here, when it receives a transaction through the peer-to-peer network, doesn't have to accept that transaction - you can reject it. If it looks suspicious to the node it can just say, you know, we're not going to deal with that; or if it takes more than five minutes to execute, or more than a minute even, it can just abort and discard that transaction, right. The only time we can’t do that is when it's in a block already, but then it could decide to reject the block as well. Those are all possibilities in the software.Steve: 0:26:13.08,0:26:20.64
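The abort-and-discard policy Daniel describes - a node is free to abandon a loose transaction whose script runs too long - amounts to a deadline check inside the interpreter loop. A hedged sketch, with our own names and a toy opcode representation:

```python
# Sketch of time-limited script execution: check a deadline between
# opcodes and discard the transaction if it overruns. This is a policy
# choice for loose transactions, not a consensus rule.
import time

class ScriptTimeout(Exception):
    pass

def run_script(ops, max_seconds=60.0):
    """ops: sequence of callables, each mutating the stack (toy opcodes)."""
    deadline = time.monotonic() + max_seconds
    stack = []
    for op in ops:
        if time.monotonic() > deadline:
            # Not in a block, so we owe this transaction nothing: abort.
            raise ScriptTimeout("script exceeded execution budget")
        op(stack)
    return stack
```

As Steve notes right after, a transaction already in a block was validated by someone else, so this budget only ever applies to mempool admission.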
Yeah, and if it's in a block already it means someone else was able to validate it so…Cory: 0:26:21.21,0:26:43.60
Well, I mean, one of the most significant things is that, other than two which are minor variants of DUP and MUL, they represent almost the complete set of original op codes. I think that's not necessarily a technical issue, but it's an important milestone. MUL is one that I've heard some interesting comments about. People ask me why we are putting OP_MUL back in if we're planning on changing them to big number operations instead of the 32-bit limit that they're currently imposed upon. The simple answer to that question is that we currently have all of the other arithmetic operations except for OP_MUL. We’ve got add, divide, subtract, modulo – it’s odd to have a script system that's got all the mathematical primitives except for multiplication. The other answer to that question is that they're useful - we've talked about a Rabin signature solution that basically replicates the function of DATASIGVERIFY. That's just one example of a use case for this - most cryptographic primitive operations require mathematical operations, and bit shifts are useful for a whole ton of things. So it's really just about completing that work and completing the script engine - or rather not completing it, but putting it back the way that it was meant to be.Connor 0:28:20.42,0:29:22.62
Yeah, there are two parts there - the big number one, and LSHIFT being a logical shift instead of arithmetic. So when we re-enabled these opcodes we looked at them carefully and adjusted them slightly, as we did in the past with OP_SPLIT. The new LSHIFT and RSHIFT are bitwise operators. They can be used to implement arithmetic-based shifts - I think I've posted a short script that did that - but we can't do it the other way around, right. You couldn't use an arithmetic shift operator to implement a bitwise one. It's because of the ordering of the bytes in the arithmetic values - the values that represent numbers. They're little-endian, which means they're swapped around relative to what many other systems - what I'd consider normal - use, which is big-endian. And if you start shifting that properly as a number, then the shifting sequence in the bytes is a bit strange, so it couldn't go the other way around - you couldn't implement bitwise shift with arithmetic. So we chose to make them bitwise operators - that's what we proposed.Steve: 0:31:10.57,0:31:51.51
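Connor's point - a bitwise shift can implement an arithmetic shift on Bitcoin's little-endian numbers, but not the other way around - is easy to demonstrate. This is our own illustration, not code from the SV codebase: reverse the bytes to big-endian, do the plain bitwise shift, reverse back, and you have multiplication by a power of two; shifting the little-endian bytes directly gives the wrong number whenever a bit crosses a byte boundary.

```python
# Why a bitwise shift on little-endian number encodings is not an
# arithmetic shift, and how the arithmetic one composes from the bitwise.

def bitwise_lshift(data: bytes, n: int) -> bytes:
    """Shift a byte string left by n bits, keeping its length (bits fall off)."""
    width = len(data) * 8
    value = int.from_bytes(data, "big")        # treat bytes as a raw bitstring
    value = (value << n) & ((1 << width) - 1)
    return value.to_bytes(len(data), "big")

def arithmetic_lshift_le(num_le: bytes, n: int) -> bytes:
    """Multiply a little-endian number by 2**n using only the bitwise shift:
    swap to big-endian byte order, shift, swap back."""
    return bitwise_lshift(num_le[::-1], n)[::-1]
```

For the value 128 in two little-endian bytes (`b"\x80\x00"`), the arithmetic shift correctly yields 256 (`b"\x00\x01"`), while the naive bitwise shift of the same bytes pushes the high bit off the end and returns zero.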
That was essentially a decision that was actually made in May - or rather a consequence of decisions that were made in May. So in May we reintroduced OP_AND, OP_OR, and OP_XOR, and the decision to replace three different string operators with OP_SPLIT was also made then. So that was not a decision that we made unilaterally; it was a decision that was made collectively with all of the BCH developers - well, not all of them were actually in all of the meetings, but they were all invited.Daniel: 0:31:48.24,0:32:23.13
Another example of that is that we originally proposed OP_2DIV and OP_2MUL, I think - a single operator that multiplies the value by two, right - but it was pointed out that that can very easily be achieved by just doing a multiply by two instead of having a separate operator for it, so we scrapped those, we took them back out, because we wanted to keep the number of operators to a minimum, yeah.Steve: 0:32:17.59,0:33:47.20
There was an appetite around for keeping the operators minimal. I mean, the idea to replace OP_SUBSTR, OP_LEFT and OP_RIGHT with the OP_SPLIT operator actually came from Gavin Andresen. He made a brief appearance in the Telegram workgroups while we were working out what to do with the May opcodes, and obviously Gavin's word carries a lot of weight and we listen to him. But because we had chosen to implement the May opcodes (the bitwise opcodes) and treat the data as big-endian data streams (well, sorry, big-endian is not really applicable - just plain data strings), it would have been completely inconsistent to implement LSHIFT and RSHIFT as integer operators, because then you would have had a set of bitwise operators that operated on two different kinds of data, which would have just been nonsensical and very difficult for anyone to work with, so yeah. I mean, it's a bit like P2SH - it wasn't a part of the original Satoshi protocol. Once some things are done they're done, and, you know, if you want to make forward progress you've got to work within the framework that exists.Daniel: 0:33:45.85,0:34:48.97
When we get to the big number ones then it gets really complicated, you know - number implementations - because then you can't change the behavior of the existing opcodes, and I don't mean OP_MUL, I mean the other ones that have been there for a while. You can't suddenly make them big number ones without seriously looking at what scripts there might be out there and the impact of that change on those existing scripts, right. The other point is you don't know what scripts are out there because of P2SH - there could be scripts that you don't know the content of, and you don't know what effect changing the behavior of these operators would have on them. The big number thing is tricky, so another option might be - yeah, I don't know what the options are; it needs some serious thought.Steve: 0:34:43.27,0:35:24.23
That’s something we've reached out to the other implementation teams about - we really would like their input on the best ways to go about restoring big number operations. It has to be done extremely carefully, and I don't know if we'll get there by May next year, or when, but we’re certainly willing to put a lot of resources into it and we're more than happy to work with BU or XT or whoever wants to work with us on getting that done and getting it done safely.Connor: 0:35:19.30,0:35:57.49
I’d actually like to repurpose the concept. I think I mentioned before multi-threaded script validation and having some dedicated well-known script templates - when you say the words well-known script template, there’s already a check in Bitcoin that kind of tells you if it's well-known or not, and that's IsStandard. I'm generally in favor of getting rid of the notion of standard transactions, but it's actually a decision for miners, and it's really more of a behavioral change than it is a technical change. There's a whole bunch of configuration options that miners can set that affect what they consider to be standard and not standard, but the reality is not too many miners are using those configuration options. So, I mean, standard transactions as a concept is meaningful to an arbitrary degree, I suppose, but, yeah, I would like to make it easier for people to get non-standard scripts into Bitcoin so that they can experiment, and from discussions I’ve had with CoinGeek, they’re quite keen on making their miners accept, you know, at least initially, a wider variety of transactions.Daniel: 0:37:32.85,0:38:07.95
So I think IsStandard will remain important within the implementation itself for efficiency purposes, right - you want to streamline the base use case of cash payments by recognizing them and prioritizing them. That's where it will remain important, but on the interfaces from the node to the rest of the network, yeah, I could easily see it being removed.Cory: 0:38:06.24,0:38:35.46
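Daniel's repurposing of IsStandard - keep it internally as a prioritization hint rather than a relay gate - can be sketched like this. The template check below is a simplified stand-in of our own (real IsStandard checks several transaction types, sizes, and dust rules), shown only to illustrate the routing idea.

```python
# Sketch: IsStandard demoted from a reject/accept rule to a scheduling hint.
# Only the P2PKH shape is recognized here, as a simplified stand-in.

def is_standard(script: bytes) -> bool:
    # A canonical P2PKH script is exactly 25 bytes:
    # OP_DUP OP_HASH160 <20-byte pubkey hash> OP_EQUALVERIFY OP_CHECKSIG
    return (
        len(script) == 25
        and script.startswith(b"\x76\xa9\x14")
        and script.endswith(b"\x88\xac")
    )

def priority(script: bytes) -> int:
    # Standard cash payments go to the fast path; non-standard scripts are
    # still processed, just queued behind them - not rejected.
    return 0 if is_standard(script) else 1
```

The difference from today's behavior is the return value's meaning: a low-priority script is scheduled later, not dropped at the network interface.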
Well, in November there's going to be a divergence of consensus rules regardless of whether we implement these new opcodes or not. Bitcoin ABC released their spec for the November hard fork change - I think on August 16th or 17th, something like that - and their client as well, and it included CTOR and it included DSV. Now, for the miners that commissioned the SV project, CTOR and DSV are controversial changes, and once they're in, they're in. They can't be reversed - I mean, CTOR maybe you could reverse at a later date, but DSV, once someone's put a P2SH transaction - or even a non-P2SH transaction - into the blockchain using that opcode, it's irreversible. So it's interesting that some people refer to the Bitcoin SV project as causing a split - we're not proposing to do anything that anyone disagrees with. There might be some contention about changing the opcode limit, but what we're doing - I mean, Bitcoin ABC already published their spec for May and it is our spec for the new opcodes. So in terms of urgency - should we wait? Well, the fact is that we can't - come November, you know, it's a bit like Segwit - once Segwit was in, yes, you arguably could get it out by spending everyone's anyone-can-spend transactions, but in reality it's never going to be that easy and it's going to cause a lot of economic disruption. So, yeah, that's it. We're putting our changes in because it's not gonna make a difference either way in terms of whether there's going to be a divergence of consensus rules - there's going to be a divergence whatever our changes are. Our changes are not controversial at all.Daniel: 0:40:39.79,0:41:03.08
If we didn't include these changes in the November upgrade we'd be pushing ahead with a no-change upgrade, right, but the November upgrade is there, so we should use it while we can, adding these non-controversial changes to it.Connor: 0:41:01.55,0:41:35.61
Can I say one or two things about this – there are different ways to look at it, right. I'm an engineer - my specialization is software, so on the economics of it I hear different opinions. I trust some more than others, but I am NOT an economist. With my limited expertise I kind of agree with the ones who say it's a subsidy - it looks very much like it to me - but, yeah, that's not my area. What I can talk about is the software - so adding DSV adds really quite a lot of complexity to the code, right, and it's a big change to add that. And what are we going to do - every time someone comes up with an idea, are we going to add a new opcode? How many opcodes are we going to add? I saw reports that Jihan was talking about hundreds of opcodes or something like that, and it's like, how big is this client going to become - how big is this node - is it going to have to handle every kind of weird opcode that's out there? The software is just going to get unmanageable. And DSV - my main consideration at the beginning was, you know, if you can implement it in script you should do it, because that way it keeps the node software simple, it keeps it stable, and, you know, it's easier to test that it works properly and correctly. It's almost like adding (?) code from a microprocessor, you know - why would you do that if you can implement it already in the script that is there.Steve: 0:43:36.16,0:46:09.71
It’s actually an interesting inconsistency, because when we were talking about adding the opcodes in May, the philosophy that seemed to drive the decisions that we were able to form a consensus around was to simplify and keep the opcodes as minimal as possible (ie where you could replicate a function by using a couple of primitive opcodes in combination, that was preferable to adding a new opcode that replaced them). OP_SUBSTR is an interesting example - it's a combination of SPLIT, SWAP and DROP opcodes to achieve it. So at the really primitive script level we've got this philosophy of let's keep it minimal, and at this other (?) level the philosophy is let's just add a new opcode for every primitive function, and Daniel's right - it's a question of opening the floodgates. Where does it end? If we're just going to go down this road, it almost opens up the argument: why have a scripting language at all? Why not just hard-code all of these functions in, one at a time? You know, pay-to-public-key-hash is a well-known construct (?), so don't bother executing a script at all - but once we've done that we take away all of the flexibility for people to innovate. So it's a philosophical difference, I think, but I think it's one where the position of keeping it simple does make sense. All of the primitives are there to do what people need to do. The things that people feel like they can't do are because of the limits that exist. If we had no opcode limit at all, if you could make a gigabyte transaction - so a gigabyte script - then you could do any kind of crypto that you wanted, even with 32-bit integer operations. Once you get rid of the 32-bit limit, of course, a lot of those scripts come out a lot smaller, so a Rabin signature script shrinks from 100MB to a couple hundred bytes.Daniel: 0:46:06.77,0:47:36.65
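Steve's OP_SUBSTR example - that it composes out of SPLIT, SWAP and DROP - is easy to check on a toy stack machine. The stack machine below is our own illustration of the composition, not the SV interpreter; OP_SPLIT is modeled after its documented behavior of popping a string and an index and pushing the two halves.

```python
# Toy stack machine showing that OP_SUBSTR is redundant: the primitives
# SPLIT, SWAP and DROP already compose into it.

def op_split(stack):
    n = stack.pop()
    s = stack.pop()
    stack.append(s[:n])   # left part
    stack.append(s[n:])   # right part, left on top of the stack

def op_swap(stack):
    stack[-1], stack[-2] = stack[-2], stack[-1]

def op_drop(stack):
    stack.pop()

def substr_via_primitives(s: bytes, start: int, length: int) -> bytes:
    """Equivalent of <s> <start> SPLIT SWAP DROP <length> SPLIT DROP."""
    stack = [s, start]
    op_split(stack)        # -> s[:start], s[start:]
    op_swap(stack)
    op_drop(stack)         # discard the prefix, keep s[start:]
    stack.append(length)
    op_split(stack)        # -> s[start:start+length], s[start+length:]
    op_drop(stack)         # discard the tail
    return stack.pop()
```

Seven primitive operations replace one dedicated opcode, which is exactly the trade-off the "keep it minimal" philosophy favors.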
I lost a good six months of my life diving into script, right. Once you start getting into the language and what it can do, it is really pretty impressive how much you can achieve within script. Bitcoin was designed - was released originally - with script. I mean, it didn't have to be – instead of having a transaction with script you could have accounts, and you could say transfer, you know, so many BTC from this public key to this one - but that's not the way it was done. It was done using script, and script provides so many capabilities if you start exploring it properly. If you start really digging into what it can do, yeah, it's really amazing what you can do with script. I'm really looking forward to seeing some very interesting applications from that. I mean, Awemany's zero-conf script was really interesting, right. It relies on DSV, which is a problem (and there are some other things that I don't like about it), but him diving in and using script to solve this problem was really cool; it was really good to see that.Steve: 0:47:32.78,0:48:16.44
I asked a question of a couple of people in our research team that have been working on the Rabin signature stuff this morning, actually, and I wasn't sure where they were up to with this, but they're actually working on a proof of concept (which I believe is pretty close to done) which is a Rabin signature script - it will use smaller signatures so that it can fit within the current limits, but it will be, you know, effectively the same algorithm (as DSV), so I can't give you an exact date on when that will happen, but it looks like we'll have a Rabin signature in the blockchain soon (a mini-Rabin signature).Cory: 0:48:13.61,0:48:57.63
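The reason a Rabin signature fits into plain script, as discussed above, is that verification is nothing but modular arithmetic: a signature S for a message digest h is valid under public modulus n when S² mod n = h. A hedged simplification of our own follows (real Rabin schemes add padding so the digest lands on a quadratic residue; nChain's proof of concept is not public here):

```python
# Simplified Rabin signature verification: S*S mod n == H(m) mod n.
# Padding and digest-mapping details of real schemes are omitted.
import hashlib

def digest(message: bytes, n: int) -> int:
    # Reduce a SHA-256 digest mod n to get the verification target.
    return int.from_bytes(hashlib.sha256(message).digest(), "big") % n

def rabin_verify(message: bytes, signature: int, n: int) -> bool:
    return (signature * signature) % n == digest(message, n)
```

Since the whole check is a multiply, a modulo, and an equality test, restoring OP_MUL and big-number arithmetic is what makes an in-script version practical - which is the connection Steve draws between this proof of concept and DSV.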
I think you've got to factor in what I said before about the philosophical differences. I think new functionality can be introduced just fine. Having said that - yes, there is a place for new opcodes, but it's probably a limited place. In my opinion the cryptographic primitive functions - for example, CHECKSIG uses ECDSA with a specific elliptic curve, HASH256 uses SHA256 - at some point in the future those are going to no longer be as secure as we would like them to be, and we'll replace them with different hash functions, verification functions, at some point, but I think that's a long way down the track.Daniel: 0:49:42.47,0:50:30.3
I'd like to see more data too. I'd like to see evidence that these things are needed, and the way I could imagine that happening is that, you know, that with the full scripting language some solution is implemented and we discover that this is really useful, and over a period of, like, you know measured in years not days, we find a lot of transactions are using this feature, then maybe, you know, maybe we should look at introducing an opcode to optimize it, but optimizing before we even know if it's going to be useful, yeah, that's the wrong approach.Steve: 0:50:28.19,0:51:45.29
I think that optimization is actually going to become an economic decision for the miners. From the miners’ point of view, it's whether it makes sense for them to optimize a particular process - does it reduce costs for them such that they can offer a better service to everyone else? Yeah, so ultimately these are going to be miners’ decisions, not developer decisions. Developers of course can offer their input - I wouldn't expect every miner to be an expert on script, but as we're already seeing, miners are actually starting to employ their own developers. I’m not just talking about us - there are other miners in China that I know have got some really bright people on their staff who question and challenge all of the changes - study them and produce their own reports. We've been lucky to actually be able to talk to some of those people and have some really fascinating technical discussions with them.