Crypto Long & Short: Coinbase’s ‘Apolitical’ Stance Isn’t Nearly as Simple as It Sounds

As if the ructions of the year aren’t giving us enough cause to re-examine things we thought we understood, now we find ourselves questioning what a company is for, and what role it should occupy in society and in employees’ lives. 

Earlier this week, Coinbase co-founder and CEO Brian Armstrong published a post in which he stressed the company’s focus on the mission of creating “an open financial system for the world,” and asked that political issues be left out of workplace discourse. 

The questions this raises are huge, and the timing fits right into tectonic shifts already underway in the role of capitalism in our evolving society.

Let’s look at some of the questions, to which there are no clear answers.

  1. Armstrong says Coinbase has “an apolitical culture.” What does that even mean, in these times of growing polarization on practically everything? Even being apolitical can be taken as a political stance. What’s more, when a company whose mission is to bring “economic freedom to people all over the world” requests that activism and politics be left at the door, you get a glimpse of how institutionalized the crypto ethos is becoming. 
  2. What is an employment contract? Some will answer that it is monetary compensation for certain output. Others will argue that you give up your time in exchange for payment. If the latter, can the organization paying you dictate what you do in that time?
  3. Does a company have the right to define its own mission? The answer might seem like an obvious yes, but an extension of that is, does a company have the right to ignore topics its employees care about? Here the issue gets more divisive.
  4. Related to the previous point, is a company’s responsibility to its shareholders or its employees? Armstrong believes that focus is core to achieving the mission, and that is what shareholders have a right to expect. But the success of intelligence-based businesses largely rests on the employees. We’re not talking about widget-producing factory floors here. This is an environment in which specialized talents and inspiration matter, and those are supplied by motivated people. So, some could argue that Armstrong’s responsibility is to his employees, because that will make the company more profitable and the shareholders happy. 

There are many more, but I’m aware of pixel constraints.

As if to drive home the point, this week IBM released the results of its annual executive survey. Here’s an excerpt from the press release (my emphasis): 

“Ongoing IBV [IBM’s Institute for Business Value] consumer research has shown that the expectations employees have of their employers have shifted amidst the pandemic – employees now expect that their employers will take an active role in supporting their physical and emotional health as well as the skills they need to work in new ways.”

This is at odds with a focus on the “mission,” whatever that mission may be. And it highlights the crucial role that employees play in a firm’s success. Also from the PR:

“Participating businesses are seeing more clearly the critical role people play in driving their ongoing transformation.”

This doesn’t come from some new-wave, millennial-driven, holistic social advocate. It comes from IBM, a standard bearer for legacy enterprise, and represents how much the concept of efficient management has changed. 

Whether you agree or disagree with Armstrong’s position, you have to admit he was brave to wade into this, especially given the rumors of a planned public listing later this year.

Armstrong’s blog post is so much more than a corporate policy statement. It is likely to spark uncomfortable questions as employees seek clarification from companies struggling to navigate through issue-driven minefields. It could lead to a re-evaluation of the concept of a “social contract” between employer and employee, and whether the implicit understanding needs codifying. It could even end up being a trigger for a battle for the soul of corporations, and the meaning of value.

These are difficult times, in more ways than we can possibly realize. And the coming change in mores and expectations will be deeper than most anticipate. 

BitMEX had a really bad day

The U.S. Commodity Futures Trading Commission (CFTC) and federal prosecutors have started the quarter off with a bang, charging crypto trading platform BitMEX with facilitating unregistered trading and other violations, and arresting co-founder Samuel Reed. 

This is a big deal, as BitMEX is one of the industry’s largest trading platforms. In 2016, it introduced a derivative known as the perpetual swap (a future that doesn’t expire) to the market, with up to 100x leverage, and for many years it was the market leader in terms of derivative volume and open interest. Market infrastructure can affect prices in a young asset class: in 2014, Mt. Gox – then the largest bitcoin exchange, with approximately 70% of market share – collapsed, revealing a gaping hole where custodied bitcoin should have been. The bitcoin (BTC) price dropped by almost 50%, recovered a bit and then fell even further over the next few months. It took over two years to recover from the confidence blow.
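To make the risk in those leverage figures concrete, here is a minimal Python sketch of the arithmetic behind a leveraged perpetual-swap position. It is purely illustrative: it is not BitMEX’s margin engine, it ignores funding payments, fees and maintenance-margin rules, and the numbers are made up.

```python
# Illustrative sketch of leverage on a perpetual swap position.
# NOT BitMEX's actual margin engine: it ignores funding payments, fees and
# maintenance margin, and exists only to show why 100x leverage leaves
# almost no room for an adverse price move.

def position_summary(entry_price: float, notional_usd: float, leverage: float):
    """Return initial margin and the approximate adverse move that wipes it out."""
    initial_margin = notional_usd / leverage           # trader's own collateral
    liquidation_move_pct = 100.0 / leverage            # ~1/leverage move erases the margin
    liquidation_price_long = entry_price * (1 - 1 / leverage)
    return initial_margin, liquidation_move_pct, liquidation_price_long

margin, move_pct, liq_price = position_summary(entry_price=10_500,
                                               notional_usd=100_000,
                                               leverage=100)
print(f"Margin posted: ${margin:,.0f}")                        # $1,000 controls $100,000
print(f"Adverse move to liquidation: ~{move_pct:.1f}%")        # ~1.0% against a 100x long
print(f"Approx. liquidation price (long): ${liq_price:,.0f}")  # ~$10,395
```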

As recently as a couple of years ago, BitMEX was the largest derivatives exchange, and this week’s news could have had a similar effect given the relatively high leverage in its contracts. Yet the BTC price initially fell almost 4% on the news, which is not insignificant, but nowhere near the systemic jolt many expected. It then recovered 1.5% before being blindsided by other market-shaking non-crypto-related news.

In other words, BitMEX’s run-in with the law will have an impact, but it is unlikely to be material.

In recent months, BitMEX lost its dominant position to OKEx, Huobi and Binance, and now ranks fourth in terms of daily volume and second in terms of open interest. Even if BitMEX ends up closing, the market repercussions will be felt, but will not be systemically damaging, as there are alternative trading venues. 

BTC futures open interest ($B)
Source: skew.com
24-hour BTC futures volumes ($B)
Source: skew.com

What’s more, while the domain name could be seized and withdrawals impeded (the exchange requires three of the four authorized signatories to approve withdrawals, and so far one has been arrested), BitMEX is unlikely to close – at time of writing, withdrawals were proceeding without a hitch, and were significant but not catastrophic for the exchange.

Bitcoin: total transfer volume from BitMEX
Source: glassnode
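As an aside on the withdrawal mechanics mentioned above, here is a toy sketch of a 3-of-4 approval rule. It is not how BitMEX’s wallet actually works (real withdrawals use cryptographic multisignature transactions, and the signer names here are hypothetical); it only illustrates why one signatory being arrested does not, by itself, freeze withdrawals.

```python
# Toy sketch of a 3-of-4 approval rule. Purely illustrative: real exchange
# withdrawals use cryptographic multisignature transactions, not a counter,
# and the signer names below are hypothetical.

AUTHORIZED_SIGNERS = {"signer_a", "signer_b", "signer_c", "signer_d"}
THRESHOLD = 3

def withdrawal_approved(approvals: set) -> bool:
    """A withdrawal clears only if at least 3 of the 4 authorized signers approve."""
    return len(approvals & AUTHORIZED_SIGNERS) >= THRESHOLD

# With one signer unavailable, the remaining three can still approve:
print(withdrawal_approved({"signer_a", "signer_b", "signer_c"}))  # True
# Two signers alone cannot:
print(withdrawal_approved({"signer_a", "signer_b"}))              # False
```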

Even more importantly, this news does not change the fundamentals of bitcoin. It may affect trading volumes as positions are closed and reopened elsewhere. But the underlying technology and the potential use case remain intact.

And, rather than weaken confidence in crypto market infrastructure, this news is likely to enhance it. One of the reasons cited by the SEC for its rejection of all bitcoin ETF proposals so far is the lack of surveillance on significant offshore exchanges. This action by the CFTC feels like part of a “bring out the broom” initiative that will improve the rigor and oversight of market players, which should boost institutional confidence and product range. It could even be a tentative step towards a bitcoin ETF approval.

3 things from Q3

As we are now into the final stretch of what has been a spectacularly tumultuous year, it’s time to look back at a few of the recent developments in crypto asset markets that I find particularly interesting. There are so many to choose from, as the speed of progress has been astonishing. Our CoinDesk Quarterly Review 2020 Q3, which dives into some of the main market drivers, is out on Monday – keep an eye out for it in our Research Hub.

1)    Stablecoins were the breakout protagonist in terms of market activity, and not just in terms of market cap growth. Earlier in Q3 the on-chain transfer value of fiat-backed stablecoins passed that of bitcoin (BTC) for the first time. While there are many factors at play here, this does indicate a growing reliance on stablecoins as the industry’s settlement token. 

Fiat-backed stablecoins now have a much higher transaction volume than either BTC or ETH
Source: Coin Metrics

2)    The value that has flowed into decentralized finance (DeFi) applications has astounded even those of us who work in the industry. I don’t talk much about DeFi in this newsletter, since it has so far been very niche and, well, untested. But it’s starting to affect the markets I do focus on. While volumes have exploded (not literally, obviously, and it says a lot about the mood this year that I even have to clarify that), they are still small in terms of comparative market size. What is telling is the interest that centralized platforms such as crypto exchanges are starting to pay to this area. And not just centralized platforms: At an event earlier this week, Brian Brooks, acting head of the U.S. Office of the Comptroller of the Currency (OCC), said that he believes that traditional financial institutions will have embraced DeFi technology and principles within 10 years. I agree, and given the increasingly frequent signs this process is starting, you’ll probably start to hear more about DeFi in these columns.

The amount deposited in DeFi contracts has multiplied > 5x over the past few months
Source: DefiPulse

Perhaps you have already been following the DeFi space, because you are interested in unusual yield opportunities, or because you enjoy the wacky packaging some of these applications come in (many of which are named after food, don’t ask). If not, and you’d like to start to get ahead of the curve, here’s a good introduction. 

3)    Bitcoin’s dominance of the crypto asset market has continued its decline. Five years ago, bitcoin was virtually all of the crypto asset market. Then came the 2017 ICO boom with a flood of new tokens surging in value, and bitcoin’s dominance fell to a low of 36%. As the bubble burst, most of the new tokens fell in value, eventually restoring bitcoin’s dominance to around 70%. 

The dominance (as measured by TradingView’s BTC Dominance Index) has been steadily falling since around May of this year, largely due to the surge in the market cap of stablecoins and to the growth in DeFi tokens, not all of which were spurious memes. 

BTC market cap dominance is trending down, even though the price is trending up
Source: TradingView

Note that the index is trending downwards in spite of the upward trend in prices, which speaks to the level of growth elsewhere in crypto markets.

In other words, this is less to do with weakness in bitcoin and more to do with the expansion of the industry overall. That, in turn, is positive for bitcoin which, for many, will be the gateway crypto asset, the one that investors try out first. 
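For readers who want the measurement itself spelled out, here is a minimal sketch of how a dominance figure like TradingView’s index is computed: bitcoin’s market cap as a share of the total crypto asset market cap. The figures below are invented for illustration, not current data.

```python
# Minimal sketch of a market cap dominance calculation.
# The inputs are invented for illustration; they are not current market data.

def dominance(btc_market_cap: float, total_crypto_market_cap: float) -> float:
    """Return BTC's share of the total crypto market cap, as a percentage."""
    return 100.0 * btc_market_cap / total_crypto_market_cap

# Dominance can fall even while BTC's own market cap rises,
# if the rest of the market grows faster:
print(round(dominance(200e9, 280e9), 1))  # 71.4
print(round(dominance(210e9, 330e9), 1))  # 63.6
```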

Anyone know what’s going on yet?

Bitcoin yet again exhibited its split personality this week. I had a chart all ready to share with you that showed that its correlation to gold had been heading up for most of the quarter – and then Trump’s positive COVID test results sent gold higher while bitcoin headed lower. True, bitcoin had already had a shock earlier that day from the BitMEX indictment, and the slump could well have been continuing jitters from that. But it’s not unreasonable to expect market-shaking news like the President of the United States possibly being seriously ill (as far as we know, he only has light symptoms so far) to spark a rush to safety. It seems that the market is not yet convinced that bitcoin is a “safe haven” like its analog comparison.
   
Trump’s COVID test result seemed to have more of an impact on markets than Tuesday night’s debate, which says a lot about the debate’s inefficacy in moving the needle on divided allegiances. Zooming out, this is bewildering considering what its viciousness said about American democracy, and the importance of the election outcome. Unless, of course, the outcome of the election isn’t important at all? Like I said, bewildering. 

(Performance chart)

Bitcoin had a weak September (-8.4%) and has not exactly started off on a good foot in October. It did, however, achieve a positive record: it has now closed above the $10,000 mark for 66 consecutive days and counting, its longest such streak. This is significant inasmuch as such a long stretch above that psychological barrier hints that $10,000 has become the new price floor. Of course, floors have been broken before … 

CHAIN LINKS

Cryptocurrency exchange Bitfinex has started trading perpetual contracts that track two European equity market indices and settle in the stablecoin tether. TAKEAWAY: You’ve often heard me talk about how I believe crypto assets will have a profound impact on traditional capital markets. Here is an example of how it will happen: We have a crypto exchange offering a derivative developed for the crypto markets to bet on movements in traditional indices. And to top it all off, it settles, not in fiat but in a fiat-backed stablecoin. Another notable aspect is the leverage – 100x is insanely risky, and is a feature largely limited to crypto exchanges. Few traders avail themselves of that much risk, however, as experienced market professionals know that it’s not wise. 

The spread between the six-month implied volatility (IV) for ether (ETH) and bitcoin (BTC), a measure of expected relative volatility between the two, fell to a 2.5-month low of 4% over the weekend, according to data source Skew. TAKEAWAY: This could mean that traders expect ETH to act more like BTC going forward. The ETH futures market is still immature, however, and the signals are not yet that reliable. 

BTC and ETH implied volatility
Source: skew.com
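For clarity on what that 4% figure represents, here is a small sketch of the spread calculation: ether’s implied volatility minus bitcoin’s for the same tenor. The volatility values below are assumed for illustration, not Skew’s data.

```python
# Minimal sketch of the ETH-BTC implied volatility spread.
# The volatility inputs are assumed for illustration, not Skew's data.

def iv_spread(eth_iv_pct: float, btc_iv_pct: float) -> float:
    """Spread in percentage points between ETH and BTC implied volatility."""
    return eth_iv_pct - btc_iv_pct

# A narrowing spread suggests options markets expect ETH to trade
# less idiosyncratically relative to BTC:
print(iv_spread(eth_iv_pct=72.0, btc_iv_pct=68.0))  # 4.0 percentage points
```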

Arjun Balaji of Paradigm wrote an excellent overview of crypto asset market progress over the past two years, with a look at what needs to happen next: principally, major improvements in capital efficiency (which is gearing up with the emergence of institutional-grade prime brokerage and crypto-native repo, among other features), and the convergence of decentralized and centralized financial functions. TAKEAWAY: I totally agree, and hats off to Arjun for putting it all so succinctly. I have two needed developments to add: greater regulatory clarity on what is and isn’t a security, to encourage innovation in investment and saving opportunities for a broader range of people; and new rules to smooth the way for the new types of securities to list and trade in a compliant manner (the INX token is a start, but it’s just scratching the surface).

On a similar theme, Jill Carlson wrote an op-ed for CoinDesk that talks about how recent focus has been on innovation in crypto asset infrastructure, and how the pendulum may soon swing back to emphasize innovation in assets. TAKEAWAY: Robust infrastructure is essential for a thriving market that can attract significant levels of investor interest. But investors don’t enter our industry for the infrastructure, they do so for the assets. The pendulum that Jill refers to seems to have already begun its swing – we can see this not so much in the meme-infused DeFi assets, but more in the SEC-registered INX token that gives holders trading advantages and a share in net cash flow, and in SEC Chairman Clayton confirming that the U.S. regulator would consider authorizing a tokenized ETF (one presumably not based on crypto assets, for now). 

An amended filing with the Securities and Exchange Commission (SEC) last week showed that Bitwise’s Bitcoin Fund has raised just under $8.9 million, more than double the amount it had raised last year. TAKEAWAY: According to Bitwise’s head of research, Matthew Hougan, this is largely because of growing concern over runaway inflation. Given the new Federal Reserve policy of allowing inflation to overshoot targets (the ECB this week hinted it will follow suit), these concerns are likely to intensify.

The Atari Group, the company behind such classic video games as Pac-Man and Pong, will begin publicly selling its Atari Token (ATRI) cryptocurrency in early November. TAKEAWAY: This ERC-20 token will be used in crypto casinos, blockchain-based games and the company’s video game distribution platform. I’m not clear on the economics behind the token, but the combination of Atari, games and tokens does sound a bit like a door to a mainstream use case. But I’m not a gamer, so I might be wrong. (Speaking of which, anyone see the Netflix documentary series “High Score”? Excellent.)

Nasdaq-listed mining equipment manufacturer Ebang reported a revenue slump in 2020 H1 of over 50% from the same period in 2019. According to the company, this was largely due to pandemic-related supply chain disruptions. TAKEAWAY: Supply disruptions are no doubt part of it, but as my colleague Matt Yamamoto pointed out in this report, Ebang’s product mix was inferior to that of its competitors anyway. You can’t blame COVID-19 for everything.

CoinDesk Research has a new report out, authored by my colleague Matt Yamamoto, on Silvergate Bank, which looks at its financials and its business strategy in the light of growing competition. 

Podcast episodes worth listening to:

And a reminder carried over from last week that CoinDesk has not one but three new podcast series that are definitely worth checking out and subscribing to:

  • Money Reimagined, with Michael Casey and Sheila Warren of the WEF – for the first episode, they talk to multimedia artist Nicky Enright and University of Virginia Media Studies Professor Lana Swartz
  • Borderless, with Nik De, Anna Baydakova and Danny Nelson, which covers trends impacting crypto adoption around the world
  • Opinionated, with Ben Schiller – for the first episode, he interviews Nic Carter, CoinDesk columnist and partner of Castle Island Ventures

How Among Us Came Back From the Brink of Obscurity

The ship had already sailed on Among Us.

The cloak-and-dagger party game, fashioned in the tradition of classic social deduction games like Mafia and Werewolf, was initially released in the summer of 2018 exclusively for Android and iPhone, to a warm but mostly overlooked reception. The gameplay was always solid but, for whatever reason – be it the mobile-only release, the local-only multiplayer, or the limited marketing resources of developer Innersloth’s small team – Among Us didn’t catch on with the general gaming public. By the time it hit Steam a year later, with online multiplayer patched in, nobody expected it to become a runaway success.

Fast forward to 2020, however, and Among Us has quickly established itself as one of the preeminent party games on Twitch and YouTube. After an entirely inauspicious launch, Among Us has now racked up over 75,000 reviews on Steam and is averaging hundreds of thousands of concurrent players around the clock. This isn’t a new story; indie games go viral all the time – just look at Surgeon Simulator, Escape From Tarkov, or Getting Over It. But it is rare for a title to sit dormant for multiple years before inexplicably catching fire. Before Innersloth knew it, they had the hottest game on the planet.

“The first thing we really noticed was a Twitch stream from Sodapoppin,” says Forest Willard, programmer and co-founder of Innersloth. “We had various moments where we were like, ‘We’re doing well,’ but it was that point where we saw that a lot of people and other streamers started to climb onboard.”

There are three people credited for the development of Among Us. Willard was the primary coder, Marcus Bromander served as the animator and designer, and Amy Liu handled the lion’s share of the art. For the two years since release, Willard says he was pretty much the only person working on day-to-day operations. That wasn’t an especially difficult job; after all, it’s not like Among Us had a huge player base demanding his oversight. So it was a shock to boot up the game’s backend after Sodapoppin’s endorsement and find 10,000-plus players attempting to barge into a server at the same time. Willard was the only person on hand to make sure the game’s infrastructure survived.

“[Among Us] couldn’t handle that server load. You have to re-do code and re-do systems so you can get more servers onboard. It was really overwhelming,” he remembers. “It was just, 12-hour days, nose-to-the-grindstone until you get it done.”

When I first reported this story, Innersloth said their long-term goal is to build a standalone Among Us sequel that could host the original version of the game within it. That was a little bit surprising. These days, when a studio has a hit, they tend to iterate on the original product with a constant stream of patch notes filled with new maps, characters, and often-experimental game modes. In that sense, Willard and the team were pushing against the grain. Among Us is built on old tech, he says; it began as a mobile party game with no online multiplayer and mutated into a Steam monster. At this point, iterating on the game’s core infrastructure is an onerous task. “Making any changes is really scary. It’s hard to test all the pieces that were accidentally connected together,” says Willard. “We feel like we have enough things we want to add to Among Us 2 that it deserves to be its own thing.”

However, in the weeks since I spoke to Willard, Innersloth announced they would be scrapping the plans for a sequel entirely. Instead, all of the plans they’ve sketched out for Among Us 2 would be integrated into the original code, no matter how much of a challenge that might be. Willard reminds me that all of that work is very early in its development process, and some of its new flourishes have barely begun to germinate in Innersloth’s braintrust. But one truth is undeniable: Among Us has a bright future ahead of it. All that’s left to see is how the team takes advantage of the moment.

Luke Winkie is a writer and former pizza maker in Brooklyn. He’s written for Vox, Vice, The New York Times, Gizmodo, PC Gamer, The Atlantic, Rolling Stone, and wherever else good content can be found.

Confessions of a Sharding Skeptic

With final preparations for the launch of Ethereum 2.0 soon to be underway, CoinDesk’s Christine Kim spoke to Cayman Nava, technical lead at ChainSafe Systems, and Alexey Akhunov, an independent researcher and software developer, about the kinks in Ethereum’s evolution that still need to be worked out.

For free, early access to new episodes of this and other CoinDesk podcasts, subscribe to CoinDesk Reports with Apple Podcasts, Spotify, Pocketcasts, Google Podcasts, Castbox, Stitcher, RadioPublic, iHeartRadio or RSS.

This episode is sponsored by Crypto.com, Nexo.io and Elliptic.co.

The Ethereum blockchain processes about three to four times as many transactions as Bitcoin. It’s still not enough, however, to meet rising user demand for the cryptocurrency and prevent network congestion.  

See also: DeFi Frenzy Drives Ethereum Transaction Fees to All-Time Highs

One of the most highly anticipated fixes to Ethereum’s transaction bottleneck and its lack of scalability is an ambitious software upgrade called Ethereum 2.0. According to Vitalik Buterin, the creator of Ethereum, Ethereum 2.0 will boost network speeds from around 15 transactions per second (TPS) to 100,000 TPS.  

How? The solution is sharding. Cayman Nava, technical lead at ChainSafe Systems, explains sharding as “a natural way to break things up.” 

“If you’re wanting to process a lot of data but you don’t want any one party to be overloaded with that data, you can naturally think of breaking up your problem into smaller pieces,” said Nava. These “smaller pieces” Nava is referring to are called shards. In Ethereum 2.0, 64 shards will be created to break up the transaction load of Ethereum. 
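To make the “smaller pieces” idea concrete, here is a toy Python sketch that assigns accounts to one of 64 shards by hashing an identifier. This is not Ethereum 2.0’s actual shard-assignment logic, which is considerably more involved; it only illustrates the general principle Nava describes.

```python
# Toy illustration of sharding: route each account to one of 64 shards by hash.
# This is not Ethereum 2.0's actual assignment logic, just the general idea of
# splitting the load so that no single party processes all of the data.

import hashlib

SHARD_COUNT = 64

def shard_for(account: str) -> int:
    """Map an account identifier to a shard index in [0, 63]."""
    digest = hashlib.sha256(account.encode()).digest()
    return int.from_bytes(digest[:8], "big") % SHARD_COUNT

# Each shard only processes the transactions routed to it:
for account in ["0xalice", "0xbob", "0xcarol"]:
    print(account, "-> shard", shard_for(account))
```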

See also: Ethereum 2.0: How It Works and Why It Matters

While sharding sounds effective in theory, there are other Ethereum developers who are skeptical about the benefits of this technique in practice. 

“If I were to design scaling [for Ethereum], first I would squeeze as much as possible out of Ethereum 1, which I think hasn’t been done yet, and then after that I would actually introduce sharding logically in order to see whether users would actually be able to use [sharding] effectively,” said Alexey Akhunov, an independent researcher and software developer who has been contributing code to the network’s development since 2016.

Sharding logically refers to breaking up data within the same blockchain as opposed to sharding physically, which necessitates the creation of multiple mini-blockchains. As mentioned, Ethereum 2.0 will spawn a physically sharded system of 64 linked databases. Optimizing the communication between shards in this environment, Akhunov goes on to explain, may pose an even greater challenge to network scalability than a transaction bottleneck.  

Nava agrees there are kinks and holes in the design of Ethereum 2.0 and its sharded system that need to be worked out. But in Nava’s view, these problems that call for further detailing and research can be delayed in the short term while developers work toward an upgrade launch. 

“I think we can delay these harder problems like how sharding should work or what it should look like. That can be pushed off a little bit so we can think about it and get it right. In the near term, we can get a lot of the benefits from the [Ethereum 2.0] work that we’ve been doing,” said Nava. 

To download or stream the full podcast episode with Akhunov and Nava, you can go to Apple Podcasts, Spotify, Pocketcasts, Google Podcasts, Castbox, Stitcher, RadioPublic, iHeartRadio or RSS. For early access to future CoinDesk Research podcast episodes, be sure to click “subscribe” on these channels. 

For more information about Ethereum 2.0, you can download the free research report featuring additional developer commentary about the upgrade on the CoinDesk Research Hub. 

Indiana man who lost eye to tear gas canister sues police

A man who lost an eye after being struck by a tear gas canister police in Indiana fired during a May protest over George Floyd’s death is suing the city and a police officer

FORT WAYNE, Ind. — A man who lost an eye after being struck by a tear gas canister police in Indiana fired during a May protest over George Floyd’s death is suing the city and a police officer.

Balin Brake, 21, of Fort Wayne, Indiana, contends in a federal lawsuit filed Friday in U.S. District Court that the injury “has permanently changed his life” and led to mounting medical bills.

He is suing the city of Fort Wayne and the unidentified officer who fired the canister, alleging his constitutional rights were violated and that police used excessive force during protests in the city’s downtown, The Journal Gazette reported.

City spokesman John Perlich declined to comment on the pending litigation. It stems from a May 30 protest where hundreds of people gathered near the Allen County Courthouse to protest George Floyd’s death after a Minneapolis police officer pressed his knee on the handcuffed Black man’s neck for several minutes.

According to the complaint filed by the American Civil Liberties Union of Indiana and a Chicago law firm, Brake was peacefully protesting when a Fort Wayne Police Department officer fired a tear gas canister that struck him in his right eye, rupturing it.

Doctors at a local hospital were forced to remove Brake’s eye, according to the suit, which seeks damages for Brake’s injuries, punitive damages against the officer and a judge’s order declaring that the city violated Brake’s First and Fourth Amendment rights.

“The right to protest is fundamental to our democracy and no one should face tear gassing and injury while exercising that right,” said Jane Henegar, executive director at the ACLU of Indiana.

Fort Wayne police said in a May 31 statement that a protester had stayed in the area following police commands to clear out. When tear gas was used, that protester bent over to pick up a canister to throw back at police, according to the statement from police spokeswoman Sofia Rosales-Scatena.

“When he bent over, another canister was deployed in the area and that canister skipped and hit the protester in the eye,” Rosales-Scatena wrote. “There was no deliberate deployment of gas to any person’s head.”

Brake has denied reaching for a canister to throw and said he didn’t hear warnings to leave.

Genshin Impact: How to Earn Wishes to Get Weapons and Characters

Genshin Impact is a gacha game slash JRPG, and while our walkthrough will help you with the JRPG part and the gameplay elements, the gacha part relies on Wishes. Through pulling cards, you can unlock new characters to play or new weapons to use, so they’re useful for more than just aesthetics. Here, we’ll go through how the currency works, and how to make the most of what the game gives you.

What Is The Currency In Genshin Impact?

While the menu is called ‘Wishes’, you actually buy the packs with Acquaint Fate and Intertwined Fate, which are similar but distinct currencies. Intertwined Fate is rarer, and so comes with better rewards when you save enough of them up, but it is harder to earn in the game and has a higher price if you’re planning on just buying it outright.

How Do You Get Characters Or Weapons In Genshin Impact?

Because of the gacha elements, the rewards are randomized. However, at the bottom of the Wishes screen, you’ll be able to click on a details box which will bring up a full list of the rewards and odds for each character or weapon. To get to this screen in the first place, you’ll need to go to the Pause Menu and then Wishes on PS4, or click the star in the top corner on mobile and PC.

This brings up the Wishes screen, where you’ll be able to slide through the different banners and decide which ones you want to try for. In the bottom right corner, it will tell you which currency these use. It will also give you the option of buying just one, or buying 10. As we explain below, it’s much better to save up for 10 than to buy them one at a time. Either way though, from here the game will give you a randomized reward of either a character or a weapon.

Should I Save Up My Fates In Genshin Impact?

Absolutely. While you might get lucky buying rewards one by one, buying them in bulk guarantees at least one high level pull, and may have a discount; the specifics of this depend on which banner you’re pulling, but the basic gist is consistent. It’s definitely tempting to try and game the system by buying one by one and hoping to only have to spend three or four to get a high level card, but the odds just aren’t in your favor. Waiting for the bulk buy consumes both time and resources, but it’s more likely to be a successful strategy.

How Do You Get Acquaint Fates and Intertwined Fates In Genshin Impact?

The easy answer, of course, is with money. As with all gacha games, you can just keep buying Fates and spending them until you’ve got the cards you want. You can also gain them in the game by leveling up or as rewards from the Adventurer’s Guild, or by converting other in-game rewards into Fates.

The game will give you 10 Acquaint Fates for reaching Level 5 and 10 more for reaching Level 10, meaning you can guarantee two high-level pulls by spending each set all at once. It takes less than half an hour to reach Level 5, so if you want to know how to reroll in Genshin Impact, we’ve got you covered. 

Here’s Why Netflix Cancels Shows So Quickly Now

When Netflix first started airing original series, the site only had a handful to its name and there was actually a mild running joke about how the streaming giant didn’t cancel things. In fact, aside from Eli Roth’s Hemlock Grove, which still lasted three seasons, the site’s acclaimed shows House of Cards and Orange is the New Black lasted six and seven seasons, respectively.

Things changed for the company a few years later when the heavily hyped and massively produced Marco Polo got an unceremonious axe after two seasons. Then the hammer fell on Bloodline, The Get Down, The OA, and many more – to the point now where, unless you’re Stranger Things, which is Netflix’s biggest breakout hit of all time, you probably won’t go more than two or three seasons. Four, if you’re very lucky. Sure, the numbers are great for recent entries The Witcher and Umbrella Academy (which still hasn’t gotten a Season 3 renewal, by the way), but once those numbers start to decline or flatten, even in the slightest, it could be curtains.

In 2020, Netflix canceled Altered Carbon, I Am Not Ok With This, The Dark Crystal: Age of Resistance, V Wars, Messiah, and many more while also announcing final seasons for Ozark, The Chilling Adventures of Sabrina (a huge initial hit), Dead to Me, and The Crown.

Obviously, there are a few logical reasons why Netflix now seems to cut shows’ lives extremely short. One is that it has far more original series than any other studio, so it stands to reason it would have more cancellations. But Netflix, which barely promotes most of its shows as dozens land on the site each month with little to no heralding, also doesn’t seem to be at all invested in giving shows a chance to grow. A recent Wired article, however, digs a bit deeper into why the biggest streaming service in the game is now in the business of pulling the rug out from under most of its shows after only a couple of seasons.

Plainly put, the first reason a Netflix show gets canceled is a traditional one. It’s “based on a viewership versus cost of renewal review process, which determines whether the cost of producing another season of a show is proportionate to the number of viewers that the show receives.” This is like any other streaming service or network, really. But this is also where we, no matter how much we love a show or recognize the fervor of its fandom, have to take Netflix’s word for it, because the company doesn’t release ratings figures.

The second way Netflix decides if a show will continue is based on some viewership data points. Specifically, it “looks at two data points within the first seven days and first 28 days of a show being available on the service. The first is ‘Starters’, or households who watch just one episode of a series. The second data point is ‘Completers’, or subscribers who finish an entire season.”

So the bulk of Netflix’s decision making is based on data from the first month of a season’s life. It’s crucial. The final metric is Watchers, which “is the total number of subscribers who watch a show.” Netflix, which employs a “cost-plus model, which means that it pays a show’s entire production costs, plus a 30 per cent premium on top,” is even more wary when it comes to possibly losing money. That’s despite its reputation, of course, for throwing gobs of cash at the likes of Ryan Murphy ($300 million for five years), Adam Sandler (most recently $275 million for four more movies), and Chris Rock ($40 million for two specials).
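To see how those reported data points and the cost-plus economics might fit together, here is a hypothetical sketch of the kind of figures such a review could weigh. The numbers and the summary function are invented for illustration; Netflix does not publish its actual formula.

```python
# Hypothetical sketch of a viewership-versus-cost review using the reported
# data points (Starters, Completers, Watchers) and the cost-plus model.
# All numbers and the summary itself are invented for illustration;
# Netflix does not publish its actual formula.

def renewal_snapshot(starters: int, completers: int, watchers: int,
                     production_cost: float) -> dict:
    """Summarize the kind of figures a renewal review might weigh."""
    completion_rate = completers / starters if starters else 0.0
    total_outlay = production_cost * 1.30   # cost-plus: production cost + 30% premium
    return {
        "completion_rate": round(completion_rate, 2),
        "cost_per_watcher": round(total_outlay / watchers, 2) if watchers else None,
    }

# Illustrative only: 65% of starters finish the season, at $13 of outlay per watcher.
print(renewal_snapshot(starters=4_000_000, completers=2_600_000,
                       watchers=5_000_000, production_cost=50_000_000))
```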

But, as Tom Harrington, an analyst at Enders Analysis, states, shows on Netflix “are more expensive after season two and even more expensive after season three, with the premiums going up each season.”

“They have to give [a show] more money per series, and if they decide to recommission it, it becomes more expensive for them to make,” he says. “Because of that, so many more shows are cancelled after two series [seasons] because it costs them more.”

Now here’s one more thing to consider, and it’s totally tethered to the subscription streaming model. As Deadline explains, “if a show hasn’t grown significantly in popularity over seasons two or three, then Netflix thinks that it’s unlikely to gain any new viewers.” So when a show stops growing, in viewers and/or in pulling in new subscribers (and that doesn’t necessarily mean dropping; it can just mean plateauing), Netflix doesn’t see a reason to keep it. So a show could be acceptably popular and hold a large fanbase, but if it has stopped its initial swell and doesn’t bring in new eyes, it’ll be gone.

Matt Fowler is a writer for IGN and a member of the Television Critics Association. Follow him on Twitter at @TheMattFowler and Facebook at Facebook.com/MattBFowler.

The Web Wasn’t Built For Privacy – But It Could Be

Privacy means different things to different people. To some, it’s secrecy. To others, it means anonymity. To some others, it’s associated with criminality. 

But privacy is really about power. 

When the web was invented, its openness was key. “The dream behind the Web is of a common information space in which we communicate by sharing information,” Tim Berners-Lee, inventor of the World Wide Web, wrote in 1997. “Its universality is essential: the fact that a hypertext link can point to anything, be it personal, local or global, be it draft or highly polished.”

That openness encouraged people around the world to move their lives, in part, online. And with it, their data, identity, financial information and other key components of their lives. The global pandemic has only increased that data inertia. Now, that information has leaked from our grasp, and is under the purview of nation-states, bad actors, advertisers, social media giants and others.

This essay is part of CoinDesk’s “Internet 2030” series.

The old saying goes “no one knows you’re a dog on the internet.” But at this point, centralized authorities not only know you’re a dog, but also what breed you are, what your favorite kibble is, and whether you’ve been microchipped. More often than not, it’s because you told them. 

See also: Startup Aleo Wants to Help You Use the Internet Without Sacrificing Data Privacy

Our ideas of privacy used to begin with the idea of our physical body, but such a boundary no longer makes sense. The internet is everywhere and the lines between our bodies and the internet are getting blurrier and blurrier, notes Amy Zalman, part-time professor at Georgetown University and CEO of the foresight consultancy Prescient. The boundary is blurred by how we consent to data being shared and how we give up data to connected devices like video doorbells or smart locks. 

“Our devices are not just connected to the internet, but each other, and the institutions we want privacy from,” said Zalman. “We want privacy from those institutions penetrating us and slicing and dicing us up and giving that information out in various ways.”

See also: Data Ownership Should Be About Software, Not Lawsuits

And what isn’t being shared is being leaked. Zooko Wilcox, cypherpunk and CEO of the Electric Coin Company, likens the internet to a bucket full of holes, spilling water/data all over the floor. Using discrete privacy tools like VPNs is just plugging one of those many leaks. 

“If you have pervasive leakages, then whoever’s the most powerful gains from that,” said Wilcox. “If we have an internet in 10 years where almost everyone uses Facebook for almost everything then that’s a privacy problem that immediately leads to a power problem.”

Wilcox said the people who argue you don’t need privacy if you have nothing to hide are comfortable within the status quo. They aren’t being persecuted for dissent. They aren’t attending social justice protests in the U.S., being digitally tracked and having dossiers assembled on them, and they aren’t China’s minority Uighur Muslim population, who are digitally monitored and locked up in camps. 

“Privacy is just a means to an end,” said Wilcox. And that end is some level of reclaiming power from those who disproportionately possess it. 

How do we reclaim privacy?

But the question of what we mean by privacy rises again when it comes time to ensure it. Do we do so through policy and law? Through tech? Can the internet of today, the way it’s constructed, even preserve our privacy?

Jon Callas, a Senior Technology Associate at the ACLU, said that, as an engineer, the first thing he thinks of regarding a privacy-focused web is the requirements statement: the discrete goals and workflow of any project. Such specifications might work when applied to a single project, but are ill-suited to tackling something as broad and multifaceted as a private web. 

“Give me a use case and a scenario. That would be a touchstone that I could use to put things together,” said Callas. 

See also: ‘We Blew It.’ Douglas Rushkoff’s Take on the Future of the Web

Recent polling shows that 2020 could well be an inflection point for privacy, a time in which the U.S. population might be open to scrutinizing what we mean by privacy, and willing to value it in ways we haven’t previously. 

Eighty-one percent of Americans say they have little control over the data collected by companies and the government, Pew Research finds. A majority thinks the risks of companies and the government collecting their data outweigh the benefits. 

Between China’s Great Firewall, the U.S. considering anti-encryption bills and the general fracturing of the internet under the guise of cyber sovereignty, a private web is more important now than ever. 

A day in 2030

You wake up in 2030. Many things look the same. You still have your computer. Your phone. Slack probably still exists. 

When you log on, navigating the web is up to you, but it’s bolstered by your own AI. The AI starts up as soon as you log on, and while you’re working, so is it. It’s trawling the web, parrying spam and searching information indices without feeding you the top sites Google would send you to keep you in its walled garden as long as possible. Unlike AIs that are working on behalf of a company, this one has a single fiduciary responsibility – you. 

Wilcox said we already rely on algorithms and AI for many parts of our lives. Facebook’s newsfeed decides what friends we see most often. Google decides what information you get. And while sure, there is an element of convenience to that, it’s not serving you. Such technology is ultimately designed to eternally serve the company. 

“Maybe you have the same thing with AI that helps you manage your text messages for your friends, or maybe you even have one that’s loyal to your family,” said Wilcox. 

See also: What Happens if Big Tech Only Gets Bigger?

Callas echoed this idea, imagining a privacy-oriented web where an AI is monitoring your security, looking for data leaks, or filtering spam. Gmail already does something along these lines, flagging spam and sorting emails into categories such as Primary or Promotions.

But imagine that AI writ-large existing alongside you on the internet. In ten years, the frequency of attacks and attempted data breaches is unlikely to decline. Such attacks happen with speed and execution that make it difficult for a person to counter in real-time.   

(Lianhao Qu/Unsplash)

Alongside this, Callas said we also might need to rethink the open access nature of the internet. We have many open access systems. For example, you can call any phone number you want. You can text any number you want. But computers enabled us to send hundreds of texts in seconds to people who don’t want them. Giving people more agency and consent, through an AI like this, might mean you have to close access to some of these systems, or at least make them dependent on permissions. 

In such a scenario, someone may try to call you, only to be paused by your AI. Callas lays out a scenario in which such an AI might see that this person has written you an email before, asking you to speak. It’d then go over to LinkedIn and see there is one person you have in common and might suggest you take this call.

See also: Crypto Co-ops and Game Theory: Why the Internet Must Learn to Collaborate to Survive 

“There are abusive things in the internet structure right now,” said Callas. “So we need to have explicit relationships when it comes to information sharing, because some of it we may well be okay with.”

The challenge there is making those relationships explicit, when so much of the data we share is determined by opaque terms of service, third parties, and other data sharing agreements. 

Callas compares our current data rights to a time before food labeling, when companies didn’t have to disclose their ingredients. He can see new rules like that coming down the pipe. 

Apple, which has sought to distinguish itself among big tech companies for its privacy stance, is going to offer a nutrition label of sorts for data in its new operating system, disclosing at a glance what an app collects. 

There are also tools like VPNs and encryption. But often, to get back to Wilcox’s bucket metaphor, you’re just plugging holes that are a fundamental part of the underlying structure of the internet, at least as it stands. 

“Privacy is the elimination of all the holes that are exposing you to someone who would exploit or take advantage of you,” said Wilcox. “That’s not a feature. That’s like an emergent property of the whole system, the whole internet.”

For Michelle Dennedy, a privacy lawyer who has worked at Cisco, Intel, and elsewhere, it comes down to functionalizing consent. Breaking processes down to forms of authorization, and having a way to give that authorization on multiple levels, will be key.

“How long can you look at something? How long is it authorized? These things have to be explicit, and based on informed consent. When I go to the doctor and take my clothes off, I don’t expect there to be cameras in there broadcasting that business for the world to see. But that’s what we have online.”

She sees a future where we use universal modeling language to give software explicit guidelines as to how to manage privacy. What data it’s getting, why it’s getting it, where it’s being stored, who it’s being shared with: once those decisions are made, they can be enforced not just by laws or policies, but by the tech itself. 
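As a thought experiment, here is a hypothetical sketch of what functionalized consent could look like in code: each data use becomes an explicit grant that software can check before acting. The fields and the example policy are invented for illustration and do not correspond to any existing standard or product.

```python
# Hypothetical sketch of consent as an explicit, machine-checkable grant.
# The fields and policy are invented for illustration; this is not an
# existing standard or product.

from dataclasses import dataclass
from datetime import datetime

@dataclass
class Consent:
    data_type: str        # what data is collected
    purpose: str          # why it is collected
    stored_at: str        # where it is stored
    shared_with: tuple    # who it may be shared with
    expires: datetime     # how long the authorization lasts

def is_authorized(grant: Consent, purpose: str, recipient: str, when: datetime) -> bool:
    """Software can enforce the grant directly: right purpose, right party, not expired."""
    return (purpose == grant.purpose
            and recipient in grant.shared_with
            and when < grant.expires)

grant = Consent(data_type="location", purpose="navigation", stored_at="on_device",
                shared_with=("maps_provider",), expires=datetime(2030, 1, 1))
print(is_authorized(grant, "navigation", "maps_provider", datetime(2029, 6, 1)))   # True
print(is_authorized(grant, "ad_targeting", "data_broker", datetime(2029, 6, 1)))   # False
```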

Intractable problems

Today’s privacy controls are features bolted onto the frame of the internet itself.

“Historically, the internet did not include privacy protections, so people tried to bolt privacy onto the internet,” says Harry Halpin, a radical open-internet advocate and CEO of Nym, a privacy-tech startup. “The way they do that is create a virtual network on top of internet protocols, called an overlay network.”

From there, said Halpin, it’s a matter of disrupting the packets of data that flow through the web, carrying everything from search queries to instant messages. Those packets generate metadata, which is essentially information about the data that is being sent. 

This, for example, is how the NSA tracked and mapped terrorist suspects’ calls, by seeing what number they called, how long they called for, and how often they called. The data about the data can tell you a lot about the data itself.

See also: The Currency Cold War: Four Scenarios

Nym is mixing up that metadata through a structure known as a mixnet, which mixes packets of data together, repackages them, and thereby scrambles the metadata into something unintelligible. 
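To illustrate the core idea, here is a toy sketch of the mixing step: collect a batch of packets, shuffle them and re-emit them, so that timing and ordering no longer link senders to receivers. This is not Nym’s actual protocol, which also layers encryption and cover traffic; it shows only the batching-and-shuffling principle.

```python
# Toy sketch of a mix node's core step: batch, shuffle, re-emit.
# Not Nym's actual protocol (which adds layered encryption and cover traffic);
# this only illustrates how mixing breaks the link between input and output order.

import random

def mix_batch(packets: list) -> list:
    """Shuffle a batch so the output order reveals nothing about the input order."""
    mixed = packets[:]        # copy so the caller's list is untouched
    random.shuffle(mixed)     # a real mix node would use secure randomness
    return mixed

incoming = [b"packet-from-alice", b"packet-from-bob", b"packet-from-carol"]
print(mix_batch(incoming))    # same packets, unlinkable ordering
```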

Halpin recognizes, though, that Nym is operating fundamentally on the overlay network, not the very protocol of the internet itself. To really get at it, you’d have to go one layer down to the server-level control by internet service providers. 

“We don’t have access to rebuild the fundamental protocols, and even if you rebuild the protocols, you’d then need to build some of the protections into the routing network and the fundamental hardware,” said Halpin. “Which I think is possible in the future. In that way, you could imagine a completely private internet, with data that’s resistant to mass surveillance and also using very few identifiers.”

From the person-oriented AI, to further legal consent enshrined in tech, to server-level privacy development, there are a number of tools that can be used to plug holes in privacy. But to make a truly privacy-focused web, you’d need to get rid of the honeycombed bucket we have and develop one that doesn’t leak at all. 

The slow march of development

A privacy-oriented web would be a challenge even if there weren’t large companies and governments that were interested in preventing it. But some experts also say the internet just doesn’t move fast in its development, and that when it comes to developing a new browser, or rethinking email, those projects in and of themselves take a long time.

Callas began our conversation by discussing how we could improve email, which is one discrete part of the internet. But he said such a project would take ten years. 

And while some developers and companies live by sayings such as “Move fast and break things,” the stakes are much higher when you’re trying to develop something for privacy. Because if it’s broken, it negates its entire reason for existing. 

See also: Say Hello to the Singularity

Dan Guido, CEO of cybersecurity firm Trail of Bits, said that, while we are likely to see mild improvements in tools like encryption, other tools like a more privacy-protecting browser would be a huge lift. He is surprised that projects like Mozilla Firefox still exist, given most browsers are developed by huge companies that have an incentive to direct users to their own products. Weeks after we spoke, Mozilla laid off 250 people. 

“I think that the internet in 10 years is going to look a lot like the internet now with a few minor modifications,” said Guido. “But there will be this divergence of haves and have nots in terms of security and privacy that’s really clear and easy to see, and that grows wider every day.”

In his work as a security professional, he sees that gap most clearly in consumer-facing products versus enterprise ones. Consumer-facing browsers like Chrome or Safari are doing a better and faster job of updating their privacy protections than, for example, enterprise networks that prize ease of use, stability and interoperability. Just think about how hard it is to get everyone in a workplace to use two-factor authentication. 

See also: A New Era of Media Begins With Tokenization

He said that some of the major privacy protections might, a little ironically, come through big players like Apple and Google, which are working on these issues, and already have their devices in the hands of millions. 

Callas also expressed openness to vestiges of today’s internet living on in 2030. He doesn’t mind ad targeting, for example, in part because he recognizes it supports so much free stuff on the internet. But he wishes it was more accurate.

This is where the idea of reconceptualizing how we think of privacy is crucial. Because, again, the concept means different things for different people. Callas might be okay with good ads. I might not. I may be okay with a messaging service devoid of frills and run on an independent server. Some people will kill for their emojis. 

A privacy-focused web in 2030 will likely not have everything one person wants. But whether it’s a person-focused AI or just better encryption, it would offer greater control. Giving people more agency than they have today, where so much of what happens is opaque to the end user, seems like a good, logical step.

“In the future internet, I’ll have all the things I need, and everyone has all the things they need to give them real autonomy and real human dignity,” said Wilcox. 

“They’ll have the ability to socialize and form connections with friends and family and whoever without any third party being able to intermediate either to supply or to censor or to influence their relationships.” 

The Walking Dead: Onslaught Review

Stepping into the shoes of fan favorite Walking Dead characters like Daryl Dixon, Rick Grimes, Michonne Hawthorne, and Carol Peletier through the magic of VR certainly has its moments. The Walking Dead: Onslaught doesn’t offer nearly as nuanced an experience as its spinoff counterpart Saints & Sinners from earlier this year, but by focusing much more on the action and channeling popular elements of AMC’s TV series, it aims to scratch a different itch altogether. Weirdly, though, a lot of its mechanics don’t feel built for VR, and it never does much to contribute to Walking Dead lore. So it’s just fine if you’re here for a good old-fashioned zombie-themed arcade shooter with a lot of guts and only a few brains.

The developers at Survios don’t waste any time getting the action going. From the very first moment, Onslaught plops you into a rescue mission, hands you a hefty gun, and shows you a nice, big, shambling herd of walkers to shoot at. That’s what Onslaught is all about, and aside from some item collection, it never really moves beyond it. This is disappointing, because the premise is something I’ve wanted to experience for quite a long time as a lapsed The Walking Dead TV series fan, and this scaled-down implementation really does feel like more of a generic zombie game with a little extra walker skin stretched over it.

Onslaught is split into two major modes: a short five-hour story campaign starring Norman Reedus as Daryl Dixon, and an infinitely replayable Supply Run mode where you grab as much loot as you can while outrunning an impenetrable wall of walkers. In both modes, you spend a large portion of time running up to items and grabbing them, which racks up a score that gradually unlocks new survivors and introduces new side quests, which really only boil down to rote fetch quests. They’re linked together in that progression through the main story is gated by how many survivors you’ve recruited overall, so Supply Run mode is clearly there to serve as a loot treadmill that buffers out the length of the campaign. It does double as a fun way to test out your newest and best weapons, though, so it’s generally the acceptable kind of padding.

Of course, the Supply Run mode can be great fun if you just want to run around and slice through a bunch of shambling undead. The key is that Survios has made walkers genuinely fun to kill. You can grab them by the neck and go with the ol’ one-two face stab, or you can shoot them until their limbs fall off. You can also lop off their individual limbs with a katana or a fire axe. Either way, there’s usually a lot of them around you at once, and the simple action of pushing through an entire swarm of them got my heart pumping at the best moments.

That said, this is no survival game, and because of that it never really builds up any meaningful tension or dread. While Saints & Sinners makes you worry about your weapons breaking down or ammo running out at the worst possible time, scarcity isn’t a problem in Onslaught. There’s no backpack or physics-based objects to finagle with either, which ironically takes a lot away from the clumsiness-fueled tension that made surviving Saints & Sinners such a joy in VR. In fact, I never came remotely close to getting killed, so I have no idea what happens when you die. The most dangerous position I found myself in was when I stood across a room full of zombies from an important door, and even then I just brainlessly stabbed my way through and went on with my business.

In its favor, Onslaught has a nice variety of comfort and movement options that each feel well-paced for VR play. You can walk around like you would in other VR games such as Asgard’s Wrath and Saints & Sinners, or you can go with teleportation or even an arm-swinger mode. There is an offering of convenience here that goes above and beyond, and it’s refreshing to see. The arm-swinger mode, which literally makes you move when you swing your arms, is just as fun and appropriately-placed here as it is in arena games like GORN or Hot Dogs, Horseshoes, and Hand Grenades.

What’s less fun is the way in which Onslaught attempts to offset its lack of challenge by making your guns feel underpowered. It’s to the point where you can unload several bullets into a walker’s head, only to have them get back up again (if you’re on higher difficulty levels). That’s pretty annoying and doesn’t feel true to the way walkers work on the show. Worse still, the reload process feels archaic and janky: instead of the traditional and satisfying interactivity of manually inserting a magazine and pulling back the slide, you just push a button and watch an animation in which your character does it for you at their own glacial pace. It looks okay, but it really slows down the natural pace of ranged combat we see in most VR shooters.

Luckily, the gun “feel” is pretty good; aiming and firing feels right, and each firearm, including the shotgun, packs the punch you’d expect from its real-life equivalent. Weapons are quickly selected and switched out in a radial menu that even brings the action to a halt while you choose. Partially because of that, melee weapons end up being some of the most powerful and useful in Onslaught. Between reloads, you can rapidly whip out your trusty knife, and even the toughest walkers go down with a single well-placed thrust to the nasal cavity. Since weapons don’t break and there’s no stamina system, it’s entirely possible to clear a room of walkers by hastily stabbing your way through it. This does feel great for a little while, but it grows repetitive and tiring by the end.

Collecting items is fundamental to progressing through Onslaught’s campaign, but it doesn’t feel good to do. To pick up an item, you simply point and tap the trigger button to make it disappear into an invisible inventory slot. That’s expected in a traditional game but really hurts the immersion in VR. It makes Onslaught’s world feel static compared to what we’ve come to expect after games like Saints & Sinners and Half-Life: Alyx. Adding to the disappointment, the world is flavorless: there are no physics objects and no real inventory management system, and much of the level design itself feels clunky. Obstacles and corridors are often placed in ways that make it unclear how to move through them, and I quickly noticed how many of the same decorations and buildings are reused in each level.

To its credit, the items you collect have some genuinely interesting uses. You can spend resources on upgrades for Alexandria, which serves as the primary hub town. It’s modeled exactly as it appears on TV, with good attention to detail, right down to the row of townhouses and that one solar panel. The upgrades you buy there, structures like the Town Hall and the Forge, generously improve crucial stats like your max health and how much ammo you find, making them well worth the cost. It’s also a nice touch to see the buildings change as you improve them. On top of that, you can invest in upgrading your weapons to make them even more satisfyingly deadly.

All of this looks and sounds just fine for a VR game in 2020, but the character performances and writing are mostly lackluster and stale. Without ruining anything, Onslaught doesn’t seem to have much to say or add to The Walking Dead TV universe, and there are plenty of moments where the delivery of its inconsequential story feels uninspired. The best writing easily goes to Eugene, played by the show’s Josh McDermitt; his awkward one-liners are as consistently well delivered as fans would expect.

Source

Letter: Top deputies accuse Texas attorney general of crimes

Several top deputies of Texas’ attorney general have accused him of crimes including bribery and abuse of office in an internal letter saying they’ve reported the actions to law enforcement

DALLAS — Several top deputies of Texas’ attorney general have reported to law enforcement that their boss engaged in crimes including bribery and abuse of office, according to an internal letter.

In a single-page letter to the director of human resources in the attorney general’s office, the seven senior lawyers wrote that they reported Republican Ken Paxton to “the appropriate law enforcement authority” for potentially breaking the law “in his official capacity as the current Attorney General of Texas.”

“We have a good faith belief that the attorney general is violating federal and/or state law including prohibitions related to improper influence, abuse of office, bribery and other potential criminal offenses,” the Thursday letter states. It was first reported jointly by the Austin American-Statesman and KVUE-TV and subsequently obtained by The Associated Press.

The letter does not offer specifics but nonetheless stands as a remarkable accusation of criminal wrongdoing against the state’s top law enforcement officer by his own staff, including some longtime supporters of his conservative Christian politics. It is likely to deepen legal trouble for Paxton, who has spent nearly his entire five years in office under felony indictment for securities fraud, although the case has stalled for years over legal challenges.

Philip Hilder, Paxton’s defense attorney in the securities case, declined to comment on the new allegations Sunday. Paxton pleaded not guilty in that case but it is not clear whether the new accusations are related.

In a statement to the American-Statesman, Paxton’s office said: “The complaint filed against Attorney General Paxton was done to impede an ongoing investigation into criminal wrongdoing by public officials including employees of this office. Making false claims is a very serious matter and we plan to investigate this to the fullest extent of the law.”

A spokeswoman for the attorney general did not immediately respond to an email and phone call Sunday.

The letter was signed by the deputy attorneys general for policy, administration, civil litigation, criminal investigations and legal counsel, as well as Paxton’s first assistant, Jeff Mateer, and Mateer’s deputy. None of them responded to messages seeking comment Saturday or Sunday.

Mateer resigned from Paxton’s office Friday to rejoin a prominent conservative nonprofit law firm in the Dallas area, according to the Dallas Morning News. The First Liberty Institute did not immediately respond to an inquiry about him Sunday.

The FBI, the U.S. Attorney’s Office for the Western District of Texas and a spokesman for Gov. Greg Abbott did not immediately respond to requests for comment Sunday.

Source

CEXs Vs. DEXs: The Future Battle Lines

Since the formal introduction of Ethereum in 2014, the network has exploded with products that allow users to transact directly with one another, without relying on a third party.

One of the most common use cases is that of a decentralized exchange (DEX), an idea that dates back to Vitalik Buterin’s unveiling of Ethereum in 2014. Examining the history of how DEXs have evolved can help elucidate where DEXs are headed and how they will compete with centralized exchanges. 

What’s in a DEX?

DEXs come in a variety of forms, but they share one common quality: they are non-custodial. A DEX uses smart contracts to manage funds on-chain, so users never have to trust a third party with their money. 

However, the exchange part of a DEX – the way buyers and sellers find each other – can vary widely from one implementation to another. When thinking about the future of DEXs, it’s helpful to first understand their past.

Alex Wearn is the co-founder and CEO of IDEX, a high-performance DEX. He has spent his career in software development, including time at Amazon, Adobe, and IBM. He has been hacking on crypto startups since 2014, transitioning to full time with the launch of IDEX in 2018.

The earliest Ethereum DEXs, like EtherEx and OasisDex, built a traditional central limit order book (CLOB) exchange entirely out of Ethereum smart contracts. Developers and users quickly discovered that order management and trade execution are not well suited to a blockchain. In particular, market makers placing and cancelling orders, and traders interacting with the on-chain order book, proved expensive and error-prone given the gas costs and latency of on-chain transactions.
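
To see why, consider what a CLOB has to do. The sketch below is a minimal, illustrative order book in Python, not the implementation of EtherEx, OasisDex, or any other DEX: on a fully on-chain design, every call to place(), cancel(), and the matching step is a separate transaction that pays gas and waits for a block, and market makers re-quote constantly.

```python
# Toy central limit order book (CLOB). Illustrative only; not any DEX's code.
# On an on-chain CLOB, each place(), cancel(), and match step is its own
# gas-paying Ethereum transaction subject to block latency.
import heapq
import itertools

class OrderBook:
    def __init__(self):
        self._ids = itertools.count()
        self.bids = []        # max-heap via negated price: (-price, seq, id, qty)
        self.asks = []        # min-heap: (price, seq, id, qty)
        self.cancelled = set()

    def place(self, side, price, qty):
        # On-chain, this single quote costs gas.
        order_id = next(self._ids)
        key = -price if side == "buy" else price
        heapq.heappush(self.bids if side == "buy" else self.asks,
                       (key, order_id, order_id, qty))
        return order_id

    def cancel(self, order_id):
        # Market makers re-quote constantly; on-chain, every cancel costs gas too.
        self.cancelled.add(order_id)

    def match(self):
        # Cross the book while the best bid meets or beats the best ask.
        trades = []
        while self.bids and self.asks:
            bid_key, bid_seq, bid_id, bid_qty = self.bids[0]
            ask_price, ask_seq, ask_id, ask_qty = self.asks[0]
            if bid_id in self.cancelled:
                heapq.heappop(self.bids)
                continue
            if ask_id in self.cancelled:
                heapq.heappop(self.asks)
                continue
            if -bid_key < ask_price:
                break  # spread not crossed; nothing to do
            fill = min(bid_qty, ask_qty)
            trades.append((ask_price, fill))
            heapq.heappop(self.bids)
            heapq.heappop(self.asks)
            if bid_qty > fill:   # re-queue the unfilled remainder
                heapq.heappush(self.bids, (bid_key, bid_seq, bid_id, bid_qty - fill))
            if ask_qty > fill:
                heapq.heappush(self.asks, (ask_price, ask_seq, ask_id, ask_qty - fill))
        return trades

book = OrderBook()
book.place("buy", 101.0, 5)
book.place("sell", 100.5, 3)
print(book.match())  # [(100.5, 3)] -- 3 units trade at the resting ask price
```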

Off-chain order books

In mid-2016, a new exchange, EtherDelta, innovated on this model by bringing the order book off-chain. This design eliminated the gas cost of creating orders and reduced the latency of placing them. 

While it was a major improvement, users still paid gas to cancel orders, a cost that kept market makers from providing liquidity at scale. Additionally, takers submitted their own trades to the network, creating on-chain “trade collisions” when multiple takers competed for the same order. On peak days, up to 30% of trades failed due to these collisions.
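
Here is a toy illustration of that collision problem (purely illustrative, not EtherDelta’s contract logic): two takers race to fill the same resting order on-chain, and whichever transaction is mined second reverts while still paying gas.

```python
# Toy model of an on-chain "trade collision": two takers target the same
# resting order, and only the transaction mined first succeeds.
class RestingOrder:
    def __init__(self, remaining: int):
        self.remaining = remaining

def take(order: RestingOrder, qty: int) -> str:
    if order.remaining < qty:
        # On Ethereum the transaction reverts, but the taker still pays gas.
        return "reverted: order already filled (gas spent anyway)"
    order.remaining -= qty
    return "filled"

order = RestingOrder(remaining=10)
print(take(order, 10))  # taker A's transaction is mined first -> "filled"
print(take(order, 10))  # taker B collides -> "reverted: order already filled"
```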

Although the first iterations of DEXs faded over time, they were innovative, forward-thinking, and laid the groundwork for models in use today. 

Off-chain execution, on-chain settlement

Improving on the earliest DEX models, the next generation of DEXs, including IDEX and DDEX, explored a hybrid approach that moved both the order book and trade execution off-chain. With off-chain execution, users still choose the orders they want to fill, but they submit them to the exchange, which executes the trade and relays it to the network for settlement. This approach eliminates on-chain trade collisions, gas fees for canceled orders, and front-running, and it served as the dominant trading model for almost two years. 
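
As a rough sketch of that division of labor (the names and the escrow stub below are hypothetical, not IDEX’s or DDEX’s actual protocol): orders are signed and matched off-chain, and only the resulting trade touches the chain, where a settlement contract verifies the signatures and moves funds it already holds in escrow.

```python
# Hypothetical sketch of "off-chain execution, on-chain settlement".
# SignedOrder, off_chain_match, and EscrowStub are illustrative names only.
from dataclasses import dataclass

@dataclass
class SignedOrder:
    trader: str
    sell_token: str
    buy_token: str
    amount: int          # quantity of the base asset (LINK in the example below)
    price: float
    signature: str       # produced off-chain with the trader's private key

def off_chain_match(maker: SignedOrder, taker: SignedOrder) -> dict:
    """The operator matches orders instantly off-chain: no gas, no cancel
    fees, no trade collisions. Only the resulting fill goes to the chain."""
    assert maker.sell_token == taker.buy_token
    assert maker.buy_token == taker.sell_token
    return {"maker": maker, "taker": taker, "fill": min(maker.amount, taker.amount)}

class EscrowStub:
    """Stand-in for the on-chain settlement contract holding user deposits."""
    def verify(self, order: SignedOrder) -> None:
        pass  # a real contract would recover the signer from the signature
    def transfer(self, frm: str, to: str, token: str, amount: float) -> None:
        print(f"{frm} -> {to}: {amount} {token}")

def on_chain_settle(trade: dict, contract: EscrowStub) -> None:
    """Only this step is an on-chain transaction; custody never leaves the contract."""
    contract.verify(trade["maker"])
    contract.verify(trade["taker"])
    fill = trade["fill"]
    contract.transfer(trade["maker"].trader, trade["taker"].trader,
                      trade["maker"].sell_token, fill)
    contract.transfer(trade["taker"].trader, trade["maker"].trader,
                      trade["taker"].sell_token, fill * trade["maker"].price)

maker = SignedOrder("alice", "LINK", "ETH", 100, 0.01, "0xabc...")
taker = SignedOrder("bob", "ETH", "LINK", 100, 0.01, "0xdef...")
on_chain_settle(off_chain_match(maker, taker), EscrowStub())
```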

However, this design is not without flaws. Without a matching engine, trade execution suffers, and gas settlement costs and network congestion remain problematic. Identifying these drawbacks hints at even more user-friendly DEX models in the near future.

While this design was garnering the most users and volume, a novel DEX model joined the fray: the automated market maker (AMM), popularized by Uniswap.

The rise of automated market makers

The AMM design is a creative response to the limits of hosting an order book on-chain. As we’ve discussed, many of the early CLOB DEXs struggled because it’s both expensive and slow for users to update their orders on a blockchain.

Uniswap responded by removing the order book altogether, replacing it with a simple on-chain formula.
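
That formula is the constant product rule, x * y = k: the pool’s two reserves must multiply to the same constant before and after every swap. Here is a minimal sketch of how it prices a trade; the reserve numbers are made up, and the 997/1000 terms mirror the widely documented Uniswap v2 arithmetic for its 0.3% fee.

```python
# Constant product (x * y = k) pricing sketch. Reserve figures are invented;
# the fee arithmetic follows the commonly cited Uniswap v2 formula.
def get_amount_out(amount_in: int, reserve_in: int, reserve_out: int) -> int:
    """Size the output so the product of the reserves never decreases."""
    amount_in_with_fee = amount_in * 997          # 0.3% fee stays in the pool
    numerator = amount_in_with_fee * reserve_out
    denominator = reserve_in * 1000 + amount_in_with_fee
    return numerator // denominator

# A pool holding 1,000 ETH and 400,000 USDC implies a spot price of ~400 USDC/ETH.
eth_reserve = 1_000 * 10**18       # ETH uses 18 decimals
usdc_reserve = 400_000 * 10**6     # USDC uses 6 decimals
print(get_amount_out(10**18, eth_reserve, usdc_reserve) / 10**6)
# ~398.4 USDC for 1 ETH -- a bit under spot, because the trade moves along the curve
```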

This architecture ultimately allowed Uniswap to achieve phenomenal growth. The “always on,” permissionless liquidity made it a great solution for other applications to build on top of. The fully-decentralized architecture has led to a resurgence of ICOs in the form of Uniswap direct listings, as projects can easily deploy their own liquidity pool to jumpstart trading of a new asset. The liquidity pool structure also makes it easy for non-technical users to commit capital and earn a passive reward from trade fees and liquidity mining.

See also: What Is DeFi?

In spite of these numerous benefits, experts speculate that AMMs in their current form are merely a stepping stone in the evolution of DEX design, and many question their long-term viability. As a rule, these products provide a less flexible version of market making than their centralized counterparts, and they will lag in markets that require sophisticated analytics and human intervention.

Despite the many advantages of the Ethereum network, it’s clear that the traditional CLOB exchange doesn’t work well when operating on a decentralized network with such high latency and low throughput. As a result, DEX development will primarily continue down three paths: new types of AMMs, CLOBs on faster chains, and upgraded hybridized models.

New AMM models

AMMs have played an important role in DEX development, addressing key performance issues by removing the order book altogether and pricing assets using a static, on-chain function. Uniswap deployed the first example of these, the constant product function, which creates a specific type of pricing curve. Competitors like Curve have experimented with different functions, in this case choosing one better suited to assets that the market expects to trade at roughly the same price, such as stablecoins or different types of wrapped bitcoin.
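
As a rough intuition for why the choice of function matters (this is not Curve’s actual StableSwap invariant, which blends the two extremes below), compare a constant product pool with a constant sum pool on a pair of like-priced assets:

```python
# Toy comparison of AMM pricing curves. Curve's real StableSwap invariant
# interpolates between these two extremes; the numbers below are invented.
def constant_product_out(amount_in: float, reserve_in: float, reserve_out: float) -> float:
    """x * y = k: the price moves with every trade (suits volatile pairs)."""
    k = reserve_in * reserve_out
    return reserve_out - k / (reserve_in + amount_in)

def constant_sum_out(amount_in: float, reserve_in: float, reserve_out: float) -> float:
    """x + y = k: trades 1:1 with zero slippage, but only works while the
    pool has inventory and the assets really do stay at the same price."""
    return min(amount_in, reserve_out)

# Swapping 100,000 of one stablecoin into a 1,000,000 / 1,000,000 pool:
print(constant_product_out(100_000, 1_000_000, 1_000_000))  # ~90,909 -- heavy slippage
print(constant_sum_out(100_000, 1_000_000, 1_000_000))      # 100,000 -- no slippage
```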

As these products evolve and address more specific use cases, they will likely remain in demand given their on-chain availability and ease of liquidity provision. However, it is unlikely that they will supplant CLOBs as the dominant form of trading, as they are fundamentally a less flexible form of exchange.

Instead of adapting to the network’s constraints, new projects like Serum are attempting to move to a different network where the constraints aren’t as severe. By using a more performant underlying network, one with higher throughput and faster consensus times, the team hopes to eliminate the UX issues that plague V1 order book DEXs.

However, at its core, trade matching and execution is a consensus problem: determining who came first, which trades should execute and in what order. A decentralized network, which by design has to reach consensus across many different nodes, can never compete at the same level as its centralized counterparts.

Hybridized models, like IDEX 2.0, aim to combine the power and performance of a centralized exchange with the security of decentralized custody and settlement. By pairing the high-performance trading engine of a centralized exchange with the on-chain custody of a DEX, users get the trading experience they know and love without having to put their funds at risk. 

See also: KuCoin CEO Says Suspects in $281M Hack Identified; Authorities on the Case

DEXs have come a long way. From the clunky, fully on-chain approaches of the earliest days to today’s wide variety of options, each evolution has come closer to delivering a product capable of both performance and security. Regardless of what flavor they come in, the future will see DEXs challenge centralized exchanges by finally separating custody from the exchange altogether.

It’s been several years since the first iteration of the crypto exchange, but systemic issues remain, creating multi-million-dollar problems for traders. Just this week, a hacker drained KuCoin of approximately $150 million in crypto assets. That was closely followed by the U.S. Commodity Futures Trading Commission (CFTC) and the Department of Justice (DOJ) filing civil and criminal charges against BitMEX for AML and KYC failures.

All these issues form the backdrop to the ongoing battle between CEX and DEX environments. Regulatory and security concerns underscore the growing need for traders to maintain custody of their assets while staying compliant. CEXs offer convenience for traders, but they come with security and seizure risk. DEXs, while more absolute in terms of asset control and security, offer little in the way of regulatory oversight.  
