Commit

fix: properly detect, specify, and handle connecting forked network to non-fork [APE-1393] (#156)
antazoey committed Sep 15, 2023
1 parent 13cbc78 commit e6b818d
Showing 11 changed files with 147 additions and 137 deletions.
10 changes: 5 additions & 5 deletions .pre-commit-config.yaml
@@ -1,6 +1,6 @@
repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v4.2.0
rev: v4.4.0
hooks:
- id: check-yaml

@@ -10,24 +10,24 @@ repos:
- id: isort

- repo: https://github.com/psf/black
rev: 23.3.0
rev: 23.9.1
hooks:
- id: black
name: black

- repo: https://github.com/pycqa/flake8
rev: 6.0.0
rev: 6.1.0
hooks:
- id: flake8

- repo: https://github.com/pre-commit/mirrors-mypy
rev: v0.991
rev: v1.5.1
hooks:
- id: mypy
additional_dependencies: [types-PyYAML, types-requests, types-setuptools, pydantic]

- repo: https://github.com/executablebooks/mdformat
rev: 0.7.16
rev: 0.7.17
hooks:
- id: mdformat
additional_dependencies: [mdformat-gfm, mdformat-frontmatter]
25 changes: 22 additions & 3 deletions README.md
@@ -1,6 +1,8 @@
# Quick Start

Hardhat network provider plugin for Ape. Hardhat is a development framework written in Node.js for Ethereum that includes a local network implementation.
This is a Hardhat network provider plugin for Ape.
Hardhat is a development framework written in Node.js for Ethereum that includes a local network implementation.
Use this plugin to manage a Hardhat node process or connect to an existing one.
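
For instance, once the plugin is installed, one way to connect from Python is through ape's network manager (a minimal sketch; it assumes ape's default network names and that this plugin is installed):

```python
from ape import networks

# Connect to (and, if needed, auto-start) a local Hardhat node through this plugin.
with networks.parse_network_choice("ethereum:local:hardhat") as provider:
    print(provider.uri)  # e.g. http://127.0.0.1:8545
    print(provider.get_block("latest").number)
```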

## Dependencies

@@ -42,16 +44,18 @@ This network provider takes additional Hardhat-specific configuration options. T

```yaml
hardhat:
  port: 8555
  host: 127.0.0.1:8555
```
To select a random port, use a value of "auto":
```yaml
hardhat:
  port: auto
  host: auto
```
**NOTE**: If you plan on running multiple Hardhat nodes of any kind, you will likely want to use `auto` or configure multiple hosts (see examples below).

This is useful for multiprocessing and starting up multiple providers.
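
For example, a rough sketch of what that enables (hypothetical script; assumes `host: auto` is configured and that ape plus this plugin are installed): each worker process gets its own Hardhat node on its own port.

```python
from multiprocessing import Process

from ape import networks


def run_worker() -> None:
    # Each process connects separately; with `host: auto`, each connection
    # starts its own Hardhat node on a random free port.
    with networks.parse_network_choice("ethereum:local:hardhat") as provider:
        print(provider.uri)


if __name__ == "__main__":
    workers = [Process(target=run_worker) for _ in range(2)]
    for worker in workers:
        worker.start()
    for worker in workers:
        worker.join()
```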

You can also adjust the request timeout setting:
@@ -86,6 +90,21 @@ Otherwise, it defaults to the default mainnet provider plugin. You can also spec

**NOTE**: Make sure you have the upstream provider plugin installed for ape.

If you wish to run both a forked network and the local Hardhat network simultaneously, you may configure a separate host for the forked network(s).

```yaml
hardhat:
  fork:
    ethereum:
      mainnet:
        upstream_provider: alchemy
        host: 127.0.0.1:8555
    polygon:
      mainnet:
        upstream_provider: alchemy
        host: 127.0.0.1:8556
```
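
With a layout like this, each forked network can be connected to independently, for example (a sketch; it assumes the upstream provider plugin and its API key are set up):

```python
from ape import networks

# Connect to the mainnet fork configured above; the local-only Hardhat
# network can keep running on its own host at the same time.
with networks.parse_network_choice("ethereum:mainnet-fork:hardhat") as provider:
    print(provider.uri)  # http://127.0.0.1:8555 per the config above
```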

[Hardhat deployments](https://github.com/wighawag/hardhat-deploy#deploy-scripts-tags-and-dependencies) are disabled for forks for performance reasons. If you want your contract deployments to run on your fork, you can set `enable_hardhat_deployments` to `true` for the forked network in your config.

141 changes: 105 additions & 36 deletions ape_hardhat/provider.py
@@ -33,7 +33,7 @@
from eth_utils import is_0x_prefixed, is_hex, to_hex
from evm_trace import CallType
from evm_trace import TraceFrame as EvmTraceFrame
from evm_trace import get_calltree_from_geth_trace
from evm_trace import create_trace_frames, get_calltree_from_geth_trace
from hexbytes import HexBytes
from pydantic import BaseModel, Field
from semantic_version import Version # type: ignore
@@ -168,9 +168,28 @@ class PackageJson(BaseModel):


class HardhatForkConfig(PluginConfig):
host: Optional[Union[str, Literal["auto"]]] = None
"""
The host address or ``"auto"`` to use localhost with a random port (with attempts).
If ``host`` is also specified in the root config, this value takes precedence for
this network.
"""

upstream_provider: Optional[str] = None
"""
The name of the upstream provider, such as ``alchemy`` or ``infura``.
"""

block_number: Optional[int] = None
"""
The block number to fork from. It is recommended to set this.
"""

enable_hardhat_deployments: bool = False
"""
Set to ``True`` if using the ``hardhat-deploy`` plugin and you wish
for those deployments to still run on your fork.
"""


class HardhatNetworkConfig(PluginConfig):
@@ -207,6 +226,54 @@ class Config:
extra = "allow"


class ForkedNetworkMetadata(BaseModel):
"""
Metadata from the RPC ``hardhat_metadata``.
"""

chain_id: int = Field(alias="chainId")
"""
The chain ID of the network being forked.
"""

fork_block_number: int = Field(alias="forkBlockNumber")
"""
The number of the block that the network forked from.
"""

fork_block_hash: str = Field(alias="forkBlockHash")
"""
The hash of the block that the network forked from.
"""


class NetworkMetadata(BaseModel):
"""
Metadata from the RPC ``hardhat_metadata``.
"""

client_version: str = Field(alias="clientVersion")
"""
A string identifying the version of Hardhat, for debugging purposes,
not meant to be displayed to users.
"""

instance_id: str = Field(alias="instanceId")
"""
A 0x-prefixed, hex-encoded 32-byte ID which uniquely identifies an
instance/run of Hardhat Network. Running Hardhat Network more than
once (even with the same version and parameters) will always result
in different instanceIds. Running hardhat_reset will change the
instanceId of an existing Hardhat Network.
"""

forked_network: Optional[ForkedNetworkMetadata] = Field(None, alias="forkedNetwork")
"""
An object with information about the forked network. This field is
only present when Hardhat Network is forking another chain.
"""


def _call(*args):
return call([*args], stderr=PIPE, stdout=PIPE, stdin=PIPE)

@@ -305,11 +372,16 @@ def node_bin(self) -> str:
def project_folder(self) -> Path:
return self.config_manager.PROJECT_FOLDER

@property
def config_host(self) -> Optional[str]:
# NOTE: Overridden in forked networks.
return self.config.host

@property
def uri(self) -> str:
if self._host is not None:
return self._host
if config_host := self.config.host:
if config_host := self.config_host:
if config_host == "auto":
self._host = "auto"
return self._host
@@ -327,6 +399,7 @@ def uri(self) -> str:
self._host = f"{self._host}:{DEFAULT_PORT}"
else:
self._host = f"http://127.0.0.1:{DEFAULT_PORT}"

return self._host

@property
@@ -368,6 +441,17 @@ def hardhat_config_file(self) -> Path:

return path.expanduser().absolute()

@property
def metadata(self) -> NetworkMetadata:
"""
Get network metadata, including forked-network metadata.
If the network is not a fork, the ``forked_network`` field will be ``None``.
This is a helpful way of determining whether a Hardhat node is a fork
when connecting to a remote Hardhat network.
"""
metadata = self._make_request("hardhat_metadata", [])
return NetworkMetadata.parse_obj(metadata)
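
For example, a caller could use this roughly as follows (a sketch; ``provider`` is assumed to be a connected instance of this provider class):

```python
# Decide whether the connected Hardhat node is a fork before doing fork-only work.
if provider.metadata.forked_network is None:
    print("Connected to a plain local Hardhat network.")
else:
    fork = provider.metadata.forked_network
    print(f"Forked from chain {fork.chain_id} at block {fork.fork_block_number}.")
```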

@cached_property
def _test_config(self) -> TestConfig:
return cast(TestConfig, self.config_manager.get_config("test"))
@@ -396,15 +480,6 @@ def package_is_plugin(package: str) -> bool:

return plugins

@property
def gas_price(self) -> int:
# TODO: Remove this once Ape > 0.6.13
result = super().gas_price
if isinstance(result, str) and is_0x_prefixed(result):
return int(result, 16)

return result

def _has_hardhat_plugin(self, plugin_name: str) -> bool:
return next((True for plugin in self._hardhat_plugins if plugin == plugin_name), False)

@@ -428,7 +503,7 @@ def connect(self):
logger.warning(warning)
self._host = f"http://127.0.0.1:{self.provider_settings['port']}"

elif self.config.port != DEFAULT_PORT and self.config.host is not None:
elif self.config.port != DEFAULT_PORT and self.config_host is not None:
raise HardhatProviderError(
"Cannot use deprecated `port` field with `host`. "
"Place `port` at the end of `host` instead."
@@ -749,8 +824,7 @@ def get_transaction_trace(self, txn_hash: str) -> Iterator[TraceFrame]:
def _get_transaction_trace(self, txn_hash: str) -> Iterator[EvmTraceFrame]:
result = self._make_request("debug_traceTransaction", [txn_hash])
frames = result.get("structLogs", [])
for frame in frames:
yield EvmTraceFrame(**frame)
yield from create_trace_frames(frames)

def get_call_tree(self, txn_hash: str) -> CallTreeNode:
receipt = self.chain_manager.get_receipt(txn_hash)
@@ -915,31 +989,26 @@ def _upstream_provider(self) -> ProviderAPI:
# NOTE: if 'upstream_provider_name' is 'None', this gets the default mainnet provider.
return upstream_network.get_provider(provider_name=upstream_provider_name)

def connect(self):
super().connect()

# Verify that we're connected to a Hardhat node with mainnet-fork mode.
self._upstream_provider.connect()
@property
def config_host(self) -> Optional[str]:
# First, attempt to get the host from the forked config.
if host := self._fork_config.host:
return host

try:
upstream_genesis_block_hash = self._upstream_provider.get_block(0).hash
except ExtraDataLengthError as err:
if isinstance(self._upstream_provider, Web3Provider):
logger.error(
f"Upstream provider '{self._upstream_provider.name}' "
f"missing Geth PoA middleware."
)
self._upstream_provider.web3.middleware_onion.inject(geth_poa_middleware, layer=0)
upstream_genesis_block_hash = self._upstream_provider.get_block(0).hash
else:
raise HardhatProviderError(f"Unable to get genesis block: {err}.") from err
return super().config_host

self._upstream_provider.disconnect()
def connect(self):
super().connect()

if self.get_block(0).hash != upstream_genesis_block_hash:
logger.warning(
"Upstream network has mismatching genesis block. "
"This could be an issue with hardhat."
if not self.metadata.forked_network:
# This will fail when trying to connect hardhat-fork to
# a non-forked network.
raise HardhatProviderError(
"Network is not a fork. "
"Hardhat is likely already running on the local network. "
"Try using config:\n\n(ape-config.yaml)\n```\nhardhat:\n "
"host: auto\n```\n\nso that multiple processes can automatically "
"use different ports."
)

def build_command(self) -> List[str]:
8 changes: 4 additions & 4 deletions setup.py
@@ -15,14 +15,14 @@
"rich", # Needed for trace tests
],
"lint": [
"black>=23.3.0,<24", # auto-formatter and linter
"mypy>=0.991,<1", # Static type analyzer
"black>=23.9.1,<24", # auto-formatter and linter
"mypy>=1.5.1,<2", # Static type analyzer
"types-PyYAML", # Needed due to mypy typeshed
"types-setuptools", # Needed for mypy typeshed
"types-requests", # Needed due to mypy typeshed
"flake8>=6.0.0,<7", # Style linter
"flake8>=6.1.0,<7", # Style linter
"isort>=5.10.1,<6", # Import sorting linter
"mdformat>=0.7.16", # Auto-formatter for markdown
"mdformat>=0.7.17", # Auto-formatter for markdown
"mdformat-gfm>=0.3.5", # Needed for formatting GitHub-flavored markdown
"mdformat-frontmatter>=0.4.1", # Needed for frontmatters-style headers in issue templates
],
7 changes: 7 additions & 0 deletions tests/ape-config.yaml
@@ -19,12 +19,19 @@ hardhat:
mainnet:
upstream_provider: alchemy
block_number: 17040366
host: 127.0.0.1:7110
goerli:
upstream_provider: alchemy
block_number: 7849922
host: 127.0.0.1:7111
sepolia:
upstream_provider: alchemy
block_number: 3091950
host: 127.0.0.1:7112

polygon:
mumbai:
host: 127.0.0.1:7113

test:
# `false` because running pytest within pytest.
1 change: 0 additions & 1 deletion tests/data/contracts/ethereum/local/reverts_contract.json

This file was deleted.

