Ethereum smart contracts today are capped at about 24KB of deployed bytecode (24,576 bytes, per EIP-170), and much of the DeFi ecosystem depends on a web of multiple interacting contracts.
There’s
no
reason
to
think
that
we
can’t
see
a
future
where
smart
contracts
are
in
the
megabyte
size,
including
capabilities
like
embedded
machine-learning
models
or
complex
decision
trees.
The
idea
that
we
should
have
a
24KB
limit
on
smart
contracts
will,
in
time,
seem
as
antiquated
as
the
640kb
main
memory
limit
on
early
PCs.
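The 24KB figure comes from EIP-170, which caps deployed bytecode at 24,576 bytes. As a minimal illustrative sketch (the function name `fits_on_chain` is made up for this example), here is how one might check whether hex-encoded deployed bytecode stays under that cap:

```python
# EIP-170 caps deployed contract bytecode at 24,576 bytes (~24KB).
EIP170_LIMIT = 24_576

def fits_on_chain(bytecode_hex: str) -> bool:
    """Return True if the deployed bytecode stays under the EIP-170 cap."""
    raw = bytes.fromhex(bytecode_hex.removeprefix("0x"))
    return len(raw) <= EIP170_LIMIT

# Hypothetical example: 100 bytes of bytecode easily fits.
print(fits_on_chain("0x" + "60" * 100))  # → True
```

Contracts that would exceed this cap are typically split across multiple deployed contracts, which is one reason the "web of contracts" pattern is so common.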
To
understand
how
these technical improvements
will
change
the
world
of
blockchains,
it’s
worth
looking
at
how
we
got
here
in
the
first
place:
blockchains
use
lots
of
computing
power
in
a
way
that
many
would
have,
once
upon
a
time,
considered
very
wasteful.
Again,
if
you
go
back
to
the
early
days
of
computing,
memory
and
compute
resources
were
so
scarce
that
people stored only the last two digits of the year (dropping the "19" in "1985") to save space.
A
proof
of
work
system
with
thousands
of
parallel
processes
would
have
been
considered
impossibly
wasteful.
The problem with blockchains is that they get their security and value from doing the same work over and over.
Every node checks balances, re-runs calculations, verifies everyone else's results, and works toward consensus.
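That redundancy can be made concrete with a toy sketch (this is illustrative only, not a real consensus protocol; the names `apply_tx` and `network_consensus` are invented for the example): every node independently re-executes the same transaction, and the network accepts a result only when a majority of nodes computed the same answer.

```python
from collections import Counter

def apply_tx(balances: dict, tx: tuple) -> dict:
    """Apply a (sender, receiver, amount) transfer to a balance table."""
    sender, receiver, amount = tx
    new = dict(balances)
    new[sender] -= amount
    new[receiver] = new.get(receiver, 0) + amount
    return new

def network_consensus(nodes: int, balances: dict, tx: tuple) -> dict:
    # Every node redundantly does the same work -- the "wasteful" part.
    results = [tuple(sorted(apply_tx(balances, tx).items()))
               for _ in range(nodes)]
    # Accept only the result a majority of nodes agree on.
    winner, votes = Counter(results).most_common(1)[0]
    assert votes > nodes // 2, "no majority"
    return dict(winner)

print(network_consensus(5, {"alice": 10, "bob": 0}, ("alice", "bob", 3)))
# → {'alice': 7, 'bob': 3}
```

Five nodes each compute the transfer independently; a single trusted node could have done the same transfer once, with one fifth of the work.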
If we could just pick one trustworthy party to manage the whole process, we could do it all with 99% less effort.
The
problem
is
that
we
are,
currently,
rather
depressingly
short
of
trustworthy
central
authorities.