From Kafka Stream
Ingest hydrated events from a hosted Kafka stream (as compared to dehydrated events from a gRPC hub)
With Kafka, you can subscribe to the same data that we use for sending webhook notifications.
To get the entire dataset, Kafka is best paired with one of our other data products (such as Parquet).
Kafka is not suitable for building a database with all Farcaster data from day 1; our Kafka topics currently retain data for 14 days. It is a good solution for streaming recent data in real time (P95 data latency of <1.5s).
Why
If you’re using Hub gRPC streaming, you get dehydrated events that you have to stitch together yourself before they are useful (see here for an example). With Neynar’s Kafka stream, you get a fully hydrated event (e.g., user.created) that you can use in your app or product immediately. See the comparison between a gRPC hub event and a Kafka event below.
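As a rough, hypothetical sketch of the difference (field names are illustrative, not the authoritative schemas):

```
// Dehydrated hub event (gRPC): a raw message keyed by fid; profile fields
// must be fetched and joined separately before the event is useful.
{ "type": "MERGE_MESSAGE", "message": { "data": { "fid": 1234, "type": "USER_DATA_ADD", ... } } }

// Hydrated Neynar Kafka event: the assembled object, ready to use as-is.
{ "event_type": "user.created", "data": { "fid": 1234, "username": "alice", "display_name": "Alice", ... } }
```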
How
- Reach out; we will create credentials for you and send them via 1Password.
- For authentication, the connection requires SASL/SCRAM-SHA-512.
- The connection requires TLS (sometimes called SSL for legacy reasons) for encryption.
- `farcaster-mainnet-events` is the primary topic name. There may be more topics in the future, but for now there is just one. It has two partitions.
There are three brokers available over the Internet. Provide them all to your client:
b-1-public.tfmskneynar.5vlahy.c11.kafka.us-east-1.amazonaws.com:9196
b-2-public.tfmskneynar.5vlahy.c11.kafka.us-east-1.amazonaws.com:9196
b-3-public.tfmskneynar.5vlahy.c11.kafka.us-east-1.amazonaws.com:9196
Most clients accept the brokers as a comma-separated list:
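```
b-1-public.tfmskneynar.5vlahy.c11.kafka.us-east-1.amazonaws.com:9196,b-2-public.tfmskneynar.5vlahy.c11.kafka.us-east-1.amazonaws.com:9196,b-3-public.tfmskneynar.5vlahy.c11.kafka.us-east-1.amazonaws.com:9196
```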
You can use kcat (formerly kafkacat) to test things locally:
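For example (a sketch; `$KAFKA_USERNAME` and `$KAFKA_PASSWORD` are placeholders for the credentials delivered via 1Password):

```bash
# Consume from the topic over SASL_SSL with SCRAM-SHA-512 authentication
kcat -C \
  -b b-1-public.tfmskneynar.5vlahy.c11.kafka.us-east-1.amazonaws.com:9196,b-2-public.tfmskneynar.5vlahy.c11.kafka.us-east-1.amazonaws.com:9196,b-3-public.tfmskneynar.5vlahy.c11.kafka.us-east-1.amazonaws.com:9196 \
  -t farcaster-mainnet-events \
  -X security.protocol=SASL_SSL \
  -X sasl.mechanisms=SCRAM-SHA-512 \
  -X sasl.username="$KAFKA_USERNAME" \
  -X sasl.password="$KAFKA_PASSWORD"
```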
Example output: newline-delimited JSON events, one per message (the exact fields are covered under the data schema below).
Consumer Node.js example
https://github.com/neynarxyz/farcaster-examples/tree/main/neynar-webhook-kafka-consumer
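For reference, a minimal consumer sketch, assuming the kafkajs client (the linked repo may use a different library); the client id, group id, and environment variable names are placeholders:

```typescript
import { Kafka } from "kafkajs";

const kafka = new Kafka({
  clientId: "my-farcaster-consumer", // placeholder client id
  brokers: [
    "b-1-public.tfmskneynar.5vlahy.c11.kafka.us-east-1.amazonaws.com:9196",
    "b-2-public.tfmskneynar.5vlahy.c11.kafka.us-east-1.amazonaws.com:9196",
    "b-3-public.tfmskneynar.5vlahy.c11.kafka.us-east-1.amazonaws.com:9196",
  ],
  ssl: true, // TLS encryption
  sasl: {
    mechanism: "scram-sha-512", // SASL/SCRAM-SHA-512 authentication
    username: process.env.KAFKA_USERNAME!, // credentials from 1Password
    password: process.env.KAFKA_PASSWORD!,
  },
});

const consumer = kafka.consumer({ groupId: "my-app" }); // placeholder group id

async function main() {
  await consumer.connect();
  await consumer.subscribe({ topics: ["farcaster-mainnet-events"] });
  await consumer.run({
    // Each message value is a fully hydrated JSON event (e.g., user.created).
    eachMessage: async ({ partition, message }) => {
      const event = JSON.parse(message.value?.toString() ?? "null");
      console.log(`partition ${partition}:`, event);
    },
  });
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```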
Data schema