SNEHASISH DUTTA

Data Engineer as a Real-Time Algo Trader – Turning Pipelines into Profit (or at Least Trying)!

In an era where data drives decisions, this project explores the intersection of trading and real-time analytics.

Using Alpaca’s paper trading API, market and news data is streamed into Redpanda and analyzed with Apache Flink to extract actionable insights.
Sentiment analysis powers buy and sell signals, seamlessly delivered via Slack, creating a streamlined and responsive trading workflow.

GitHub :: https://github.com/snepar/flink-algo-trading

Data Flow Diagram (Architecture)

[Image: architecture diagram]

Introducing the Building Blocks

Here is a quick overview of the purpose of each component in the pipeline.

Alpaca :: Alpaca Trading APIs offer commission-free, programmatic access to U.S. stock and ETF trading through a modern REST interface. The platform stands out for its developer-friendly paper trading environment, which allows risk-free testing of trading strategies using real market data. With comprehensive SDKs in multiple languages (Python, JavaScript, Go), real-time WebSocket data streams, and support for various order types (market, limit, stop), Alpaca enables developers to quickly build and test algorithmic trading systems. Whether you're developing a trading strategy or building a full-scale automated trading platform, Alpaca's combination of zero-cost paper trading and production-ready infrastructure makes it an ideal choice for both learning and deployment.
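
For a quick taste of the API, a minimal connectivity check against the paper environment might look like this (a sketch using the alpaca-trade-api SDK; the key placeholders are yours to fill in):

# Sketch: verify paper-trading connectivity (not from the repo)
from alpaca_trade_api import REST

api = REST(key_id='YOUR_KEY', secret_key='YOUR_SECRET',
           base_url='https://paper-api.alpaca.markets')
account = api.get_account()
print(account.status, account.buying_power)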

VADER :: (Valence Aware Dictionary and sEntiment Reasoner) is a powerful sentiment analysis tool specifically designed for social media text. It's part of the NLTK library and uses a combination of lexical features and rule-based analysis to assess text sentiment.
Key Features:

  • Pre-built sentiment lexicon with emotion-word ratings
  • Handles informal language, emojis, and social media context
  • Considers punctuation and capitalization for emphasis
  • Returns scores for positive, negative, neutral, and compound sentiment
  • No training required (rule-based approach)

Advantages:

  • Social Media Optimized: Accurately handles slang, emoticons, and informal language common in social media posts
  • Fast Processing: Being rule-based, it's computationally efficient and doesn't require model training
  • Context Sensitive: Understands sentiment intensifiers, contractions, and negations
  • Multiple Scores: Provides granular sentiment analysis with separate scores for different sentiments
  • Easy Integration: Simple to use with just a few lines of code
  • Domain Adaptable: Works well across various domains, from social media to product reviews

The compound score (-1 to +1) makes it particularly useful for quick sentiment classification, while the individual positive, negative, and neutral scores provide deeper insight into the text's emotional content.
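
For instance, a minimal check of these scores (a sketch; it assumes the vader_lexicon resource has already been downloaded, as shown later in this post):

# Sketch: inspect VADER's four scores for a sample headline
from nltk.sentiment.vader import SentimentIntensityAnalyzer

sia = SentimentIntensityAnalyzer()
print(sia.polarity_scores('Apple smashes earnings expectations!'))
# -> {'neg': ..., 'neu': ..., 'pos': ..., 'compound': ...}, with compound in [-1, +1]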

Redpanda :: Redpanda is a modern streaming data platform designed as a Kafka API-compatible alternative with significantly improved performance and simplified operations. It's written in C++ and provides a zero-copy architecture, eliminating the need for a separate JVM, Zookeeper, or replication controller. Redpanda offers real-time data streaming with sub-millisecond latency, making it ideal for high-throughput scenarios like algorithmic trading. The platform stands out for its self-tuning capabilities, transparent data replication, and ability to handle millions of events per second while maintaining data consistency.
Key advantages include:

  1. Kafka API compatibility without configuration overhead
  2. Single binary deployment with no external dependencies
  3. Hardware optimized performance with lower resource consumption
  4. Built-in disaster recovery and data durability
  5. Simple scaling and maintenance without complex configurations

Flink-SQL :: Flink SQL is a powerful query interface in Apache Flink that enables real-time stream processing and analytics using standard SQL syntax. It allows developers to write SQL queries that can analyze both streaming and batch data without changing the underlying code. What sets Flink SQL apart is its ability to handle continuous queries over streaming data, with support for event time processing, windowing operations, and complex event pattern matching.
Key features include:

  1. Real-time continuous querying
  2. Advanced window operations (sliding, tumbling, session)
  3. Stream-table joins and temporal table joins
  4. Pattern detection using MATCH_RECOGNIZE
  5. Built-in connectors for various data sources/sinks
  6. Dynamic table concepts for stream processing
  7. Support for user-defined functions (UDFs)
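
As a taste of how such a continuous query can be issued from Python, here is a minimal PyFlink sketch (the stock_prices table is defined later in this post; the window size and column names are illustrative):

# Sketch: a continuous one-minute tumbling-window aggregation via Flink SQL
from pyflink.table import EnvironmentSettings, TableEnvironment

t_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())

# Assumes `stock_prices` has been registered first (DDL shown later in this post)
t_env.execute_sql("""
    SELECT symbol,
           TUMBLE_START(event_time, INTERVAL '1' MINUTE) AS window_start,
           AVG(`close`) AS avg_close
    FROM stock_prices
    GROUP BY symbol, TUMBLE(event_time, INTERVAL '1' MINUTE)
""").print()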

Py-Flink :: PyFlink is Python's API for Apache Flink, offering a powerful stream processing framework with the accessibility of Python. It enables developers to write scalable stream processing applications using familiar Python syntax while leveraging Flink's robust distributed computing capabilities. PyFlink supports both the DataStream API for low-level stream processing and the Table API/SQL for declarative data analytics, making it particularly valuable for real-time data analysis and complex event processing.
Key features include:

  1. Native Python UDFs (User Defined Functions)
  2. Seamless integration with Python data science libraries
  3. Support for both batch and stream processing
  4. Real-time data analytics using SQL
  5. Stateful stream processing capabilities
  6. Event-time processing and windowing operations

The combination of Python's ease of use with Flink's performance makes PyFlink an excellent choice for building real-time data pipelines and streaming analytics applications.

Goal

  • Set up paper trading using Alpaca
  • Ingest data into Redpanda using the Kafka API
  • Stream, transform, and aggregate using Flink SQL
  • Generate trade signals and deliver them to Slack using Flink source APIs
  • Backtest algorithmic trading strategies

Let Us Begin

>> Paper-Trading With Alpaca

Sign Up for Trading API

[Image: Alpaca login page]

Copy The API Keys

[Image: Alpaca API keys]

Configure the keys in your Python Project

pip install alpaca_trade_api

config = {
    'key_id': ' API KEY ',
    'secret_key': ' SECRET KEY ',
    'redpanda_brokers': 'localhost:9092,localhost:9093',
    'base_url': 'https://data.alpaca.markets/v1beta1/',
    'trade_api_base_url': 'https://paper-api.alpaca.markets/v2',
    'slack_token': '',
    'slack_channel_id': ''
} 

GitHub :: https://github.com/snepar/flink-algo-trading/blob/main/alpaca_config/keys.py

>> Infrastructure With Docker

Configure docker-compose.yml and Dockerfile-sql:

services:
  redpanda-1:
...
  redpanda-console:
    ports:
      - 8080:8080
# Flink cluster
  jobmanager:
    container_name: jobmanager
    build:
      context: .
      dockerfile: Dockerfile-sql
    ports:
      - 8081:8081
  sql-client:
    container_name: sql-client
    build:
      context: .
      dockerfile: Dockerfile-sql
    command:
      - /opt/flink/bin/sql-client.sh
...


Execute Command docker compose up -d --build

Check the Status of Redpanda
http://localhost:8080/overview

[Image: Redpanda console overview]

Check the Status of Flink SQL
http://localhost:8081/#/overview

[Image: Flink dashboard overview]

Test Flink-SQL Client
Execute Command docker compose run sql-client
Execute Test Query Flink SQL> select 'hello world';

[Image: Flink SQL 'hello world' result]

GitHub :: https://github.com/snepar/flink-algo-trading/blob/main/docker-compose.yml

>> The Redpanda Producer With Sentiment Analysis from Past News Headlines

Install NLTK:
pip install nltk
Then download the VADER lexicon:
python -m nltk.downloader vader_lexicon
If the download fails with an SSL verification error, bypass it and run the downloader:

import nltk
import ssl

try:
    _create_unverified_https_context = ssl._create_unverified_context
except AttributeError:
    pass
else:
    ssl._create_default_https_context = _create_unverified_https_context

nltk.download()

Select The Appropriate Model from the list

[Image: NLTK downloader showing vader_lexicon]

Define the sentiment analyser function

from nltk.sentiment.vader import SentimentIntensityAnalyzer as SIA

sia = SIA()


def get_sentiment(text):
    scores = sia.polarity_scores(text)
    return scores['compound']

Install the Kafka Client Libraries to Produce Data into Redpanda

pip install kafka-python requests pandas

import json
from typing import List

from kafka import KafkaProducer


def get_producer(brokers: List[str]):
    producer = KafkaProducer(
        bootstrap_servers=brokers,
        key_serializer=str.encode,
        value_serializer=lambda v: json.dumps(v).encode('utf-8')
    )
    return producer
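
As a quick sanity check, the producer can publish a single hand-rolled record (the topic and payload here are illustrative):

# Sketch: publish one test record through the producer defined above
producer = get_producer(['localhost:9092', 'localhost:9093'])
producer.send(
    'market-news',
    key='AAPL',
    value={'headline': 'Apple releases a new iPhone', 'sentiment': 0.42}
)
producer.flush()  # block until the broker acknowledges the record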

Define a producer routine that fetches historical news from Alpaca for a given date range and publishes it to a Redpanda topic:

from alpaca_trade_api.rest import REST, URL

# `config` comes from alpaca_config/keys.py shown earlier

def produce_historical_news(
        redpanda_client: KafkaProducer,
        start_date: str,
        end_date: str,
        symbols: List[str],
        topic: str
    ):
    key_id = config['key_id']
    secret_key = config['secret_key']
    base_url = config['base_url']

    api = REST(key_id=key_id, secret_key=secret_key, base_url=URL(base_url)) ... 

GitHub :: https://github.com/snepar/flink-algo-trading/blob/main/news-producer.py

Invoke the sentiment function to calculate a sentiment score for each news headline:

article = row._raw
should_proceed = any(term in article['headline'] for term in symbols)
if not should_proceed:
    continue

article['sentiment'] = get_sentiment(article['headline'])

Publish to the Redpanda topic market-news for Apple (AAPL):

produce_historical_news(
        get_producer(config['redpanda_brokers']),
        topic='market-news',
        start_date='2024-01-01',
        end_date='2024-12-08',
        symbols=['AAPL', 'Apple']
    )

Check the Topic market-news in the Redpanda UI at http://localhost:8080/topics/market-news

[Image: market-news topic in the Redpanda console]


>> The Redpanda Producer for Historical and Real-Time Stock Price Changes

Use the Alpaca StockBarsRequest API

from datetime import datetime

# Assumes the alpaca-py SDK; `api` is its historical-data client and `symbol` is e.g. 'AAPL'
from alpaca.data.requests import StockBarsRequest
from alpaca.data.timeframe import TimeFrame

start_date = datetime.strptime(start_date, '%Y-%m-%d')
end_date = datetime.strptime(end_date, '%Y-%m-%d')
granularity = TimeFrame.Minute

request_params = StockBarsRequest(
        symbol_or_symbols=symbol,
        timeframe=granularity,
        start=start_date,
        end=end_date)

prices_df = api.get_stock_bars(request_params).df

GitHub :: https://github.com/snepar/flink-algo-trading/blob/main/prices-producer.py

The response contains the following fields, including the volume-weighted average price (VWAP):

{
  "symbol": "AAPL",
  "timestamp": 1707246180000,
  "open": 188.7199,
  "high": 188.74,
  "low": 188.585,
  "close": 188.64,
  "volume": 108418,
  "trade_count": 1313,
  "vwap": 188.660311,
  "provider": "alpaca"
}
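
From here, each bar can be flattened into a JSON record like the one above and streamed into Redpanda, mirroring the news producer. A hedged sketch (alpaca-py's .df typically carries a (symbol, timestamp) MultiIndex, hence the reset_index):

# Sketch: publish each price bar to the stock-prices topic, keyed by symbol
producer = get_producer(config['redpanda_brokers'].split(','))
for _, row in prices_df.reset_index().iterrows():
    record = {
        'symbol': row['symbol'],
        'timestamp': int(row['timestamp'].timestamp() * 1000),  # epoch millis
        'open': row['open'], 'high': row['high'],
        'low': row['low'], 'close': row['close'],
        'volume': int(row['volume']),
        'trade_count': row['trade_count'],
        'vwap': row['vwap'],
        'provider': 'alpaca',
    }
    producer.send('stock-prices', key=record['symbol'], value=record)
producer.flush()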

Check the Topic stock-prices in the Redpanda UI at http://localhost:8080/topics/stock-prices for key AAPL

[Image: stock-prices topic in the Redpanda console]

>> Flink SQL Table Creation on Redpanda Topics

Market News Table
This table captures financial news data from Kafka with sentiment analysis:

  • Stores news articles with metadata (author, headline, source)
  • Includes sentiment scores for each news item
  • Uses event time processing with 5-second watermark
  • Connects to Redpanda topic market-news for data streaming
CREATE OR REPLACE TABLE market_news (
    id BIGINT,
    author VARCHAR,
    headline VARCHAR,
    source VARCHAR,
    summary VARCHAR,
    data_provider VARCHAR,
    `url` VARCHAR,
    symbol VARCHAR,
    sentiment DOUBLE, -- compound score in [-1, 1]; an unscaled DECIMAL would truncate it
    timestamp_ms BIGINT,
    event_time AS TO_TIMESTAMP_LTZ(timestamp_ms, 3),
    WATERMARK FOR event_time AS event_time - INTERVAL '5' SECOND
) WITH (
    'connector' = 'kafka',
    'topic' = 'market-news',
    'properties.bootstrap.servers' = 'redpanda-1:29092,redpanda-2:29092',
    'properties.group.id' = 'test-group',
    'properties.auto.offset.reset' = 'earliest',
    'format' = 'json'
);

Stock Prices Table
Captures real-time stock price data

  • Stores OHLCV (Open, High, Low, Close, Volume) data
  • Includes additional metrics like VWAP and trade count
  • Uses event time processing with 5-second watermark
  • Streams data from a separate Redpanda topic stock-prices
CREATE OR REPLACE TABLE stock_prices (
    symbol VARCHAR,
    `open` FLOAT,
    high FLOAT,
    low FLOAT,
    `close` FLOAT,
    volume DECIMAL,
    trade_count FLOAT,
    vwap DOUBLE, -- fractional price; an unscaled DECIMAL would truncate it
    provider VARCHAR,
    `timestamp` BIGINT,
    event_time AS TO_TIMESTAMP_LTZ(`timestamp`, 3),
    WATERMARK FOR event_time AS event_time - INTERVAL '5' SECOND
) WITH (
    'connector' = 'kafka',
    'topic' = 'stock-prices',
    'properties.bootstrap.servers' = 'redpanda-1:29092,redpanda-2:29092',
    'properties.group.id' = 'test-group',
    'properties.auto.offset.reset' = 'earliest',
    'format' = 'json'
);

>> Real-Time Aggregation Using Flink-SQL and Simple Moving Average (Algorithm)

Moving Average Views (sma_20 and sma_50)

  • Creates 20-period and 50-period simple moving averages
  • Uses window functions for continuous calculation
  • Partitions by symbol for multiple stock analysis
  • Maintains temporal order using event_time
CREATE OR REPLACE VIEW sma_20 AS
SELECT symbol, `close`, event_time,
    AVG(`close`) OVER (PARTITION BY symbol ORDER BY event_time ROWS BETWEEN 19 PRECEDING AND CURRENT ROW) AS sma_20
FROM stock_prices;
CREATE OR REPLACE VIEW sma_50 AS
SELECT
    symbol,
    `close`,
    event_time,
    AVG(`close`) OVER (PARTITION BY symbol ORDER BY event_time ROWS BETWEEN 49 PRECEDING AND CURRENT ROW) AS sma_50
FROM stock_prices;

Price with Moving Averages View

  • Combines both SMAs (20 and 50 period)
  • Joins the moving averages on symbol and event_time
  • Provides a consolidated view of price and technical indicators
CREATE OR REPLACE VIEW price_with_movavg AS
SELECT
    s20.symbol,
    s20.`close`,
    s20.event_time,
    s20.sma_20,
    s50.sma_50
FROM sma_20 s20
JOIN sma_50 s50
    ON s20.symbol = s50.symbol AND s20.event_time = s50.event_time;

News and Prices View

  • Correlates news events with price movements
  • Uses a 2-minute window (±1 minute) to match news with prices
  • Combines sentiment data with technical indicators
  • Enables analysis of news impact on price
CREATE OR REPLACE VIEW news_and_prices AS
SELECT
    n.symbol,
    n.headline,
    n.sentiment,
    p.`close`,
    p.sma_20,
    p.sma_50,
    n.event_time AS news_time,
    p.event_time AS price_time
FROM market_news n
JOIN price_with_movavg p
    ON n.symbol = p.symbol
    AND n.event_time BETWEEN p.event_time - INTERVAL '1' MINUTE AND p.event_time + INTERVAL '1' MINUTE;

Trading Signals View
Implements the trading strategy:

BUY Signal Conditions:

  • Positive sentiment (sentiment > 0)
  • Price crosses below SMA20 (current < SMA20 && previous >= SMA20)

SELL Signal Conditions:

  • Negative sentiment (sentiment < 0)
  • Price crosses above SMA20 (current > SMA20 && previous <= SMA20)

Uses LAG function for price crossover detection

CREATE OR REPLACE VIEW trading_signals AS
SELECT
    symbol,
    news_time,
    `close`,
    1 as quantity,
    CASE
        WHEN sentiment > 0 AND `close` < sma_20 AND lag(`close`, 1) OVER (PARTITION BY symbol ORDER BY news_time) >= sma_20 THEN 'BUY'
        WHEN sentiment < 0 AND `close` > sma_20 AND lag(`close`, 1) OVER (PARTITION BY symbol ORDER BY news_time) <= sma_20 THEN 'SELL'
        ELSE NULL
    END AS signal
FROM news_and_prices;
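
To make the crossover concrete, here is the CASE expression restated as a plain Python predicate with one worked example (illustrative only; the view above is what actually runs):

def classify(sentiment, close, prev_close, sma_20):
    # Mirrors the CASE expression: trade on an SMA-20 crossover
    if sentiment > 0 and close < sma_20 <= prev_close:
        return 'BUY'   # positive news, price just dipped below its average
    if sentiment < 0 and close > sma_20 >= prev_close:
        return 'SELL'  # negative news, price just popped above its average
    return None

# Positive headline, price crosses from 189.5 down through an SMA-20 of 188.5
assert classify(0.6, close=188.0, prev_close=189.5, sma_20=188.5) == 'BUY'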

>> Publish Trading Signals to a Topic in Redpanda using Flink-SQL

Create a topic trading-signals in Redpanda

[Image: creating the trading-signals topic in the Redpanda console]

Create a Table using Flink SQL

CREATE OR REPLACE TABLE trading_signals_sink (
    symbol STRING,
    signal_time TIMESTAMP_LTZ,
    signal STRING
) WITH (
    'connector' = 'kafka',
    'topic' = 'trading-signals',
    'properties.bootstrap.servers' = 'redpanda-1:29092,redpanda-2:29092',
    'properties.group.id' = 'test-group',
    'format' = 'json'
);

Start Publishing Trade Signals to the topic

INSERT INTO trading_signals_sink
SELECT symbol, news_time, signal
FROM trading_signals
WHERE signal IS NOT NULL;

Verify the Data from the Redpanda UI

[Image: trading-signals topic messages in the Redpanda console]

>> PyFlink to Consume Trade Signals

Install Apache Flink Library pip install apache-flink

  • Create the Flink Kafka consumer group
  • Add the relevant Java JARs
from pyflink.common import Types
from pyflink.common.serialization import SimpleStringSchema
from pyflink.datastream import StreamExecutionEnvironment
from pyflink.datastream.connectors.kafka import FlinkKafkaConsumer

env = StreamExecutionEnvironment.get_execution_environment()
env.set_parallelism(4)
env.add_jars('<<location to>>/flink-sql-connector-kafka-3.1.0-1.18.jar')

kafka_consumer_properties = {
    'bootstrap.servers': 'localhost:9092,localhost:9093',
    'group.id': 'news_trading_consumer_group',
    'auto.offset.reset': 'earliest'
}

kafka_consumer = FlinkKafkaConsumer(
    topics='trading-signals',
    deserialization_schema=SimpleStringSchema(),
    properties=kafka_consumer_properties
)

kafka_stream = env.add_source(kafka_consumer, type_info=Types.STRING())
  • Process The Event
message_dict = json.loads(message)
symbol = message_dict.get('symbol', 'N/A')
signal_time = message_dict.get('signal_time', 'N/A')
signal = message_dict.get('signal', 'N/A')

formatted_message = """
=============================
ALERT ⚠️ New Trading Signal!\n
Symbol: {symbol} \n
Signal: {signal} \n
Time: {signal_time}
=============================
""".format(
    symbol=symbol,
    signal=signal,
    signal_time=signal_time
)

GitHub :: https://github.com/snepar/flink-algo-trading/blob/main/signal_handler.py

>> Slack to Post Alerts

Configure a Slack channel with a bot that can publish messages (OAuth scope: chat:write)

[Image: Slack app settings]

Start Pushing Alert Messages to This Channel

import requests


def send_to_slack(message, token, channel_id):
    url = 'https://slack.com/api/chat.postMessage'
    headers = {
        'Content-Type': 'application/json',
        'Authorization': f'Bearer {token}'
    }

    data = {
        'channel': channel_id,
        'text': message
    }

    response = requests.post(url, headers=headers, json=data)

    # Slack returns HTTP 200 with ok=false on API-level errors, so check both
    if response.status_code != 200 or not response.json().get('ok'):
        raise ValueError(f'Failed to send message to Slack: {response.status_code}, response: {response.text}')

GitHub :: https://github.com/snepar/flink-algo-trading/blob/main/signal_handler.py

  • Alerts in Slack

[Image: trading signal alerts in Slack]

>> Place Order Based on Trade Signals

  • Configure the trade API to place orders with the Alpaca broker based on buy and sell signals
# api is an alpaca_trade_api REST client configured for the paper-trading endpoint
def place_order(symbol, qty, side, order_type, time_in_force):
    try:
        order = api.submit_order(
            symbol=symbol,
            qty=qty,
            side=side,
            type=order_type,
            time_in_force=time_in_force
        )
        print(f'Order submitted: {order}')
        return order
    except Exception as e:
        print(f'An error occurred while submitting the order: {e}')
        return None
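
Wiring it together: one plausible way to have each consumed signal drive both the Slack alert and the paper order (process_signal, the message text, and the 'market'/'gtc' parameters are illustrative choices, not from the repo):

import json

def process_signal(message):
    # Parse one trading-signals event and act on it (illustrative glue code)
    event = json.loads(message)
    symbol, signal = event.get('symbol'), event.get('signal')

    # Alert the Slack channel first...
    send_to_slack(f'{signal} signal for {symbol}',
                  config['slack_token'], config['slack_channel_id'])

    # ...then place the paper order
    side = 'buy' if signal == 'BUY' else 'sell'
    place_order(symbol, qty=1, side=side,
                order_type='market', time_in_force='gtc')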

GitHub :: https://github.com/snepar/flink-algo-trading/blob/main/signal_handler.py

  • Console Log

[Image: order submission console log]

  • Alpaca Dashboard

[Image: Alpaca dashboard (1)]

[Image: Alpaca dashboard (2)]

>> Backtesting the Algorithmic Trading Strategies

All Weather Strategy :: This implements Ray Dalio's All Weather strategy, which aims to perform well in any economic environment by balancing growth assets with protection against different economic scenarios (inflation, deflation, growth, recession).

GitHub :: https://github.com/snepar/flink-algo-trading/blob/main/strategies/AllWeatherStrategy.py

import backtrader as bt

class AllWeatherStrategy(bt.Strategy):
    def __init__(self):
        self.year_last_rebalanced = -1
        self.weights = {'VTI': 0.30, 'TLT': 0.40, 'IEF': 0.15, 'GLD': 0.075, 'DBC': 0.075}

    def next(self):
        # Rebalance at most once per calendar year
        if self.datetime.date().year == self.year_last_rebalanced:
            return

        self.year_last_rebalanced = self.datetime.date().year

        for d in self.datas:
            symbol = d._name
            self.order_target_percent(d, target=self.weights[symbol])

This code defines a trading strategy class AllWeatherStrategy that inherits from Backtrader's Strategy class. The strategy implements annual portfolio rebalancing with predefined asset allocations.

In the __init__ method:

  • self.year_last_rebalanced = -1 tracks the year of the last rebalance
  • self.weights defines the asset allocation:

  1. 30% VTI (Vanguard Total Stock Market ETF) - growth
  2. 40% TLT (long-term Treasury bonds) - deflation protection
  3. 15% IEF (intermediate Treasury bonds) - income
  4. 7.5% GLD (gold) - inflation protection
  5. 7.5% DBC (commodity index) - inflation protection

The next method executes the rebalancing logic. If we are in a new year compared to the last rebalance, it:

  1. Updates the last rebalanced year
  2. Iterates through each asset in the portfolio
  3. Uses order_target_percent to adjust each position to its target weight

Golden Cross Strategy ::

GitHub :: https://github.com/snepar/flink-algo-trading/blob/main/strategies/GoldenCrossStrategy.py

import backtrader as bt

class GoldenCrossStrategy(bt.Strategy):
    params = (
        ('short_window', 50),
        ('long_window', 200)
    )

    def __init__(self):
        self.short_ema = bt.indicators.EMA(self.datas[0].close, period=self.params.short_window)
        self.long_ema = bt.indicators.EMA(self.datas[0].close, period=self.params.long_window)
        self.crossover = bt.indicators.CrossOver(self.short_ema, self.long_ema)

    def next(self):
        if not self.position:
            if self.crossover > 0:
                self.buy()
        elif self.crossover < 0:
            self.close()

This implements the Golden Cross strategy, a popular technical-analysis method. The GoldenCrossStrategy class defines a trend-following strategy based on exponential moving average (EMA) crossovers:

  • Strategy Parameters (params):
  1. short_window = 50: 50-day EMA period
  2. long_window = 200: 200-day EMA period
  • In the __init__ method: Creates two EMAs using closing prices:
  1. short_ema: 50-day EMA (faster moving average)
  2. long_ema: 200-day EMA (slower moving average)

Creates a crossover indicator to detect when the EMAs cross

The next method contains the trading logic:

  • If no position is held (if not self.position), buy when the short EMA crosses above the long EMA (crossover > 0)
  • If a position is held, close it when the short EMA crosses below the long EMA (crossover < 0)

This strategy follows the principle that:

  • A "Golden Cross" (short EMA crossing above the long EMA) signals an uptrend and triggers a buy
  • A "Death Cross" (short EMA crossing below the long EMA) signals a downtrend and triggers a sell

MomentumStrategy Backtesting Implementation

Key features of this strategy:

  1. Uses momentum to identify strong upward price movements
  2. Uses EMA as a trailing stop mechanism
  3. Combines momentum and trend following concepts
  4. Momentum > 100 indicates price is moving up significantly
  5. Price below EMA suggests trend might be weakening

The strategy aims to:

  1. Catch strong upward price movements (momentum > 100)
  2. Stay in the trade while trend remains positive
  3. Exit when trend weakens (price < EMA)

GitHub :: https://github.com/snepar/flink-algo-trading/blob/main/strategies/MomentumStrategy.py

import backtrader as bt

class MomentumStrategy(bt.Strategy):
    params = (
        ('momentum_period', 12),
        ('exit_period', 26)
    )

    def __init__(self):
        self.momentum = bt.indicators.MomentumOscillator(
            self.datas[0].close, period=self.params.momentum_period)
        self.exit_signal = bt.indicators.EMA(
            self.datas[0].close, period=self.params.exit_period)

    def next(self):
        if not self.position:
            if self.momentum > 100:
                self.buy()
        elif self.datas[0].close[0] < self.exit_signal[0]:
            self.close()

Test Using the Backtester (Momentum Strategy)

  • The Backtrader backtesting script integrates with Alpaca's API
  • Creates Backtrader's main engine (Cerebro)
  • Sets initial cash amount
  • Adds the specified trading strategy
  • Runs the backtest
  • Prints initial and final portfolio values
  • Calculates percentage return
  • Reports Sharpe Ratio
  • Generates performance plots
def run_backtest(strategy, symbols, start, end, timeframe, cash):
    rest_api = REST(config['key_id'], config['secret_key'], base_url=config['trade_api_base_url'])

    #initialize backtrader broker
    cerebro = bt.Cerebro(stdstats=True)
    cerebro.broker.setcash(cash)

    # add strategy
    cerebro.addstrategy(strategy)

    # add analytics
    cerebro.addobserver(bt.observers.Value)
    cerebro.addobserver(bt.observers.BuySell)

    cerebro.addanalyzer(bt.analyzers.SharpeRatio, _name='mysharpre')
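
The excerpt stops after the analyzer setup; a plausible completion covering the remaining bullet points above (data-feed loading omitted, see the repo for the Alpaca fetch):

    # ... add one data feed per symbol here (see the repo) ...

    start_value = cerebro.broker.getvalue()
    print(f'Starting portfolio value: {start_value:.2f}')

    results = cerebro.run()

    end_value = cerebro.broker.getvalue()
    print(f'Final portfolio value: {end_value:.2f}')
    print(f'Return: {(end_value - start_value) / start_value * 100:.2f}%')
    print('Sharpe Ratio:', results[0].analyzers.mysharpre.get_analysis())

    cerebro.plot()  # performance plots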

GitHub :: https://github.com/snepar/flink-algo-trading/blob/main/backtester.py

[Image: Momentum strategy backtest plot]

>> Paper vs. Real Verification

Paper Trading

[Image: Alpaca paper trading view]

Real Trading

[Image: TradingView chart]

