* Fix MACD WarmUpPeriod and Updating
* Add System to use Math library
* Fix WarmUpPeriod math and add tests
* Fix SchaffTrendCycle Indicator WarmUpPeriod
* Ensure fastPeriod < slowPeriod
* Minor tweaks
- Remove unneeded Math.Max operation
- Remove unneeded changes in the solution file
Co-authored-by: Martin Molinero <martin.molinero1@gmail.com>
* Update RandomDataGeneratorProgram.cs
Replaced AddMonths(6) with a DateTime value halfway between settings.Start and settings.End.
* Update RandomDataGeneratorProgram.cs
* Update to bug-5030-CFDdatanotoworking
* Added midpoint unit test
* Minor test assert improvement
Co-authored-by: Martin Molinero <martin.molinero1@gmail.com>
* LeanDataReader future option data
- The LeanDataReader will now be able to read an entire zip of future and
option data; previously it just returned the first entry in the zip
file. Adding unit tests
* Fix added unit test data path
* Universe dispose and removal
- Fix bug where in some cases a disposed universe would not get removed
from the UniverseManager, preventing new universes using the same symbol
from being added correctly. Adding regression test
- Fix bug where in some cases selected symbols would not get removed
when the parent universe would get disposed of.
- Minor normalization on UniverseSelection call
* Address review
- Add status plots for the added regression test
* Adding regression algorithm reproducing issue 3914
* change filename template. add fillforward resolution suffix
* replicate GH issue 5116 on master
it's easy to reproduce on the FXCM market by using a data resolution different from the fill-forward resolution.
in these cases daily vs hour/minute were added
* change priorities of Time & EndTime values.
it's necessary to calculate EndTime properly, and then we can align Time to it.
the potential end time should also be calculated using the proper TZ. Because we store open hours in market-hours without the TZ offset applied, it's necessary to reapply the TZ
fix tests
* Miss 2AM bar on Sunday of DST; FF 2AM bar on Sunday ST
* use UTC TimeZone as baseline during comparison
* Refactors Market Fill Model
Create `GetBidPrice` and `GetAskPrice` methods to request the most suitable price to fill market orders.
Fix unit tests to show that new implementation prevents using non-market data to fill the order.
* Addresses Peer Review …
Modifies EquityTickQuoteAdjustedModeRegressionAlgorithm to reflect changes in the market fill method. The previous implementation only looked for the last tick when it should have looked for the last tick of quote type, so the current implementation is an improvement.
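The fill-price selection described here can be sketched hypothetically in Python (Lean's actual fill models are C#; the function and field names below are illustrative, not Lean's API):

```python
# Hypothetical sketch: pick the most suitable price for a market order.
# A buy fills at the ask and a sell at the bid; quote data is preferred
# over trade data so a stale trade price doesn't set the fill price.

def get_ask_price(quote_bar, trade_bar):
    """Prefer the latest quote's ask; fall back to the last trade."""
    if quote_bar is not None and quote_bar.get("ask") is not None:
        return quote_bar["ask"]
    return trade_bar["close"]

def get_bid_price(quote_bar, trade_bar):
    """Prefer the latest quote's bid; fall back to the last trade."""
    if quote_bar is not None and quote_bar.get("bid") is not None:
        return quote_bar["bid"]
    return trade_bar["close"]

def market_fill_price(direction, quote_bar, trade_bar):
    """Buy orders cross the spread at the ask, sells at the bid."""
    if direction == "buy":
        return get_ask_price(quote_bar, trade_bar)
    return get_bid_price(quote_bar, trade_bar)
```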
* Fixes for IB malformed symbols
- Add handling for IB future malformed symbols. Adding unit test
- Fix IB option malformed symbols which had more whitespaces than
expected. Adding unit test
* Address review
* Adds data parsing for malformed option contracts in IBBrokerage
* Some contracts come in malformed from IB when downloading our
account holdings. We attempt to detect whenever this happens with
our options Symbols and try to recover the contract.
* Addresses review and fixes bug
* Uses contract currency for the new contract created from the
malformed contract.
* Bug fix: Sets ibSymbol to new Symbol of the contract once the
malformed contract has been parsed.
* Adds support for Futures and FOPs
* Refactoring code and cleanup
* Addresses review: code cleanup and refactoring
* Addresses review: remove redundant logging of contract in log statement
* Silence DotNet in building stubs
* Reapply silence to stub publishing
* Have QuantBook store and restore initial LogHandler
* Add assembly action to maintain LogHandler
* Allow AlgorithmRunner to use ConsoleLogHandler
* Use MaintainLogHandlerAttribute to provide current test logger
* Refactor LogHandler creation
* The test didn't take into account that the contract expiring on
December would mean that there would temporarily be no Dec. FOP
contract. We fix this by looking for the Mar. contract instead
if the December contract has expired.
* added test covering minValue / maxValue issue with JsonRoundingConverter
* change namespaces
* JsonRoundingConverter fix decimal.Min and MaxValues (cannot deserialize)
* remove dependency on 3rd party library
* c# 6 compatible code (remove pattern matching)
* Fixes BacktestResultPacket deserializing
- Serializing decimals as strings to avoid precision loss, since json
convert will use floating point precision. Updating unit tests.
- Fix logging unit test failing to delete file being used.
Co-authored-by: Mark Virchenko <mark.virchenko@calienteam.com>
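The precision loss motivating the string serialization is easy to reproduce; a minimal Python illustration of the same idea (the packet itself is C# using Json.NET):

```python
import json
from decimal import Decimal

value = Decimal("0.12345678901234567890123456789")

# Round-tripping through a float loses digits beyond double precision...
as_float = json.loads(json.dumps(float(value)))
assert Decimal(str(as_float)) != value

# ...while serializing the decimal as a string preserves every digit.
as_string = json.loads(json.dumps(str(value)))
assert Decimal(as_string) == value
```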
* test
* wip
* Revert "Fix duplicated history entries when contains daylight saving time change (#4700)"
Use proper rounding down
* regression test
* remove unused parameters
* more tests
* fix name and comment
* improve regression test
* more tests: oanda market hours
* re-apply Exchange TZ to bar EndTime
* fix expected results
* we can't subtract a minute because it could harm algorithms on minute resolution; so we could use a tick?
* rename prop: conflict with QCAlgorithm.StartDate
* do not log messages to pass travis ci log limit
* assign loghandler in AlgorithmSetupHandler
* reference to PR for more description
* due to https://github.com/QuantConnect/Lean/pull/5039 we don't need to override it manually
* Adding Api.ReadBacktest optional getCharts
- Optionally allow users not to fetch backtest charts when reading a
backtest, since it can be slow
* Fix null reference for deleted/cancelled backtests
* Get Leaky bucket config settings once
* Reduce Travis setup verbosity
* Introduce ConsoleErrorLogHandler
* Change Console.WriteLine to Log statements
* Quiet wget
* Route build stubs stdout to null
* Fix Quantbook history test
* Silence stub packages directly
* Use parameterized log-handler for testing
* Rename AssemblyInitialize Setup
* Fix AlgorithmRunner file logging
* Drop all overriden LogHandlers in tests
* Change to OneTimeSetup to maintain LogHandlers
* Permit any ILogHandler to be defined in params
* Fix for AlgorithmRunner Handlers V2
This layer is designated for common settings that enforce your team's preferences for the current solution.
Since we don't have specific ReSharper settings we want to share within the team, we ignore this file
* Adds DeM indicator
* Added reference to param movingaverage type
* Fixed variable declarations
* Added nameless initialize
* Missing DeM "type" args added
* Missing DeM "type" args added
* refactor
* Undid _previousInput → protected
* Demarker symbol: DeM →DEM
* Symbol change: DeM → DEM
* Updated symbols
TestDivByZero originally had dem as cmf.
* Symbol: DeM →DEM
Co-authored-by: Alexandre Catarino <AlexCatarino@users.noreply.github.com>
- Adding optimization Id to backtests packets
- SeriesSampler will allow truncating the samples
- Removing OptimizationEstimate, simplifying getting the estimate and
runtime stats separately
* Fixes intraday delistings not occurring for Futures and FOPs
* Previously, we would wait until the next market open to
liquidate futures and futures options contracts. Since these
contracts cannot be traded at the next market open and require
intraday delisting, changes were made to liquidate at the first
available place where we know the market is open. This means
we now liquidate futures and FOPs intraday as a market order.
* Maintains backwards compatibility with equities and equity options
delisting behavior
* Addresses review: adds additional protections for ProcessDelistedSymbols
* We choose to adjust the delisting date to the next market open only
if the market is not open at the current time, otherwise the time
would have been adjusted to the market open of the next trading day
* Addresses review: reverts changes and fixes error message in regression algo
* Adds Futures Options History Research Support
* Address review: make canonical future throw when calling GetOptionHistory
* Improve error message, recommending users to use FutureChainProvider
* Adds the awesome oscillator.
* added missing type hint for AO
* cleaned initializations
* refactor in call for AO(fast,slow,type)
* added missing type parameter for AO
* Changes AO sub-indicators to public.
Co-authored-by: Alexandre Catarino <AlexCatarino@users.noreply.github.com>
* Fixed a Python runtime issue that occurred when trying to generate reports locally on OSX/mono; we assume the issue impacts all configurations.
* Move Python.Runtime config to common
* Remove duplicate files
* Update readme
* Typo
* Change destination in build directory
Co-authored-by: Charles Naccio <cnaccio@gmail.com>
* Added CMF indicator
CMF is a volume-weighted average of accumulation and distribution over a period.
* Added initializer for CMF
Registration for ChaikinMoneyFlow implemented.
* Added CMF tests.
* Added CMF tests.
* spy_cmf.txt changed to external indicator data.
* Implement suggestions of @AlexCatarino
* added sum terms as subindicators.
* added sum terms as subindicators.
* Removal of vestigial rolling window
* Minor nit changes
Co-authored-by: Martin Molinero <martin.molinero1@gmail.com>
* WIP Backtest adjustments
* Workaround awaiting changes
* Adjustments to API Changes
* Nit fix
* Custom Newtonsoft Deserializer for AlphaRuntimeStatistics
* Workaround Travis build error
* Drop AlphaRuntimeStatistics converter; use Decimal converter
* Use StringDecimalJsonConverter
* Undo set properties
* Add more members to de-serializer class
* Adds preliminary universe selection for Future Options
* Fixes scaling issues with Future Options
* Fixes scaling multiplying by 10000x instead of using _scaleFactor
* Fixes scaling for Tick
* Revert changes to Tick since it divides the scaling factor
* Changes stale method name to new method name after rebase
* Fixes selection bugs, adds new methods, and adds unit tests
* Fixes bug where Equity Symbol was created for an underlying
non-equity Symbol, resulting in equity data trying to be loaded
* Adds unit tests covering changes to Tick, QuoteBar, TradeBar and
LeanData
* Adds regression test for AddUniverseOption filter contract selection
for Future Options
* Addresses review - modifies the AddFutureOption signature
* Adds new AddUniverseOptions method overload
* Removes and adds a new unit test
* Misc. modifications to account for new changes
* Fixes bug where futures were loaded using default SID Date
* Refactors and removes unnecessary work
* Fixes regression algorithm, which previously made no trades
* Adds future option data
* Adds the corresponding underlying data, in this case, futures data
to enable usage of future options data
* Replaces data with new data (ES18Z20)
* Improves Future chain filtering and updates regression stats
* Add AddFutureOptionContract API
* Expands regression and unit tests to test in finer detail
* Adds Python regression algorithms for AddFutureOption[Contract] methods
* Adds new unit test for BacktestingOptionChainProvider
* Fixes bug with BacktestingOptionChainProvider where we
attempted to load the Trades option chain first, resulting
in breakage of backwards compatibility and limitation of the
option chain.
* Adds new regression algorithms (Py) to Algorithm.Python project
* Adds FutureOptionMarginBuyingPowerModel
* Modifies code paths used to select margin model
* Adds related unit tests for margin model
* Fixes issue with unit test and MHDB/SPDB lookup for Future Options
* Preliminary regression algorithm testing ITM call/put option buying
* Fixes bug where fee model used did not find non-US market
options fee model. We now use the futures fee model for future
options because IB charges the same commissions per contract
between futures and futures options
* Adds proper regression algorithm for ITM future options expiration
* Pushing broken algorithm for review
* Currently, algorithm does not fill forward, causing
a single future option to not get exercised when it is delisted.
* Adds FutureOptionPutITMExpiryRegressionAlgorithm
* Improves existing regression algorithm for call side
* Fixes bug in existing regression algorithm
* Adds AAPL daily data to advance enumerator for ^^^ fix
* Adds additional future option regression algorithms
* Adds Buy OTM expiration regression algorithms
* Adds Sell ITM/OTM expiration regression algorithms
* Adds missing Python regression algorithms
* Adds remaining Python regression algorithms and fixes issues
* Fixes naming issues and statistics
* Adds short option OTM regression algorithms (Py)
* Add license header and class comments to python algorithms
* Cleans up comments and docstrings
* Create Buy/Sell call intraday regression algo
* Redirects future options symbol properties to futures symbol properties
* Asserts exercise/assignment price and updates stats in regression algos
* Adds new unit test covering changes to SecurityService
* Adds comments and fixes failing test
* Partially fixes future option mis-calculated profit/loss
* Adjusts portfolio model to calculate FOP as a no upfront pay asset class
* Updates regression algorithm statistics
* Begin IB FOP support
* Initial support for FOP IB data streaming, live
* Adds additional functionality to LiveOptionChainProvider
- Allows querying CME API to retrieve option chains for CME products
- Ultimately, it's also the groundwork for the CME
LiveFutureChainProvider
* Edits IDataQueueUniverseProvider interface to provide greater
control to implementors of it
* Misc. bug fixes required to get FOP data streaming through IB
* Adds comments, adds missing rategate call, and cleans up code
* Force exchange for FOP and Futures when no exchange is provided
* Fixes bug with Portfolio modeling across all asset classes
* Adds LiveOptionChainProvider tests for Future Options
* IB brokerage option symbol bug fixes and improvements
* Fixes contract multiplier lookup bug
* Fixes issue where we attempted to subscribe to IB data feed with canonical security
* Adds ES MHDB entry
* Reverts portfolio modeling changes for Futures Options
* Since IB eats into our account's cash balance when
a new FOP contract is purchased, we must model this by applying funds
to our cash whenever a new purchase/sell occurs.
If we choose to model FOPs exactly as we do with futures, we
will end up with an invalid TotalPortfolioValue on algorithm
restart. For all intents and purposes, FOPs are modeled exactly
the same as equity options with respect to the portfolio.
* Adds comments clarifying portfolio modeling and clarifies
existing portfolio modeling comments with additional context.
* Fixes IB symbol lookup for future options
* Fixes LiveOptionChainProvider looping 5 times per option chain
request, even on success
* Sets OptionChainedUniverseSelectionModel to produce a canonical
future/future option/option Symbol to avoid creating two Symbols
* Adds GLOBEX future option symbol mapping from future -> fop
* Fixes LiveOptionChainProvider loading wrong contract option chains
* Fixes loading of futures options ZIP files when backtesting
* Adds a string -> decimal JSON converter
* Additional fixes/refactoring to the LiveOptionChainProvider
* Adds tests for changes to Symbol and LeanData
* Reverts changes to IB-symbol-map
* Fixes Value for mapped future options tickers
* Fixes Symbol test
* Changes path of future options to future's expiry date
* Extra changes made to remove scaling from writing CSV
* Added method to map from FOP Globex -> FUT Globex
* Fixes MOO and MOC orders for future options
* Note: this order type might not be supported by IB or CME.
* Bug fixes and updates unit tests
* Update regression tests and data format
* Rebase changes
* 1. Multiple bug fixes for LiveOptionChainProvider, reverts IQFeed changes
2. Address review (partial): Code reuse and cleanup
1.
* Modifies check in
`AddFutureOptionShort(Call|Put)ITMExpiryRegressionAlgorithm`
to ensure no buys have negative quantity
* Code reuse changes in IB brokerage
* Bug fix in IB brokerage where we assigned the FOP expiry
as the futures expiry (requires verification)
* Doc changes and adds missing summaries/license banners
* Disposes of HTTP client resources in LiveOptionChainProvider
* Renames classes and adds FutureOption folder in Common/Securities
2.
* We revert back to the quotes API for the option chain,
since the settlement API sometimes had missing strikes.
* Fixes future option expiry being set as future's expiry
in LiveOptionChainProvider
* Fixes bug where wrong option chain was selected because of bad
expiry lookup in the futures expiries returned from CME
* Fixes multiple looping bug in LiveOptionChainProvider
* Adds strike price scaling for LiveOptionChainProvider
* Reverts IQFeed changes and simplifies interface upgrade changes
Some additional challenges we'll have to solve as part of FOPs:
- The `OptionSymbol.IsStandard` method makes the assumption that
weekly contracts follow the pattern equities follow, which
does not apply to Futures Options
- The Subscription created in:
`OptionChainUniverseSubscriptionEnumeratorFactory`
...adds a Trade config. For illiquid contracts, this
will delay universe selection for the option symbol
until we get a trade. However, if we add a quote config,
the data would instead be loaded based on the first quote
we received from the brokerage.
But since we're currently using a trade config, illiquid
contracts won't start streaming data until they receive a trade.
NOTE: this commit is a WIP to addressing the reviews received in the PR,
but has been committed early for efficiency in the review process
* Fixes regression algorithms and misc. bugs
* Fixes map file lookup for non-equity options
* Adds extra assertion at end of algorithm to ensure no holdings are
left when the algorithm ends.
* Adds FutureOptionSymbol, allowing all contracts through as standard
* Changes SPDB to allow defaulting to underlying future symbol
properties if no entry is found for the given FOP
* Fixes calls to SPDB in SecurityService, IBBrokerage
* Reverts AAPL daily ZIP file to fix majority of regression algorithms
* Adds FOPs symbol properties
* Fixes existing symbol properties for a few futures
* Adds tests for changes to Symbol Properties Database
* Removes string SPDB lookup method
* Updates tests and misc callees of previous method
* Updates all regression tests to use data of already expired contracts
* Adds Futures Options Expiry Functions tests
* Adds required futures data for 2020-01-05
* Address review (partial): Expands test coverage and fixes tests
* Set option chain tests parallelism to fixture only
* Fixes broken test for contract month delta for FuturesOptionsExpiryFunctions
* Changes delisting date logic for Futures Options
* Address review: removes duplicate code, misc code fixes
* Bug fix in MarketHoursDatabase.GetDatabaseSymbolKey() where
we would use the underlying's Symbol for lookup in the MHDB
* Adds missing license banner
* Removes Futures Options entries from MHDB
* Adds new tests
* Adds SecurityType.FutureOption
* Converts any underlying comparisons and uses SecurityType directly
instead for FOP specific behavior
* Extra code modifications to accommodate the new SecurityType
* Addresses review: fixes order fee bug on exercise
* Additional bug fixes and adding of SecurityType.FutureOption
* Updates regression algorithms OrderListHash
* Fixes various bugs in IB live implementation
* Fixes bug setting the right contract expiration date for FOP
generated by LiveOptionChainProvider
* Adds new function to FuturesOptionsExpiryFunctions
* Clarifies parameter names better in some functions/methods
* Fixes bugs in IB brokerage for FOPs
* Address review - code cleanup and refactor
* Remove MappingEventProvider, SplitEventProvider, and
DividendEventProvider for Futures Options in
CorporateEventEnumeratorFactory
* Address review: Use MHDB key resolver in SPDB
* Makes regression tests pass and adds comment for expiry issue
* Fixes MHDB lookup on string symbol method
* Adds Futures Options greeks regression algorithm (C# only)
* Adds explanatory comment on MHDB FOP lookup
* Remove python from FutureOptionCallITMGreeksExpiryRegressionAlgorithm
* Bump time by one tick in adding subscription
* Adjust regressions
* Change removal to immediate
* Use series.AddPoint instead of directly adding it
* Adjust regression
* Add checks for fixed behavior in regressions
* Address review
* initial commit
* run parametrized algorithm with command line parameters
* skeleton: top level structure
* OptimizationNodePacket scheme
* pass parameters as HashSet
* run Lean and read results
* call method on optimization completion
* refactor public interfaces
- close ParameterSet collection; allow only get operations
- explicit method to start LeanOptimizer
* synchronize RunLean method; the result could come in before the backtest id is set in the collections
* another portion of refactoring and interface changes
* comments
* comments & tests for Extremum, Minimization and Maximization classes
* unify optimization parameter values (min, max, step) & more GridSearch tests
- swap min & max if necessary
- iterate left => right (negate step value if necessary) & provide default step value if step == 0
- no StackOverflowException
- parameterSet Id should be global for the current generator and retained between steps
- test single point boundary (min == max)
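The parameter-stepping rules above can be pinned down with a small sketch (hypothetical Python, not the optimizer's actual C# generator):

```python
def grid_values(minimum, maximum, step=1.0):
    """Enumerate grid-search values from minimum to maximum inclusive.

    Sketch of the rules above: swap min & max if they arrive reversed,
    take abs(step) so iteration always runs left to right (no infinite
    loop on a negative step), and fall back to a default step of 1
    when step == 0. min == max yields a single point.
    """
    if minimum > maximum:
        minimum, maximum = maximum, minimum  # swap min & max if necessary
    step = abs(step) or 1.0                  # default step; never negative
    values = []
    current = minimum
    while current <= maximum:
        values.append(current)
        current += step
    return values
```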
* BruteForceStrategy tests
* more comments
* Update Optimizer assembly information
- Update Optimizer projects assembly information to match behavior of
the other projects
* Tweaks
- Adding comments
- Replace OnComplete for Ended event
- Replace Abort for Dispose
- ConsoleLeanOptimizer will keep track of running processes
- Each backtest will store results in a separated directory, so they
don't fight for the log.txt file.
- Adding cmdline option for lean to close automatically
- Adding concurrent execution backtest limit
- Console optimizer will start Lean minimized
- Escape spaces in Json path
* remove parameter set generator abstraction layer
we don't need this flexibility now.
* refactor public methods; Step shouldn't be public
* constraints: wip
* define contract
* comparison operators and tests
* specify JsonProperty values
* Move SafeMultiply100 to extensions
* Throw exception on failed Optimizer.Start
* constraints: wip
* change finish & dispose process
* minor fixes
- handle force lean abort
- notify consumer if target has been reached
* target & constraints; adapt unit tests
* Minor Tweaks and fixes
- Some logging improvements
- Remove Public since not required
* Ignore empty ParameterValue
* simplify condition
* avoid reinitialization
* reduce type; force immutable
* unit tests for constraints and target value
* parse & normalize percent values, i.e. 20% => 0.2
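The percent normalization reads naturally as a tiny parser; a hedged sketch (hypothetical helper name, not the actual Optimizer code):

```python
def parse_target_value(text):
    """Parse a target/constraint value, normalizing percents: '20%' => 0.2."""
    text = text.strip()
    if text.endswith("%"):
        return float(text[:-1]) / 100.0
    return float(text)
```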
* fixup
* Target & Constraint & OptimizationNodePacket unit tests
* Add more json unit tests
- Adding more json conversion unit tests. Fix bug for Extremum which
wasn't using the converter.
* LeanOptimizer tests
* Estimation results
* Use thread-safe counters
* LeanOptimizer unit tests; push OptimizationResult on Ended event
* more unit tests
* Minor tweaks
- Estimate ToString in a single line.
- Typos and missing header file
* Add base SendUpdate method
- Add base SendUpdate method for LeanOptimizer
* fix LeanOptimizer test; rely on internal Update rather than timer
* Add OptimizationStatus
- Add missing comments and OptimizationStatus
* EulerSearch implementation: wip
* OptimizationParameter custom converter
* change the type
* make step optional
* change folder structure
* enumerate optimization parameter using IEnumerable & IEnumerator
* unit tests: parameters & objectives
* unit tests: strategies
* remove redundant TODO
* change Euler search boundaries
* more Euler tests
* prevent race condition
* Add account/read endpoint
- Adding account/read endpoint. Adding unit test
* Add status check before running lean
* Minor self review
- Adding missing comments, minor changes
* remove array parameters
* minor changes
- tidy up config file, rename variable
- accept min less or equal than max
* move OptimizationParameter methods to strategies
* Minor improvements for BaseResultHandler derivates
* minor changes
- strict requirements for Step and MinStep values
- strategy-specific settings
* Add TotalRuntime to estimate
Co-authored-by: Martin Molinero <martin.molinero1@gmail.com>
* Reformat/cleanup OptionStrategies
This file was breaking pretty much every style convention in LEAN.
There are other things that should be addressed in here that weren't,
such as passing non-argument names as argument names for ArgumentException,
as well as preferring constructors over property initializer syntax, but
such changes aren't being made to keep this commit strictly reformatting
instead of refactoring.
Added braces and reformatted long lines to make code more legible.
* Add abstract base class for OptionStrategy Option/UnderlyingLegData
This allows us to create either one and later use the Invoke method to push it
into the appropriate list on OptionStrategy.
* Replace O(n) option contract search with 2 O(1) TryGetValue calls
A better improvement would be resolving the correct symbol in the strategy, but
this immediate change is instead just focused on removing the O(n) search inside
a loop.
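The shape of that change, replacing a linear scan inside a loop with keyed lookups, can be sketched hypothetically in Python, where dict.get plays the role of TryGetValue (the contract fields below are illustrative):

```python
# Before: an O(n) scan over all contracts for each strategy leg.
def find_contract_linear(contracts, strike, expiry, right):
    for c in contracts:
        if c["strike"] == strike and c["expiry"] == expiry and c["right"] == right:
            return c
    return None

# After: index once up front, then each leg resolves with two O(1)
# lookups, one per option right (call/put).
def build_index(contracts):
    return {(c["strike"], c["expiry"], c["right"]): c for c in contracts}

def find_contracts_indexed(index, strike, expiry):
    call = index.get((strike, expiry, "call"))
    put = index.get((strike, expiry, "put"))
    return call, put
```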
* Add BinaryComparison and supporting methods in ExpressionBuilder
We're going to use these binary comparisons to make it possible to create
ad-hoc queries against a collection of symbols. Using these expressions,
along with types supporting composition of these expressions, we'll be able
to define predicates that can declaratively define how to match an option
strategy with an algorithm's current holdings.
* Make GetValueOrDefault defaultValue optional
Was receiving ambiguous invocations, leading to needing to invoke this
method explicitly (LinqExtensions.GetValueOrDefault) instead of being
able to use it as an extension method. Making the default value optional
seems to have resolved this ambiguity, leading to cleaner code in the
OptionPositionCollection (forthcoming)
* Add OptionPosition and OptionPositionCollection
OptionPositionCollection aims to provide a single coherent interface
for querying an algorithm's option contract positions and the underlying
equity's position in a performant, immutable way. The immutability of
the type is necessary for how the options matcher will operate. We need
to recursively evaluate potential matches, each step down the stack removing
positions from the collection consumed by each leg matched. This will enable
parallelism of the solution as well as simplifying the mental model for
understanding due to not needing to track mutations to the collection
instance.
* Add Option test class for easily creating option symbol objects
* Add OptionStrategyLegPredicate and OptionStrategyLegDefinition
The definition is a composition of predicates, and each predicate supports
matching against a set of pre-existing legs and a current position being
checked for the next leg (this leg). In addition to the matching functionality,
it also supports filtering the OptionPositionCollection, which is where much
of the work for resolving potential option strategies is done. By successively
filtering the OptionPositionCollection through repeated application of predicates,
we will end up with a small set of remaining positions that can be individually
evaluated for best margin impacts.
All of this effectively unrolls into a giant evaluation tree. Because of this
inherent structure, common in combinatorial optimization, the OptionPositionCollection
is an immutable type to support concurrent evaluations of different branches of
the tree. For large position collections this will dramatically improve strategy
resolution times. Finally, the interface between the predicate and the positions
collection is purposefully thin and provides a target for future optimizations.
* Add OptionStrategyDefinition and OptionStrategyDefinitions pre-defined definitions
The OptionStrategyDefinition is a definitional object providing a template and functions
used to match algorithm holdings (via OptionPositionCollection) to this definition. The
definition defines a particular way in which option positions can be combined in order to
achieve a more favorable margin requirement, thereby allowing the algorithm to hold more
positions than otherwise possible. This ties into the existing OptionStrategy classes and
the end result of the matching process will be OptionStrategy instances defining all
strategies matched according to the provided definitions.
* Add OptionStrategyMatcher and Options class, w/ supporting types
OptionStrategyMatcherOptions aims to provide some knobs and dials to control how
the matcher behaves, and more importantly, which positions get prioritized when
matching. Prioritization is controlled via two different enumerators, one controller
which definitions are matched first and the other controller which positions are
matched first. Still unimplemented, is computing multiple solutions and running the
provided objective function to determine the best match. When this gets implemented,
we'll also want to implement the timer. For anyone looking to implement these features,
please talk with Michael Handschuh as there's a particular way of representing these
types of combinatorial solutions (a 3D tree) that can be used as a variation of the
linear simplex method for optimizing combinatorial problems.
* OptionStrategyMatcher: Address PR review comments
* Ensure created OptionStrategy legs all have the same multiplier
Each leg definition match gets its own multiplier which indicates the
maximum number of times we matched that particular leg. When we finish
matching all legs, we pick the smallest multiplier from all the legs in
the definition and use that as the definition's multiplier. When we go
to create the OptionStrategy object we MUST make sure we're using the
multiplier from the definition and not from the individual legs.
This change fixes this issue and also provides a guard clause to ensure
that we're not trying to use a multiplier larger than what was matched.
* Add XML docs for OptionStrategyDefinitions from OptionStrategies
* Fixes inability to parse negative strike prices in SecurityIdentifier
* Adds new tests ensuring backwards compat and no throwing w/ negative
strike prices
* Changes strategy used to support negative strike prices
* We add support for negative strike prices at the cost of
reducing the maximum allowed precision for the strike price.
We encode a negative sign into the 20th bit of the strike price
and set our bounds for precision to a max (exclusive) of 475712.
This in turn is then used to form a negative strike when rebuilding
the SID.
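The scheme reads naturally as a sign-flag encoding. As an illustration only, using the bound from the description above but not the actual SecurityIdentifier bit layout, a Python sketch:

```python
# Illustrative constants: the bound follows the description above;
# the exact bit position in the real SID layout is an assumption here.
STRIKE_BOUND = 475712          # max (exclusive) encodable strike magnitude
SIGN_BIT = 1 << 19             # the "20th bit" carries the negative sign

def encode_strike(value):
    """Pack a possibly-negative integer strike into a non-negative integer.

    Positive strikes below the bound encode unchanged, which is how
    backwards compatibility is preserved.
    """
    if abs(value) >= STRIKE_BOUND:
        raise ValueError("strike exceeds encodable precision")
    encoded = abs(value)
    if value < 0:
        encoded |= SIGN_BIT    # flag negative strikes in the sign bit
    return encoded

def decode_strike(encoded):
    """Recover the signed strike when rebuilding the SID."""
    if encoded & SIGN_BIT:
        return -(encoded & ~SIGN_BIT)
    return encoded
```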
* Adds tests covering changes
* Address review: adds additional tests and refactors code
* Address self-review: remove unused import in SymbolTests
* Address review: adds additional test cases for OptionStyle and OptionRight
* These tests are to ensure that backwards compatibility is maintained
* Addresses review: Adds option chain <-> master SID hash test
* Refactors previous tests to reduce on code duplication
* Reduce test duplication
Co-authored-by: Martin Molinero <martin.molinero1@gmail.com>
* Fix typo in data dir volume mount permission.
* Change default notebook launch behaviour for Mac systems.
Co-authored-by: Athon Millane <athon@maxkelsen.com>
* Reenable extended market hours sampling
- Will re-enable extended market hours sampling using the last benchmark
and portfolio value when the user's exchanges were open.
* Add unit test minor tweak
- Improve shutdown latency.
- Setting portfolio initial value
- Adding unit test asserting sample behavior.
* Remove python stubs directory
- Removing the python stubs directory since after https://github.com/QuantConnect/Lean/pull/4899
it has been replaced by the python package `quantconnect-stubs`.
- Reverting IDE settings using the stubs folder PR https://github.com/QuantConnect/Lean/pull/4657
* Revert "Adds Python stubs location definition for PyCharm and Visual Studio Code (#4657)"
This reverts commit aded66ec5b.
* Address self-review: Provide list of imports and refactor python readme
Co-authored-by: Gerardo Salazar <gsalaz9800@gmail.com>
* Update crypto entries in SPBD
- Update values for GDAX and Bitfinex, with latest values from exchanges
- Update symbols with >3 letters
- [Fix] Remove same base-quote entries
* Update unit tests
Increment in precision given smaller lot sizes in SPDB crypto entries.
* Add GDAX symbol properties downloader (unit test)
* Update GDAX symbol properties database
* Add BrokerageSymbol to symbol properties database
* Update GDAX symbol properties
* Address review
- Rename BrokerageSymbol to MarketTicker
* Add GDAX symbol mapper
* Update GDAXBrokerage to use symbol mapper
* Fix GDAX brokerage unit tests
* Replace GDAXSymbolMapper with SymbolPropertiesDatabaseSymbolMapper
* Address review
- use Symbol key in dictionaries
* Rename BrokerageSymbol to MarketTicker
* Save GDAX history in the real ticker folder
* rename tickerMapper to symbolMapper
* fix gdaxdownloader help message
* use SymbolPropertiesDatabaseSymbolMapper
* address review
Co-authored-by: JJD <jjdambrosio@gmail.com>
Co-authored-by: Stefano Raggi <stefano.raggi67@gmail.com>
Co-authored-by: Martin-Molinero <martin@quantconnect.com>
* Update unit test
* Add Bitfinex symbol properties downloader (unit test)
* Add Bitfinex symbol market ticker to downloader
* Update Bitfinex symbol properties database
* Update BitfinexBrokerage to use SymbolPropertiesDatabaseSymbolMapper
* Remove Bitfinex test symbols from db
* Update Binance symbol properties database
* Update BinanceBrokerage to use SymbolPropertiesDatabaseSymbolMapper
* Add missing Binance test case
* Update symbol properties database
- gdax, bitfinex, binance
* Update CoinApi symbol mapper for new SPDB
* Exclude Bitfinex BCH pre-2018-fork in CoinApiSymbolMapper
* Remove unused properties
* Add CoinApi mappings
Co-authored-by: JJD <jjdambrosio@gmail.com>
* Fixes weeklies parsing, causing certain futures to be inaccessible in Algorithm
The FuturesExpiryFunction expects the contract month of the Future,
not the expiration. As a result, the contract gets filtered as a
weekly contract, rather than as a standard due to the discrepancy
between the expiry dates when the contract month differs from the
expiry date's month.
A very important fact to note is that futures can and do expire prior
to the contract month: BZ (Brent crude financial futures) expires two
months prior to the contract month, CL one month prior, etc.
An addition has been made that contains a "reverse" futures expiry function
lookup table. We use this to look up the contract month and re-calculate
the Future expiry.
This PR also fixes dairy and adds extra expiry dates. Dairy can have
an expiry *after* the contract month, so a new path was added to the
SymbolRepresentation to ensure that these contracts are loaded
correctly.
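The reverse lookup described above can be sketched as a per-ticker offset between expiry month and contract month (the tickers and offsets below are illustrative; the real table lives in the C# futures expiry functions):

```python
from datetime import date

# Hypothetical per-ticker table: number of months the expiry precedes the
# contract month (BZ settles two months early, CL one month early).
CONTRACT_MONTH_DELTA = {"BZ": 2, "CL": 1}

def contract_month(ticker: str, expiry: date) -> date:
    """Recover the contract month from a contract's expiry date."""
    delta = CONTRACT_MONTH_DELTA.get(ticker, 0)  # default: same month
    month = expiry.month + delta
    year = expiry.year + (month - 1) // 12       # carry into the next year
    month = (month - 1) % 12 + 1
    return date(year, month, 1)
```

With the contract month recovered, the forward expiry function can be re-applied and its result compared against the parsed expiry, so standard contracts are no longer misclassified as weeklies.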
* Address review: Adds tests and fixes bug in SymbolRepresentation
* Updates SID comment on `Date` property to reflect fact that we use
future expiry for its value
* Fixes bug in SymbolRepresentation where expiration day would always
be 01 when parsing a contract with an expiration after the contract
month
* Fixes bug in SymbolRepresentation where the expiration year would be
four digits long when parsing a contract with an expiration after
the contract month
* Fixes some bad dairy expiry dates
* Adds tests for SymbolRepresentation and the futures filtering for
standard contracts
* Renames method used to extract delta between contract month and
expiry date
* Removes GH comment and restores Futures contract month expiry param
* Limit rounding interval to 1 day, add SubtractRoundDown and AddRoundUp functions
* fix error message
* Only subtract round down if period is greater than a day
* Tests
* Comment clarification
* Pivot solution to simple fix
* fix error message
* Remove RoundDown/Up limitations; add remarks
* address review
* Fixes 4815 by loading the requested assembly from different folder.
# Conflicts:
# ToolBox/Program.cs
* Upgrades System.Collections.Immutable to Version=1.2.5.0
* Creates a prototype for SSE streaming in IEXDataQueueHandler.
* Revert the changes in Tick.cs
* Implements a logic in IEXDataQueueHandler that updates the data-feed subscription after Subscribe/Unsubscribe
* Implements IEXCouldSubscribeMoreThan100Symbols - which fails and other small fixes.
* Implements DoForEach LinqExtensions
* Implements IEXEventSourceCollection that wraps all logic that is SSE-subscriptions and symbol-limits-per-connection concerned.
* Changes:
1) Fixes to address review.
2) Makes IexMarketPercent in QuoteSSE nullable, as null values are assigned to this field in data objects received before the trading session starts.
3) Deprecates helper Subscribe/Unsubscribe in IEXDataQueueHandler and IEXCouldSubscribe test.
* Fixes:
1) _refreshEvent.Reset() order was not correct - should be called before UpdateSubscription
2) ProcessJsonObject- leaves only the functionality to emit ticks.
3) IEXEventSourceCollection - replaces int counter with CountdownEvent to improve the logic - in particular, need a mechanism that would not allow the repeated call to continue until the first one is completed
* Refines the logic with parsing a data snapshot.
* Fixes few more bugs:
1) Logic in ProcessJsonObject
2) Logic in UpdateSubscription - need to introduce additional ManualResetEvent to implement the intended logic - otherwise the logic is not suitable for general case
* Introduce rate-gate limit in IEXEventSourceCollection:
when subscribing to a large batch of symbols (more than 200, for example)
the rate gate policy may be violated. The API docs describe this under "Request Limits":
"IEX Cloud only applies request limits per IP address to ensure system stability.
We limit requests to 100 per second per IP measured in milliseconds, so no more than 1 request per 10 milliseconds.
SSE endpoints are limited to 50 symbols per connection. You can make multiple connections if you need to consume more than 50 symbols."
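A rate gate enforcing the "1 request per 10 ms" limit quoted above can be sketched as follows (a simplified Python analogue; the Lean codebase has its own C# RateGate class):

```python
import time

class RateGate:
    """Allow at most one occurrence per `interval` seconds by sleeping
    until the next allowed slot (a minimal single-threaded sketch)."""

    def __init__(self, interval: float):
        self.interval = interval
        self._next_allowed = 0.0

    def wait_to_proceed(self) -> None:
        now = time.monotonic()
        if now < self._next_allowed:
            # block until the previous slot's cooldown has elapsed
            time.sleep(self._next_allowed - now)
        self._next_allowed = max(now, self._next_allowed) + self.interval
```

Each new SSE connection request would call `wait_to_proceed()` first, e.g. `RateGate(0.010)` for the 10 ms spacing, so creating many 50-symbol connections never bursts past the per-IP limit.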
* Few additional fixes done after real time testing
* Adds xml-docs in stream response object + renaming a file.
* Fixes:
1) Additional StreamResponseStocksUS parsing issues, that can happen outside of regular exchange hours.
2) Cancel clientUpdateThread by means of CancellationTokenSource
3) Replace BuildSymbolsQuery by string.Join
* Fixes:
1) Changing Log Trace -> Debug
2) Adds ConfigureAwait(false) to async method call
3) Removes direct reference to System.Net.Http
* Removes a task and manual reset event in IEXEventSourceCollection
* Additions:
1) IEXEventSourceCollectionSubscribes test
2) GetSnpStocksArray() helper method
3) Installs packages in QC.tests : HtmlAgilityPack & LaunchDarkly.EventSource
* IEX history provider fixes:
1) Tiny bug in ProcessJsonObject - use continue instead of return, as execution is inside a for-each block
2) Adds a period variable for the historical data retrieved
3) Changes the conditional check from if (date.Date < start.Date || date.Date > end.Date) to if (date < start || date > end) for more precise filtering.
* Changes:
1) Removes HtmlAgilityPack and SNP scraper
2) Uses hard coded symbols instead
* Bug fix:
- At certain hours (for example, before pre-market open or on holidays) IEX may send no data on subscription; when connecting during those hours the Message handler may never fire, so a counter signal is placed on client.Opened to be informed of a successful connection.
* Implements:
1) IEXEventSourceCollectionSubscriptionThoroughTest and MockedIEXEventSourceCollection
2) Makes changes to IEXEventSourceCollection accordingly to allow the thorough testing.
* Fixes formatting issue in StreamResponseStocksUS
* Small fix for the new tests:
- Change RemovedClientSymbols to keep the symbols arrays rather than the clients themselves, because the clients are disposed immediately afterwards
* Enables extended logging in Toolbox.
* Fixing IEX historical data fetcher bugs:
1) Bug in IEXDataDownloader.cs - HistoryRequest not precisely correct.
2) Enables day-by-day daily bar downloading in IEXDataQueueHandler.
Motivation: Suppose we need data for some interval in the past - from-date=20170915-00:00:00 --to-date=20171103-00:00:00.
With the current behavior IEX would have to download all the historical data from 20170915-00:00:00 up to the present day.
3) Extends SynchronizingHistoryProvider
* Enables async fashion historical data download
* More fixes to IEXCouldGetHistory test.
* Reverts day-by-day daily bar downloading and other fixes.
* Removes needless packages & references
* Fix package reference
* To address review
* Sort out zero-price ticks:
after testing on a real-time algo 30 minutes before market open - IEX may send updates for many securities with zero lastPrice and lastSize; such entries are now sorted out
* Workaround for missing QuoteTick timestamps:
Since we don't have a timestamp for quote tick updates (only for trades), we calculate the average delay between the trade tick's timestamp and local time and,
assuming the average delay is the same for quote updates, assign the local machine time adjusted by this average
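The averaging workaround above amounts to maintaining a running mean of observed trade-tick delays (a plain-Python sketch; the class name and float timestamps are illustrative):

```python
class QuoteTimestampEstimator:
    """Estimate quote-tick timestamps from the average trade-tick delay."""

    def __init__(self):
        self._count = 0
        self._avg_delay = 0.0

    def record_trade(self, trade_time: float, local_time: float) -> None:
        """Fold one observed (local receive time - trade timestamp) delay
        into an incremental running mean."""
        delay = local_time - trade_time
        self._count += 1
        self._avg_delay += (delay - self._avg_delay) / self._count

    def estimate_quote_time(self, local_time: float) -> float:
        """Stamp a quote update: local time shifted back by the mean delay."""
        return local_time - self._avg_delay
```

This keeps quote timestamps roughly consistent with trade timestamps without requiring the feed to supply them.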
* Simplifies the things.
* Changes:
1) Deprecates quote updates for IEX stocks.
2) Reduce the stream updates to reduce costs to ->
# Stock Quotes every 1 second (per symbol? )
# Can be up to 54,000 messages per symbol per day
https://iexcloud.io/docs/api/#how-messages-work
* Fixes:
1) IEXDataQueueHandler: give an error message on extended market hours or tick resolution subscription request. As they are not really well supported by IEX.
2) Few small fixes in IEXEventSourceCollection, including additional condition for when the subscription remains irrevocable.
* Add files via upload
* Delete QuiverHouseDataDownloader.cs
* Delete QuiverSenateDataDownloader.cs
* Delete QuiverPoliticalBetaDataDownloader.cs
* Delete QuiverHouse.cs
* Delete QuiverSenate.cs
* Delete QuiverPoliticalBeta.cs
* Delete QuiverDataAlgorithm.cs
* Addresses self review: Cleans up code and adds new unit tests
* Adds Quiver* C# files to project
* Adds new unit test for QuiverCongress
* Adds Python algorithm example
* Address self reviews
- Adding some missing xml docs
- Removing unrequired imports.
- Minor rename from Date to ReportDate
- Live trading will throw InvalidOperationException
* Fixes for example algorithms
Co-authored-by: Gerardo Salazar <gsalaz9800@gmail.com>
Co-authored-by: Martin Molinero <martin.molinero1@gmail.com>
* Add offset capabilities and related tests
* Add tests for Weekend Offsets on Symbol DateRules
* Unify Iterator Behavior
* Refactor and consolidate functions to reduce duplicate code
* Positive offset values only
* Address review
* Expand Month tests to include Forex and Crypto cases
* Ensure order of days in schedule
* Refactor and unify behavior
* More edge cases and tuning
* Address review + more tests
* Refactor and hotfix run script
Removed dependency on `realpath`, notably absent from macOS and some debian distros. Replaced with an equivalent bash function.
Cleaned up prompt response handling, replaced var tests with param expressions.
Removed potentially unneeded calls to sudo (again absent on some systems, and most users are part of the docker group anyway). Instead, test if it's needed and get preauth with sudo -v.
Poll for running and stopped Lean containers, and prompt before replacing them. Would fail prior.
Uppercased variables. Sorry.
* Revert docker output redirection to stderr
* Adjustments to maintain all functionality
* Mimic behavior across run scripts
* Update readme to reflect Docker script changes
* Ignore results storage
* Correct print statement
* Mirror changes on research docker scripts
* Make executable
* Handle already running container and sudo privs
* Fix for issues found in testing
* Address Review
* Small adjustments found in linux testing
* Doc improvement
* Add auto update option for Docker images
* Fix image var reference
Co-authored-by: Peter Kazazes <peter@peterk.co>
* DividendEventProvider distribution computation
- Update regression algorithm which was using a different reference
price when calculating the dividend
- Adjust the dividend event provider to compute the distribution using the factor
file reference price, if not 0. Adding unit tests
- For equities, only emit auxiliary data points for
TradeBar configurations, not for QuoteBars, nor internal ones.
* Address reviews
- Split and Dividend event provider will throw an exception when there
is no reference price available. Updating `wm` factor file which was
missing references price and regression algorithms using WM.
- Updating unit tests asserting new exception
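One common convention for deriving a distribution from consecutive price factors and the factor-file reference price is sketched below (this is an illustration of the general mechanism, not necessarily Lean's exact formula):

```python
def compute_distribution(prev_factor: float, next_factor: float,
                         reference_price: float) -> float:
    """Dividend per share implied by a factor change on the ex-date:
    with adjusted = raw * factor, paying D on close C gives
    next_factor = prev_factor * (C - D) / C, hence
    D = C * (1 - next_factor / prev_factor)."""
    if reference_price == 0:
        # the fix above: a missing reference price is now an error,
        # not a silently wrong distribution
        raise ValueError("factor file is missing the reference price")
    return reference_price * (1 - next_factor / prev_factor)
```

For example, a factor dropping from 1.0 to 0.98 at a $50 reference close implies a $1.00 per-share distribution.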
* Protobuf will use recyclable memory stream
- Serialization will reuse recyclable memory streams
- Remove serialization of exchange and sale condition for ticks.
Updating unit tests.
- There is no need to serialize BaseData.EndTime, covered by unit tests.
* Tick will keep a parsed sale condition property
* Readd tick exchange. Json ignore ParsedSaleCondition
* Implement future type filter
* Filter weeklys test
* Fix for test, contracts were being filtered out by new type filter
* Share core contract filtering logic in new base class
* Catch Future symbols we don't have Expiry functions for
* Expand tests for new filtering
* Address review
* Small doc change
* Compare Date component for expiry
* Clarifying comment
* Implements Equity Fill Model
This commit lays the groundwork for a new equity fill model; the `EquityFillModel` is initially just a copy of `FillModel`.
* Adds Summary to FillModelPythonWrapper.GetPricesInternal
Adds a summary to FillModelPythonWrapper.GetPricesInternal with remarks that it is a temporary method to help the refactoring of fill models.
* Change GetSubscriptionDataConfigs to return IEnumerable<SubscriptionDataConfig>
* Move localTime outside loop
* Remove legacy code
* Remove unnecessary OrderBy
* Reorder conditions to reduce number of times Contains() is called
* Revert "Remove unnecessary OrderBy"
This reverts commit 85383b062e.
* Revert "Change GetSubscriptionDataConfigs to return IEnumerable<SubscriptionDataConfig>"
This reverts commit cbd97c9f36.
* Add RequestId to request information logging
* Use unique request id across all request types (orders, subscriptions, data queries)
- Previously we had three separate counters for request types and this was causing request information messages to be overwritten (different request types with same ids)
* Update GetContractDetails to log all contracts found
* Store temp files in subdirectory
* Fix Dispose case for new temp dir
* Adjust tests for new temp dir
* Dispose unit tests
* Unit test for issue 4811
* Refactor for not using temp files
* Fix storage checks for saving data, plus tests
* Use Base64 for storing keys and decoding them; handles odd key strings
* Don't allow "?" in a key
* Address review
* Deleted test cases
* PersistData handle deletion of files
* Refactor GetFilePath to use Persist()
* Make PathForKey protected
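Mapping arbitrary object-store keys to safe file names via Base64 can be sketched like this (the function names echo the PathForKey idea above but are hypothetical; the real implementation is C#):

```python
import base64

def path_for_key(key: str) -> str:
    """Map an arbitrary key to a filesystem-safe name: the URL-safe Base64
    alphabet contains no '/' or odd characters, so any key round-trips."""
    if "?" in key:
        raise ValueError('keys may not contain "?"')
    return base64.urlsafe_b64encode(key.encode("utf-8")).decode("ascii")

def key_for_path(name: str) -> str:
    """Decode a stored file name back into the original key."""
    return base64.urlsafe_b64decode(name.encode("ascii")).decode("utf-8")
```

Keys containing slashes, spaces, or other characters that would break a naive key-as-filename scheme all encode to plain ASCII names and decode back losslessly.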
* Add decimal places as parameters to get dividends with arbitrary precision
Also, increase precision when generating strings from factor files rows.
* Add xml documentation entries for new optional arguments.
* Adds CustomBuyingPowerModelAlgorithm
This algorithms is an example on how to implement a custom buying power model.
In this particular case, it shows how to override `HasSufficientBuyingPowerForOrder` in order to place orders without sufficient buying power according to the default model.
* Upgrades CustomModelsAlgorithm to Include CustomBuyingPowerModel
The custom buying power model overrides `HasSufficientBuyingPowerForOrderResult` but it doesn't change the trades and, consequently, the regression statistics.
* Allow account currency to be overridden by the algorithm
* Fix failing unit test
* Address review
- Revert removal of call check in SecurityPortfolioManager.SetAccountCurrency
- Revert changes to unit tests
- IBrokerage.AccountBaseCurrency now defaults to null
- BrokerageSetupHandler will not change the algorithm's account currency if the brokerage returns null, allowing the algorithm to call SetAccountCurrency in Initialize
* Fail on restart investing after liquidation
I added a line so that the trailing high value is rebalanced and the investment process won't be stopped by the trailing high always exceeding the current value by the drawdown percent.
* Update MaximumDrawdownPercentPortfolio.py
* Fix for MaximumDrawdownPercentPortfolio
- Fix C# MaximumDrawdownPercentPortfolio to reset portfolio value after
liquidation. Only reset once we have actually adjusted some targets.
Updating regression algorithms.
Co-authored-by: Martin Molinero <martin.molinero1@gmail.com>
* Remove F# Project, Splits, and Dividends Tests
* Separate tests that require external accounts; read from config
* Removal of non supported "prices" endpoint test
* Removal of unsupported API functions
* Address review
* NOP GetLastPrice for removal of Prices endpoint
* Post rebase fix
* Rebase fix 2
* remove /r/n from eof for api tests
* Reflect similar refactors to NodeTests
* Fix for live algorithm API testing
* Address Review
* Add underlying holdings to regression result handler details log
When debugging option exercise/assignment issues it's useful to see the
underlying holdings at the time the option contract fill event is processed.
Also adds the full symbol string to the top of the order event section.
The Symbol.Value was being logged via OrderEvent.ToString(), but it wasn't
the full SecurityIdentifier - by including the full SID string it makes it
easier to correlate fills over symbol rename boundaries.
* Fix automatic option assignment from market simulation
During the recent OptionExerciseOrder.Quantity refactor, this case was missed.
Additionally, it was realized that there were no regression tests covering the
automatic assignment via the market conditions simulation. This change introduces
a regression algorithm that covers the automatic assignment of put/call options.
* Update BasicOptionAssignmentSimulation._rand to be non-static
If this value is static then we reuse the same Random instance for ALL regression
tests, thereby defeating the purpose of using a well known seed number. This means
we get different results based on the order execution of preceding algorithms.
By making this an instance variable each algorithm will start with the same seed
value, ensuring consistent runs between regression tests, either run as a suite or
running a single algorithm in isolation.
* Do not update price scale for fillforward data
- Do not update price scale for fill forward data. FillForward data
should keep using the prev scale for which it was created. Adding unit tests
- When cloning do not lose IsFillForward flag state, affects
QuoteBars/Ticks, does not affect TradeBars since they perform a memberwise clone.
Adding unit tests
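The scale-freezing rule above can be sketched with a plain-dict data point (hypothetical field names; the real fix lives in the C# data pipeline):

```python
def scaled_price(point: dict, latest_scale: float) -> float:
    """Fill-forward points keep the price scale they were created with;
    only real data points pick up a refreshed price factor."""
    if point["is_fill_forward"]:
        # frozen: use the scale captured when the point was created
        return point["raw_price"] * point["created_scale"]
    # real data: adopt (and remember) the latest scale
    point["created_scale"] = latest_scale
    return point["raw_price"] * latest_scale
```

A fill-forward point created under a 0.5 factor keeps reporting the 0.5-scaled price even after the factor refreshes, which is exactly the behavior the unit tests above assert.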
* Auxiliary data points shouldn't affect the applied price factor scale.
Although we can receive FillForward'ed data points, the corresponding
auxiliary points for them are not FillForward, so we meet the condition
and refresh the price factor. As a result, all further FF data points are scaled too.
* Regression algorithm to check that FillForward'ed data points arrived with last real price factor
* Add trade for regression algorithm
- Minor tweaks and adding trade for new regression algorithm.
- Updating AddOptionContractExpiresRegressionAlgorithm because it is
using the symbol for which new data was added.
Co-authored-by: Adalyat Nazirov <aenazirov@gmail.com>
* OptionChain and OptionContract improvements
- QCAlgorithm.AddUniverse will return the added Universe instance.
- Adding new OptionChainedUniverseSelectionModel which will monitor a Universe's changes
and will spawn new OptionChainUniverse instances from its selections. Adding
regression test Py/C#.
- Adding new OptionContractUniverse that will own option contracts and
their underlying symbol. Adding regression test
- Fix double notification for security changes, bug seen in updated
UniverseSelectionRegressionAlgorithm
- Remove UniverseSelection special handling for Option and Future chains
- Fix DataManager not removing SubscriptionDataConfigs for Subscriptions
which finished before being removed from the universe
- Refactor detection of user added Universe so that they do not get
removed after calling the UniverseSelectionModel
* Add check for option underlying price is set
* Address reviews
- Adding python regression algorithm for
`AddOptionContractFromUniverseRegressionAlgorithm`
and `AddOptionContractExpiresRegressionAlgorithm`
- Rename QCAlgorithm new api method to `AddChainedOptionUniverse`
* Fix universe refresh bug
- Fix bug where a universe selection refresh would cause option or
future chain universes from being removed. Adding regression algorithm
reproducing the issue.
* Rename new option universe Algorithm API method
- Rename the new option universe Algorithm API method from
AddChainedOptionUniverse to AddUniverseOptions
- Rebase and update regression test order hash because of
option expiration message changed
* Add property IBrokerage.AccountBaseCurrency
* Set AccountCurrency to brokerage AccountBaseCurrency
* Remove USD AccountCurrency check
* Fix Oanda account base currency
* Fix currency symbol in CashBook.ToString()
* Fix unit tests
* Address review
* Add DebugMessage when changing account currency
* Add debug message for brokerage account base currency
* Fix currency symbol in equity chart and runtime statistics
* Update unit tests
* Improve information tracked in regression's {algorithm}.{lang}.details.log
The details.log file aims at providing a diff-able document that quickly and
easily provides actionable information, since many regression algorithms use
the algorithm's debug/error messaging facilities to log various pieces of algo
state. This document also supports a configuration option, 'regression-high-fidelity-logging',
that logs EVERY piece of data, again with the aim of providing an easily diff-able
document to quickly highlight actionable information. I may have missed some key
pieces of information here, but now that the entire QC team knows about this regression
tool, if additional information is required then hopefully it's easy enough at this
point to extend the RegressionResultHandler to suit our needs.
The RegressionResultHandler was initially implemented to provide a concise log of
all orders. This was achieved by simply using the Order.ToString method. While
testing/investigating OptionExerciseOrder behavior, it became evident that more
information was required to properly identify the source of potential failures or
differences between previous regression test runs. This change adds logging for
almost every IResultHandler method and additionally attempts to capture the
actual portfolio impact of every OrderEvent. This is accomplished by logging
the portfolio's TotalPortfolioValue, Cash properties and the security's
SecurityHolding.Quantity property.
This change also standardizes the timestamps used to follow the ISO-8601 format.
When using the RegressionResultHandler, it is highly recommended to also disable
the 'forward-console-message' configuration option so that algorithm Debug/Error
message logging is done synchronously, ensuring correct ordering with respect to
log messages via Log.Debug/Trace/Error.
* Fix typo in options OrderTests test case name
* Update SymbolRepresentation.GenerationOptionTickerOSI to extension method
Far more convenient as an extension method
* Improve R# default code formatting rules
Many of these rule changes focus on improving the readability of code,
with a particular emphasis on multi-line constructs, chained method calls
and multi-line method invocations/declarations.
* Add braces, use string interpolation and limit long lines
* Refactor OptionExerciseOrder.Quantity to indicate change in #contracts
For all other order types, the Order.Quantity indicates the change in the algorithm's
holdings upon order execution for the order's symbol. For OptionExerciseOrder, this
convention was broken. It appears as though only exercise was initially implemented,
in which case only long positions were supported and a code comment indicated that
only positive values of quantity were acceptable, indicating the number of contracts
to exercise. At a later date, assignment simulation was added and utilized a negative
order quantity. This caused some major inconsistencies in how models view exercise
orders compared to all other order types. This change brings OptionExerciseOrder.Quantity
into alignment with the other order types by making it represent the change in holdings
quantity upon order execution.
This change was originally going to be much larger, but in order to minimize risks and to
make for an easier review experience, the additional changes will be committed separately
and pushed in their own PR. Some of the issues identified include:
* Manual Exercise (especially for OTM) is not covered
* Margin Calculations (in particular taking into account opposing contracts held)
* IBrokerage.OptionPositionAssigned is raised for exercise (later filtered by tx handler)
Fixes OptionPortfolioModelTests to use exercise model to properly model exercise of
non-account quote currency option contract.
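The unified sign convention described above can be sketched with a hypothetical helper (Python illustration; the real change is in the C# OptionExerciseOrder):

```python
def exercise_order_quantity(contracts_held: int,
                            contracts_to_exercise: int) -> int:
    """Quantity is the change in holdings upon execution, matching all
    other order types: exercising part of a long position removes
    contracts (negative), assignment on a short position adds them back
    (positive)."""
    if contracts_to_exercise <= 0 or contracts_to_exercise > abs(contracts_held):
        raise ValueError("invalid exercise quantity")
    # closing (part of) the position: sign opposite to the held position
    return -contracts_to_exercise if contracts_held > 0 else contracts_to_exercise
```

So exercising 4 of 10 long contracts yields a quantity of -4, and assignment on 3 of 5 short contracts yields +3, which is the alignment with other order types that this refactor establishes.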
* Include Order.Tag/OrderEvent.Message in their ToString, Fix default tag values
There were inconsistencies in what we were checking for. The order constructors
default the tag parameter to an empty string but Order.CreateOrder checks for
a null string. Additionally, the order constructors (limit,stopmarket,stoplimit)
would check for an empty string and if so, apply a default order tag.
This change cleans these checks up using string.IsNullOrEmpty and also removes the
check from Order.CreateOrder since we're passing the tag into the various order
constructors.
* Adds new unit tests covering changes and testing for old case
* JsonConvert.SerializeObject would convert a `null` value into a
literal string of "null" when writing to a file via the ToLine
abstract method. We opt for an empty string whenever the underlying
value is null so that the parsing works correctly later in the
data loading cycle.
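The null-vs-"null" distinction above is easy to demonstrate: JSON serializers render a missing value as the literal string "null", so line-writers need an explicit empty-string branch (a Python sketch of the fix; the original code uses JsonConvert.SerializeObject in C#):

```python
import json

def to_line_field(value) -> str:
    """Serialize one value for a CSV-style ToLine row: a missing value
    becomes an empty field instead of the literal string "null" that
    json.dumps (like JsonConvert.SerializeObject) would produce."""
    return "" if value is None else json.dumps(value)
```

The downstream parser then sees an empty field (easily treated as missing) rather than a bogus string value.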
* avoid preliminary typing of command line arguments
* unit tests: Config.Get can parse and cast values
* unit tests: parse command line args and return string values
* unit test: parameter attribute converter
* merge&parse unit test
- LocalObjectStore.Delete() will also delete the file from the local object
store path if present; this avoids the issue where restarting the
object store would reload the same deleted file. Adding unit test.
Issue https://github.com/QuantConnect/Lean/issues/4811
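The essence of the fix is that delete must hit both the in-memory map and the persisted file (a minimal sketch assuming a key-as-filename layout, which the real LocalObjectStore does not use verbatim):

```python
import os

def delete_from_store(store: dict, key: str, root: str) -> None:
    """Remove a key from the in-memory object store AND delete its
    persisted file, so a restart cannot reload the deleted entry."""
    store.pop(key, None)
    path = os.path.join(root, key)
    if os.path.isfile(path):
        os.remove(path)
```

Without the `os.remove` step, the next startup scan of the storage directory would resurrect the deleted key, which is the bug reported in the issue above.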
* Fixes issue where BidPrice/AskPrice were not adjusted for Quote Ticks
* Previously, ticks would have their prices (Tick.Value) adjusted whenever
TickType == TickType.Quote, but would not have their
BidPrice/AskPrice fields adjusted, thus potentially being off by a
large factor (e.g. 4x) from the actual Bid/Ask prices.
This commit applies the pricing scaling factor in a critical
path where Ticks are adjusted to their scaled price. This issue
only applied to Resolution.Tick && SecurityType.Equity data.
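A minimal Python sketch of the fix (the real code is the C# Tick scale extension; the dataclass and function names here are illustrative):

```python
from dataclasses import dataclass

@dataclass
class QuoteTick:          # simplified stand-in for Lean's Tick
    value: float          # mid/last price
    bid_price: float
    ask_price: float

def scale_quote_tick(tick, factor):
    """Apply the price scale factor to every price field, not only Value.

    The bug was scaling only tick.value for TickType.Quote, leaving
    bid/ask at their raw (unadjusted) prices.
    """
    return QuoteTick(
        value=tick.value * factor,
        bid_price=tick.bid_price * factor,
        ask_price=tick.ask_price * factor,
    )
```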
* Refactors Extensions Tick Scale extension method
* Adjusts unit test to dispose of resources and assert history count
* Replaces use of FileSystemDataFeed for NullDataFeed in Adjustment test
* Adds regression algorithm testing BidPrice & AskPrice adjustment
* Address review: remove SecurityType check on TickType.Trade adjustments
* Append the full stacktrace to the algorithm loading exception message.
* Remove exception message loader duplication
Co-authored-by: Martin Molinero <martin.molinero1@gmail.com>
- `HistoryRequestFactory` will not use extended market hours for hour
resolution when determining the start time from a quantity of bars.
Adding regression test
* Improves stability and fixes various bugs
* Adds unit tests covering changes
* Adds COVID-19 crisis plots
* Adjusts styling of crisis plots for more pleasant viewing
* Fixes bug where null BacktestResult caused application to crash
* Order JSON bug fixes and stability improvements
* MaxDrawdownReportElement now produces results for Live
* Replaced Estimated Capacity w/ Days Live
* Added Live marker to sharpe ratio
* Added support for MOO and MOC orders in PortfolioLooper
* Address review: adds new unit tests and cleans up code
* Bug fix: use LastFillTime instead of Order.Time for MOO and MOC
* Address review: Fixes tests and cleans up code
* Binance Brokerage skeleton
* Market hours
* Implement Symbol Mapper
- known symbols available on /api/v1/exchangeInfo
- fiat currencies are pegged
* Implement GetCashBalance
* Implement GetAccountHoldings
- there are no pre-existing currency swaps
- cash balances are pulled and stored in the cashbook
* Implement GetOpenOrders
* Manage orders: PlaceOrder
* Manage orders: UpdateOrder
Update operation is not supported
* Manage orders: CancelOrder
* Messaging: order book
* Messaging: trades
* Messaging: combine streams
- connect to fake /ws/open channel on init
- case by channel name, but not event type
* Messaging: order depth updates
- the ticker stream is not enough, as it pushes updates only once per second, which would make for a very incomplete data stream
- fetch ticker snapshot if lastUpdateId == 0
- follow Binance instructions for keeping local orderbook fresh
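A simplified sketch of Binance's documented snapshot-plus-diff procedure (the real brokerage code is C#; the `book` shape and function name here are hypothetical):

```python
def apply_depth_update(book, last_update_id, event):
    """Apply one Binance depth-diff event to a local order book snapshot.

    Simplified per Binance's instructions: ignore events whose final update
    id `u` is not newer than the snapshot's lastUpdateId; otherwise replace
    price levels, deleting a level when its quantity drops to zero.
    `book` is {"bids": {price: qty}, "asks": {price: qty}}.
    """
    if event["u"] <= last_update_id:
        return last_update_id  # stale event, skip
    for side in ("bids", "asks"):
        for price, qty in event[side]:
            if qty == 0:
                book[side].pop(price, None)
            else:
                book[side][price] = qty
    return event["u"]
```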
* Messaging: user data streaming
- Request userDataStream endpoint to get listenKey
- keep listenkey alive
- handle order close event
- handle order fill event
* DataDownloader: get history
- we can aggregate minute candles for higher resolutions
* fix data stream
* Tests: FeeModel tests
* Tests: base brokerage tests
* Tests: download history
* Tests: symbol mapper
* Support StopLimit and StopMarket orders
* StopMarket orders disabled
Take profit and Stop loss orders are not supported for any symbols (tested with BTCUSDT, ETHUSDT)
* Tests: StopLimit order
* Tests: crypto parsing
* Reissue user data listen key
* comment custom currency limitation
* rework websocket connections
* implement delayed subscription
* adapt ignore message
* add license banner
* use better suited exception type
* avoid message double parsing
* support custom fee values
* extract BinanceApiClient to manage the request/response between lean and binance
* use api events to invoke brokerage events
* do not allow terminating the session if it wasn't allocated.
* update binance exchange info
* tool to add or update binance exchange info
* ExchangeInfo basic test
* Rebase + Resharp
* Binance brokerage updates
- Fix sign bug in sell order fills
- Fix bug in GetHistory
- Remove duplicate symbol from symbol properties db
* Remove unused code
* Revert removal of account currency check
* Update symbols properties database
* Address review
* Address review
- Upgrade API endpoints from v1 to v3
- Updated sub/unsub for new subscription manager
- Subscribe best bid/ask quotes instead of full order book
- Added handling of websocket error messages
- Cleanup + refactor
* Update symbol properties database
* Remove list from symbol mapper
* Fix symbol mapper tests
* Address review
- Fix resubscribe after reconnect
- Fix quote tick edge case
* Fix EnsureCurrencyDataFeed for non-tradeable currencies
* Fix check in EnsureCurrencyDataFeed
* Reuse base class subscribe on reconnect
Co-authored-by: Adalyat Nazirov <aenazirov@gmail.com>
Co-authored-by: Martin-Molinero <martin@quantconnect.com>
* Add OrderRight.GetExerciseDirection(isShort) extension
Returns the OrderDirection resulting from exercise/assignment of a particular
option right
See: BUG #4731
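The direction logic can be sketched in Python (the actual extension is C# on OrderRight; string stand-ins are used here for the enums):

```python
def get_exercise_direction(right, is_short):
    """Direction of the resulting underlying trade when an option is
    exercised (long holder) or assigned (short writer).

    Exercising a call buys the underlying and exercising a put sells it;
    being assigned flips the direction.
    """
    buy = (right == "call") != is_short   # call+long or put+short => buy
    return "buy" if buy else "sell"
```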
* Fix option exercise/assignment order tags and order event messages
The algorithm manager was doing work to determine whether or not the option ended
in exercise or assignment at expiration. This decision should be left for the exercise
model to decide -- from the algorithm manager's perspective, all that matters is that
the option was expired. The DefaultExerciseModel was updated to properly track whether
the option expired with automatic assignment or exercise, depending on whether or
not we wrote or bought the option (held liability or right, respectively). Updated unit
tests to check for order event counts and order event messages for option exercise cases.
Fixes: #4731
* Fix typo in algorithm documentation
* Update regression tests order hash
Co-authored-by: Martin Molinero <martin.molinero1@gmail.com>
In OptionChainUniverseDataCollectionEnumerator.IsValid we check the underlying
for null and ensure that there are data points, but we forget to check the actual
collection object for null, which is its initial value. The collection remains
null in the case where no data is found.
* Convert a CoarseFundamental data point into a row
- Convert a CoarseFundamental data point into a row. Adding unit tests
* Remove fixed PriceFactor format
* Simplify BaseWebSocketsBrokerage reconnection
- Gdax does not emit time pulse so when no subscription was present the
DefaultConnectionHandler would trigger a reconnect. Replacing for
directly resubscribing on reconnect.
- TPV == 0 will send warning message instead of being an exception
* Fix unit test
* regression tests
* fix: apply the same time conversion to history request time as for data time
* ver2
* fixup
* unit tests
* do not need this conversion because RoundDownInTimeZone returns in proper TZ
* comment
* requested changes
* refactoring
* more refactoring
* fix existing test: should return Sunday if open
* more symbols
* fix existing tests: submit new btcusd data
* fix
* add Cfd symbol
* Fix regression *.{lang}.details.logs
Regression tests produce syslogs and a details.logs file. The details log file
provides a mechanism for logging data that passes through the result handler,
such as order event data. A change in the initialization order of components in
the engine caused the algorithm id to not be set yet. The base result handler's
AlgorithmId property is populated via the job and is available, so we use that
instance.
* Update RegressionResultHandler.cs
Co-authored-by: Martin-Molinero <martin@quantconnect.com>
* Add DataProviderEventArgs base class for IDataProviderEvents event args
This base class includes a Symbol property. This will empower event listeners
to make decisions based on which security (symbol) raised the event. The immediate
use case is preventing multiple numerical precision messages for the same security.
This pattern can equally be applied to other error messages that are raised each
time a security is added to a universe.
See: #BUG-4722
* Update ConcurrentSet.Add to use ISet<T>.Add returning bool
It's a very common pattern to use if (set.Add(item)), which is enabled
via bool ISet<T>.Add(item) but not via void ICollection<T>.Add(item).
This change simply changes the default Add implementation to use the ISet<T>
overload and relegates the ICollection<T>.Add implementation to being explicit.
See: #BUG-4722
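The pattern, sketched in Python (where set.add returns None, so the ISet<T>.Add(item) -> bool contract is emulated with a hypothetical helper):

```python
def add_if_new(seen, item):
    """Insert and learn, in one call, whether the item was new.

    Mirrors the C# `if (set.Add(item))` idiom used to remember which
    symbols we have already warned about.
    """
    if item in seen:
        return False
    seen.add(item)
    return True
```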
* Prevent multiple numerical precision messages for same symbol
If a security is continually added/removed from a universe, then the user will
see this message each time the security is added. This results in some spam.
This change simply remembers for which symbols we've notified the user about the
numerical precision issue.
Fixes: #BUG-4722
* Fix for BaseWebSocketsBrokerage reconnection
- `BaseWebsocketsBrokerage` will handle reconnection using the
existing `DefaultConnectionHandler` to avoid code duplication.
* Fix for Bitfinex failed Subscription calls
* Fix for bitfinex orderbook
* Fix unit test
* SafeDecimalCast Throws Exception For Non-Finite Numbers
* Fixes Arithmetic Overflow Exception in QCAlgorithm.Trading Methods
Replace the decimal cast with `SafeDecimalCast()`.
If the algorithm uses a non-finite number in QCAlgorithm trading methods, it will throw a user-friendly exception message.
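A minimal Python analog of the guard (Lean's SafeDecimalCast is a C# extension; the message text here is illustrative):

```python
import math
from decimal import Decimal

def safe_decimal_cast(value):
    """Reject NaN/Infinity before converting to Decimal, raising a readable
    error instead of an arithmetic overflow deep inside the trading methods."""
    if isinstance(value, float) and not math.isfinite(value):
        raise ValueError(
            f"Trading method received a non-finite number: {value!r}. "
            "Check the inputs used to compute the order quantity."
        )
    return Decimal(str(value))
```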
* Fixes KellyCriterionProbabilityValue Calculation
* Add unit tests
* Refactor Py and create C# function
* Update readme to include local
* Refactor solution; fix python cases
* Update tests
* Don't accept null selector for python; Create SelectedData class
* Fix Testing
* Pre review
* Fix tests for Travis
* Test fix V2
* Test fix V3
* Refactor quantbook and fix tests
* Sort list by date
* Move ConvertToSymbols to Python Util
* Address review
* Order dataframe columns by Security ID
* Address review V2
* header for PythonUtilTests
* Oanda default forex Market
- Use Oanda as the default forex Market since it has more pairs.
- Remove FXCM data, add Oanda equivalent data.
- Update unit and regression tests
* Address reviews
- Revert FXCM data removal
- Remove unrequired commented code
* Fix rebase
* Remove invalid symbols from symbol properties db
* Add SymbolPropertiesDatabase.GetSymbolPropertiesList
* Remove symbol list in BitfinexSymbolMapper
* In EnsureCurrencyDataFeed fetch symbols from symbol properties database
* Remove symbol list in OandaSymbolMapper
* Remove unused code
* Address review
- Remove StringComparer.OrdinalIgnoreCase usage
- Rename KnownSymbolStrings to KnownTickers
- Fix Slice.Get OpenInterest type. Adding unit test
- Fix for SecurityCache that wasn't storing OpenInterest types
- Updating regression tests to cover these usages
* Upgrade Bitfinex brokerage to API v2
* Fix rebase
* Address review
- use te instead of tu messages for trades
- add missing orderMap removals
- ClientOrderId is now time-based instead of a counter
- minor cleanup
* Trigger build
* Calculate both raw and adjusted prices for backtesting
* disable second price factoring
* move and reuse method
* test coverage for new methods
* reuse scaling method
* reuse subscriptionData.Create method
* removed unused code
* regression test
* switch to aapl
* fix regression test output
* more asserts
* fix comments - reduce shortcuts and abbreviations
* more comments
* merge parameters
* reduce number of getting price factors
* fix tests
* fix tests
* fix regression tests
* calculate TotalReturn on demand
* include TotalReturn calculations
* perf tuning
* more unit tests for SubscriptionData.Create
* simplify things - store and return only raw and precalculated data
* fix regression tests; change it back
* factor equals 1 for Raw data
* small changes
* follow code style
* implement backward compatibility
* GDAX Brokerage updates
- Replaced fill detection from trade stream with monitor task
- Order fees for fills are now the real fees paid (previously they were calculated by the brokerage model)
- All unit and integration tests are green
* Address review
- Remove unnecessary signals
- Add "gdax-fill-monitor-timeout" config setting
* Remove user channel
* Remove IDataQueueHandler from AlpacaBrokerage
- a new IDataQueueHandler implementation for Polygon.io will be added to the ToolBox
* Fix Alpaca Websocket connect not waiting for completion
* Add security type check in Alpaca history
* Fix merge
* Remove aggregator from AlpacaBrokerage
* Add set market price during extended market hours
- Set market prices during extended market hours for live trading.
Adding unit test
* Add assert on internal data count
* Support List and OptionFilterUniverse for Py filter
* Regression algorithm for testing
* Unit Tests
* Fix for process
* Tighten filters to reduce load on automated testing
* Address review v2
* Fix key not found exception at InternalSubsManager
- Fix key not found exception
- Fix backtesting chart sampling
* Add comment about PreviousUtcSampleTime
* Add internal subscription manager
- Add InternalSubscriptionManager that will handle internal
Subscription. Replaces the realtime updates
- Fix thread race condition in the TimeTriggeredUniverseSubscription, we
have one thread injecting data points, the main algorithm thread, and
the base exchange is pulling from it
- Fixes for FakeDataQueue
- Adding unit tests
* Address reviews and fixes
- Internal subscription will use extended market hours
- Only sample charts accordingly
- Get api-url once
- `UniverseSelection` and `Research` will always use
`AlgorithmHandler.DataProvider` instance instead of
`DefaultDataProvider`
- Remove Compression at test/app.config since the dll isn't required
- Add missing license header
* Add PolygonDataQueueHandler
* Add history provider and downloader for Polygon
* Add aggregator to PolygonDataQueueHandler
* Address review
- Removed duplication in message classes
- Added public Subscribe/Unsubscribe methods in PolygonWebSocketClientWrapper
- Added history requests for Forex and Crypto
* Address review
- Add security type and market arguments to downloader
- Fix time zone bug in downloader
* Remove unnecessary locks
* Add Polygon history for all resolutions
- Equity: trades and quotes
- Forex: quotes only
- Crypto: trades only
* DataConsolidator Wrapper for Python Consolidators
* Regression Unit Test
* Refactor Regression test
* Bad test fix
* pre review
* self review
* Add RegisterIndicator for Python Consolidator
* Python base class for consolidators
* Modify regression algo to register indicator
* unit test - attach event
* Test fix
* Fix test python imports
* Add license header file and null check
Co-authored-by: Martin Molinero <martin.molinero1@gmail.com>
* Refactors SecurityCache to Support Access to Tick of Different Type
Adds two read-only lists of BaseData to save the last list of Tick of TickType.Trade and TickType.Quote. Changes methods accordingly to get and set these lists.
* Addresses Peer Review
The assignment of `_lastData` is moved to keep the current behavior of not being set with fill forward data.
`Reset` and `ShareTypeCacheInstance` implement the new class objects.
Adds unit tests for `Reset` and `ShareTypeCacheInstance`.
* Only Cache Non Fill-Forward Data
Undo changes that made AddData cache non fill-forward data.
* Create EaseOfMovementValue.cs
Added ease of movement file
* Update EaseOfMovementValue.cs
Added calculation for EMV and return its value
* Update EaseOfMovementValue.cs
* Update EaseOfMovementValue.cs
* Update EaseOfMovementValue.cs
* Update EaseOfMovementValue.cs
Rearranged code and removed all IndicatorBases
* Update EaseOfMovementValue.cs
* Update EaseOfMovementValue.cs
Added Min and Max Indicator
* Added Tests and Compile
* Fixed Bugs and Removed Reset
* Added Current Value and reverted to Bar data
* Fixed test file and refined indicator file
* TradeBar to IBaseDataBar
* Bug fixes
* bug fix
* Switching to TradeBar and attempting to fix Volume bug
There are two bugs that I have been having trouble fixing. 1. Cannot implicitly convert decimal to int (simple fix but cannot find where bug is taking place)
2. IBaseDataBar does not contain a definition for Volume
* Update EaseOfMovementValueTests.cs
* bug fix
* added data
* updated assertion
* added reset
* Update EaseOfMovementValueTests.cs
* Update EaseOfMovementValue.cs
* Update spy_emv.txt
I had the wrong test data in. Was throwing failed test for many pull requests.
* Update EaseOfMovementValue.cs
* Update EaseOfMovementValue.cs
* Update EaseOfMovementValue.cs
* Update EaseOfMovementValue.cs
* Cleaned Data
* Bug fixes
Fixed zero division error. Used better Test Data.
* removed readonly from _previous...price
* Update EaseOfMovementValue.cs
* Update EaseOfMovementValue.cs
* Update EaseOfMovementValue.cs
* bug fix
* Test Bug Fix
* EMV data from online
* Cosmetics
* Out of bounds fix
* Update EaseOfMovementValueTests.cs
* Update spy_emv.txt
* Update spy_emv.txt
* Added changes requested
Placed constructor first, fixed nullable type if statement, made 10,000 the default argument, added SMA.
* Update EaseOfMovementValue.cs
added variables
* Update EaseOfMovementValue.cs
* Update EaseOfMovementValue.cs
* Update EaseOfMovementValue.cs
* Fixed bugs
* Changed Delta, Added Assert
Create Indicator -> Update EMV -> Assert Status. Also changed delta from 1 to 0.000001 to improve test accuracy.
* Added unit test testing the SMA
* Add empty cashbook check in BrokerageSetupHandler
* Update check for zero TotalPortfolioValue
* Move check for zero TPV after setting currency conversions
* Fix zero conversion rate for account currency + add unit tests
* Fix unit test
* Live Coarse universe refactor
- Live trading will source Coarse and Fine fundamental data directly
from disk. Updating unit tests.
* Adds ILiveDataProvider interface
* Adds wrapper for IDataQueueHandler implementations
* Replaces IDataQueueHandler with ILiveDataProvider in
LiveTradingDataFeed
* Edits IDataQueueHandler documentation
* Maintains aggregation for current IDQH impls and skips for ILDF impls
* Note: No unit test was created for this method, go back and TODO
* Protobuf Market data
- Adding protobuf support for Ticks, TradeBars and QuoteBars. Adding
unit tests.
* Adds unit tests for LiveDataAggregator changes
* Fixes bug where custom data was not handled as it was before
* Fixes race condition bug because of variable reuse in class
* Add protobuf extension serialization
* Fixes for protobuf serialization
* Refactor
* Fix OptionChainUniverse
* replace BaseDataExchange pumping ticks with consolidators
* AlpacaBrokerage
* BitfinexBrokerage
* GDAXBrokerage
* OandaBrokerage
* InteractiveBrokers
* TradierBrokerage
* FxcmBrokerage
* PaperBrokerage
* etc
* WIP fixes for existing LTDF unit tests
* Fixes more LTDF unit tests
* make IDataAggregator.Update receive generic BaseData rather than Tick
* Change IDataQueueHandler.Subscribe method
* Some fixes after adding new commits
* Adds protobuf (de)serialization support for Dividend and Split
* Serialize protobuf with length prefix
* Fix missing LTDF unit tests
* Adds TiingoNews protobuf definitions
* fix comments
* more fixes on IQFeedDataQueueHandler
* disallow putting ticks into enumerator directly
* ScannableEnumerator tests
* fix OandaBrokerage
* AggregationManager unit tests
* fix AlpacaBrokerage tests
* fix InteractiveBrokers
* fix FxcmBrokerage tests
* call AggregationManager.Remove method on unsubscribe
* fix GDAX existing tests
* Fixes, refactor adding more tests for AggregatorManager
* Adds BenzingaNews protobuf definitions and round trip unit test
* Adds missing TiingoNews unit test to Protobuf round trip tests
* Improve sleep sequence of LiveSynchronizer
* need to start aggregating first, and only then subscribe
* More test fixes and refactor
- Refactoring AggregationManager and ScannableEnumerator so the last is
the one that owns the consolidator
- Adding pulse on the main LiveSynchronizer
* Improve performance of LEquityDataSynchronizingEnu
* Add missing Set job packet method
* Minor performance improvements
* Improvements add test timeout
- Improvements adding test timeout to find blocking test in travis
* Improve aggregationManager performance
* Testing improvements for travis
* Remove test timeouts
* More test fixes
- Adding more missing dispose calls and improving determinism
* fix IEXDataQueueHandler and tests
* Final tweaks to LTDF tests
* more AggregationManager tests
* consume and log ticks
* fix test: couldn't subscribe to Forex tickers
* change Resolution for all bar configs
* Improve RealTimeScheduleEventService accuracy
* refactoring: move common code to base class
* fixed bug; unsubscribe SubscriptionDataConfig
* Small performance improvement
* Minor fixes
* Avoid Symbol serialization
* Fixes coarse selection in live mode
* Fix for live coarse
* Adds protobuf (de)serialization support for Robintrack
* Adds round-trip unit test
* Minor performance improvements
* More minor performance improvements
* pass LiveNodePacket through to OandaBrokerage
* Fixes empty list becoming null value when deserializing with protobuf
* Reverts BZ live trading exception removal and fixes tests
* Refactor WorkQueue making it abstract
* Add try catch for composer
* Adds optional data batching period to LiveFillForwardEnumerator
* Override data-queue-handler with config
* Improve PeriodCountConsolidator.Scan performance
* Move batching delay to main Synchronizer thread
* Reverts addition of Robintrack protobuf definitions
* Give priority to config history provider if set
* Add Estimize protobuf support
- Add Estimize protobuf serialization support. Adding unit tests
* Always dispose of data queue handler
Co-authored-by: Gerardo Salazar <gsalaz9800@gmail.com>
Co-authored-by: Adalyat Nazirov <aenazirov@gmail.com>
* Allow LeanDataWriter to append to zip data files
* Use the data directory provided to the writer instead of the global value
* Disregard the time-portion of an input date
* Overwrite zip entries when creating futures data files
* Minor tweak and adding unit test
Co-authored-by: Martin Molinero <martin.molinero1@gmail.com>
* Add research docker launch script
* Internalize creation of cfg and dir to dockerfile
* Implement bash version of notebook run script
* quick fix to use docker.cfg
* update readme
* typo fix
* Review tweaks
* Tweaks
* track previous input per symbol
* improve Arms Index period checks
* don't need to be thread safe since consolidator updates are sequential
* Use TryGetValue for performance
- Minor update for AdvanceDeclineIndicator to use TryGetValue to reduce
amount of dictionary access
Co-authored-by: Martin Molinero <martin.molinero1@gmail.com>
The SmartInsiderIntention.FromRawData method doesn't read the raw data exactly as it comes from SmartInsider; it requires filtering of some columns, which is performed in the SmartInsiderConverter.Process method.
That filtering is now added to the failing test.
- TickQuoteBarConsolidator will use previous bar. Updating unit tests
- QuoteBarConsolidator open ask and bid will match previous bar close
bid and ask. Adding unit tests
- Adding an example algorithm of a custom universe selection using
coarse data and adding tiingo news. If conditions are met will add the
underlying and trade it
- Adding required UniversePythonWrapper
`MinimumPriceVariation` values in `SymbolPropertiesDatabase` can be represented as an exponent (scientific notation) number. The `ToDecimal` method fails to parse these correctly, so a new method, `ToDecimalAllowExponent`, is used instead.
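For illustration, Python's Decimal already accepts both notations, which is the behavior the new C# method provides (function name mirrors the commit; implementation here is a trivial stand-in):

```python
from decimal import Decimal

def to_decimal_allow_exponent(text):
    """Parse values like '1E-05' that a plain digits-and-dot parser rejects.

    The fast path only handles plain notation; MinimumPriceVariation entries
    stored in scientific notation need this tolerant fallback.
    """
    return Decimal(text)
```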
- Adding new DataPermissionManager that will own the existing
datachannelProvider. Will assert configurations before added to the
data feed. Adding unit tests
This example shows how to create an EMA cross algorithm for a futures front contract. Once the contract is added, the indicators are registered to a new consolidator and warmed up with historical data. When a contract is removed, the consolidator is removed and the indicators are reset. We don't need to liquidate the position, because it's liquidated automatically once the contract expires.
- Do not reuse parameter variables `start` and `end`. Create new variables with meaningful names and rename them to `startUtc` and `endUtc`.
- Use `EachTradeableDayInTimeZone` to calculate `tradableDates`.
- Adds another test/assertion in the regression algorithms to ensure tests in the scheduled event were performed.
The tradable days of the history request should respect the data time zone since the data source files also do.
Upgrade `BasicTemplateFuturesHistoryAlgorithm` to a regression algorithm and add a schedule event to test history requests every hour.
* The original implementation was made with the idea in mind that bars
have a time range that they encompass. When the `Time` property is
rounded down, the period added to `EndTime` ensures that the data
will not have look-ahead bias.
However, multiple data sources do not have a time range that the data
applies to, and is actually point-in-time. For these kinds of
data, rounding of the data is not preferred since they are supposed
to represent a point in time.
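The two rules can be sketched as follows (a hypothetical helper; the actual rounding lives in Lean's C# data readers):

```python
from datetime import datetime, timedelta

def align_bar_times(raw_time, period, point_in_time):
    """Return (Time, EndTime) for a data point.

    Bars cover a span: Time is rounded down to the period boundary and
    EndTime = Time + period, so the bar is only visible once complete
    (no look-ahead bias). Point-in-time data has no span and must keep
    its exact timestamp unrounded.
    """
    if point_in_time:
        return raw_time, raw_time
    time = raw_time - (raw_time - datetime.min) % period
    return time, time + period
```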
* When a BaseData instance has Nullable fields, the number of data
points per Series is inconsistent, and results in a Series with
a length different from the other Series we produce, resulting
in an error "ValueError: cannot handle a non-unique multi-index!"
when we were constructing the final DataFrame.
Use the original `pandas.Series` to create `Series` objects before the wrapper-version `DataFrame` creation.
This improves speed because it avoids unnecessary and expensive index wrapping operations during `DataFrame` creation.
* We now use the time at which Smart Insider added a given event to
their database. Because time indexes for Smart Insider are now
non-duplicated per symbol, the History request will properly work.
In this implementation, we dynamically create new classes that wrap key functions and properties. The wrappers will map/convert any parameters that are convertible to the string representation of Symbol.ID before they are used by the original function/property.
- Fix out of range exception, if the returns array is smaller for some
of the Symbols it will use double.NaN, similar behavior to when a date
isn't found for a Symbol. Adding unit test
- Fix duplicate key exception at ReturnsSymbolData.Returns, this could
happen with FF in history requests. Adding unit test
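The NaN-padding behavior can be sketched like this (a simplified stand-in for ReturnsSymbolData, with a dict-of-dicts in place of the real structures):

```python
import math

def align_returns(dates, returns_by_symbol):
    """Build a dates x symbols matrix of returns, padding with NaN when a
    symbol has no value for a date, instead of raising out-of-range or
    duplicate-key errors. `returns_by_symbol` maps symbol -> {date: return}."""
    return {
        symbol: [series.get(d, math.nan) for d in dates]
        for symbol, series in returns_by_symbol.items()
    }
```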
* We made an assumption that all tickers marked as "defunct" from
Estimize would have a dash '-' separating the defunct indicator
from the ticker. However, there was a ticker where the defunct
separator was actually an underscore '_'. There are now
additional checks in place to ensure that we do not pass the
Substring method an invalid 'length' value.
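Sketching the tolerant parse in Python (the exact Estimize ticker format is not shown here, so the "-defunct"/"_defunct" suffix shape below is an assumption for illustration):

```python
def strip_defunct_suffix(ticker):
    """Split a ticker from its 'defunct' marker, accepting either '-' or
    '_' as the separator, and guarding against a separator at position 0
    (which would hand Substring an invalid length)."""
    for sep in ("-", "_"):
        idx = ticker.rfind(sep + "defunct")
        if idx > 0:
            return ticker[:idx], True
    return ticker, False
```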
- Add a comment explaining why we need to round down in the data time zone
and not the exchange time zone.
- Keep subscription EndTime RoundDown by DataResolution since it won't
change and it's expensive.
- Replace Enum to string operations on Options enums, since it's
expensive
- Replace Enum IsDefined check since it uses reflection
- Improve OptionFilterUniverse linq operations
- Improve performance of SecurityIdentifier properties to avoid
extracting them from the packed properties value on every access
- Avoid calling ToList for already list data collections (it creates a
copy)
- Avoid iterating over securities for which we have no holdings when
calculating TPV or MarginUsed
Methods: 'diff', 'div', 'divide', 'drop', 'drop_duplicates', 'droplevel', 'dropna', 'dtypes', 'duplicated'
`BackwardsCompatibilityDataFrame_binary_operator` replaces `BackwardsCompatibilityDataFrame_add` to handle all operations (more to be added in future commits)
- Refactor ZipDataCacheProvider to avoid value types thread issues, it
will now use a timer. Add missing dispose calls.
- Synchronizer won't share the `SubscriptionFrontierTimeProvider`
instance since it's not thread safe and shouldn't be called
- Fix BacktestingResultHandler `_daysProcessed` thread safety
- Fixing `RealTimeScheduleEventService` thread safety
`Remapper.__getitem__` was not returning a `Remapper` object when the result was a `pandas.DataFrame`. This is needed for a sequence of `.loc` calls.
Refactors `Remapper._self_mapper` to handle tuples where the key can be found in either first or second position.
Updates unit tests that should test the `Symbol` object as key but were using `str(Symbol)`.
- Adding a new ReferenceWrapper for structs (value types) to avoid thread
race conditions while reading and writing. In C#, reference type
assignments are atomic, which lets us avoid using locks.
- Add more fill forward unit test including a weekend using different
combinations of resolutions
- Add required AIG second data file
- Avoid list in most common FF case
- Add StreamReader Reader for Tick data type.
- Adding stream reader GetString extension, adding tests.
- BacktestingBrokerage will not create unnecessary order events
- AlgorithmPythonWrapper will directly call the base OnFrameworkData()
implementation, skipping going through Python and its overhead
- Small performance improvement for adding Tick data points into a Ticks
collection
- For python always wrap slice with PythonSlice, so that slice.Get()
works even when no custom data is present, adding test.
- Reduce ResultHandler amount of packets sent and their size.
- Only send order and orderEvents packet if there is actually data in them.
- Don't send Holdings fields if value is 0
- Don't send AlphaRuntimeStatistics fields if default values
- fixed websocket authentication failure for "user" channel
- removed "matches" channel, now using authenticated "user" channel for order fills
- send BrokerageMessageEvent as error for subscription and authentication failures
- Adding new ConcatEnumerator that will join enumerators together
sequentially. Adding unit tests
- SubscriptionDataReaderHistoryProvider will consume an optional
intraday enumerator that will be concatenated with the existing
SubscriptionDataReader
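The concatenation behavior is essentially Python's itertools.chain, shown here as a generator sketch (the real ConcatEnumerator is a C# IEnumerator wrapper):

```python
def concat_enumerators(*enumerators):
    """Yield from each enumerator in turn, advancing to the next only once
    the current one is exhausted - e.g. an intraday enumerator concatenated
    after the SubscriptionDataReader's historical data."""
    for enumerator in enumerators:
        yield from enumerator
```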
This basic algorithm implements a `CustomPartialFillModel` class that changes the behavior of `FillModel.MarketFill` to simulate partially filled orders.
- Algorithms logs of the same type will be batched together when sent
- Increasing backtesting update to 3 seconds same as LiveTrading
- Increasing insight sending interval from 1 second to 3
- Reducing code duplication for retrieving and sending algorithm logs
- Update regression algorithms stats after making SecurityCache ignore
QuoteBars for equity for OHLC values and GetLastData(). They were
affected since the `BenchmarkSecurity` used `.Price` which was QB for
equities. Order list hashes changed because SubmissionLastPrice will
now be TB instead of QB
- Adding new SerializedOrderEvent and SerializedOrder with new
JsonConverters
- Specifying OrderEvent json converter when storing, streaming data
- Adding unit tests
- Slice dynamic data accessors will ignore QuoteBars and
Ticks TickType.QuoteBars for equities. This is to make slice dynamic
accessors deterministic and avoid breaking backwards compatibility when
adding QuoteBars for equities.
`IndicatorBase.Update` uses `input.EndTime` instead of `input.Time` so that `input.Current.Time` (and `EndTime`) matches `input.EndTime`, achieving consistent behavior across different indicator input types (`IndicatorDataPoint`, `IBaseDataBar` and `TradeBar`)
The stale price message should use `ToStringInvariant` with `price.EndTime` for consistent string representation across different cultures. The former behavior had an impact on the order list hash calculation.
- Fixes:
- `Slice.GetValues`
Not returning the `Value` but the `KeyValuePair`
- `ExtendedDictionary.update(PyObject)`
Parameter type should be `PyObject` not `PyDict`
- Adds unit test for `update()`
- Fixes `OrderListHash` of `PythonDictionaryFeatureRegressionAlgorithm` test.
Changes the return type of `keys()` and `values()` methods to `PyList`. Since `Keys` and `Values` return `ICollection` in some cases, the Python type was not consistently of `list` type.
`BaseDictionary` is an abstract implementation of `IExtendedDictionary` keyed by `Symbol` that implements Python `dict` methods. `Slice`, `DataDictionary`, `SecurityManager`, and `SecurityPortfolioManager` derive from it in order to behave like a Python `dict`.
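A minimal Python sketch of the contract (the real class is C# exposed through Python.NET; this stand-in shows keys()/values() returning plain lists so their Python type is consistent):

```python
class SymbolDictionary:
    """Toy Symbol-keyed container exposing Python dict methods."""

    def __init__(self, data=None):
        self._data = dict(data or {})

    def __getitem__(self, symbol):
        return self._data[symbol]

    def keys(self):
        # always a plain list, regardless of the backing ICollection type
        return list(self._data.keys())

    def values(self):
        return list(self._data.values())

    def items(self):
        return list(self._data.items())
```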
- Disable order event streaming in backtest
- Add new OrderEvent IsUpdate flag to be set by the different brokerage
implementations
- Update regression test stats after rebase
- Live and backtesting will send delta order events updates
- Live will store order events every 10 minutes per day
- Backtesting will store the last 100 order events on every update and will
store all order events at the end of the backtest
- Fixing Python slice enumeration which would not happen when custom
data was present. Adding unit test
- Fix Slice constructor which would fail to create data collections due
to relying on non-deterministic 'slice._data'. Adding unit test
- Improving custom data retrieval for symbols which have more than 1
data type in a slice. Adding unit test
- First step refactoring slice internally, no behavior changed
- WarmupIndicator will be able to determine the correct type to use
- Fix bug in `History.GetMatchingSubscriptions()` which would use the
same TZ for exchange and data. Covered by regression algorithm.
- Consolidate will only infer `TickType` from `T` if it is not abstract
- Adding regression algorithm
- Slice will expose `Get(Type)` to get data by type, adding unit tests
- Remove unrequired symbol check
- Dispose of `ZipDataCacheProvider` used in unit tests.
To avoid issues with other tests since zip files could remain open.
- Make `GetSubscription` private and remove unrequired overload
- Add warning message when trying to warm up an indicator which does not
have a warm up period
- Fixes for C# RegisterIndicator API methods which were ignoring provided
type of T
- Fixes for Py RegisterIndicator API methods which were not using
provided 'selector' method
- Adding C# and Py regression algorithm
- Add a couple of missing cases where we explicitly set tick type to
null and assert the default data type used
- Adding custom data consolidate for Py algorithm
* Adds new unit tests for BenzingaNewsJsonConverter
* Fixes issue in BenzingaNewsDataConverter where we would
write data in local time, resulting in non-UTC times from
the deserialization in BenzingaNews.Reader(...). Unit tests
were added for this issue as well
When new securities are added to the universe, the `ReturnsSymbolData` is warmed up with historical data that may not have the same timestamps, causing an index mismatch that leads to the rejection of valid data. In this case, we will assume that there is a time correspondence, similar to what is done in Python. Unit test was added.
`BlackLittermanOptimizationPortfolioConstructionModel` will consider a new view only if there is a new last active insight by updating the `ReturnsSymbolData` with the `Insight.GeneratedTimeUtc` instead of the `IAlgorithm.Time`. Consequently, the timestamp of the historical data is converted to UTC for consistency.
`BlackLittermanSymbolData` now rejects duplicate keys like its C# version: `ReturnsSymbolData`.
Finally, `BlackLittermanPortfolioOptimizationFrameworkAlgorithm` statistics were updated because of the bug fixes.
- SymbolData at Slice will keep track of the types it has and GetData
will always return in the same order, TradeBars, QuoteBars, Tick, Custom
- Fixing slice.Get(type) for custom type which relied on insertion order
- Updating regression test
- Reduce MinimumVariancePortfolioOptimizer precision goal so that both
CSharp and Py MeanVarianceOptimizationFrameworkAlgorithm return the same
results
- Limit factor file dates in factor file generator unit test
Update test to the new crypto and equity subscriptions rules:
- Only consolidates trades
- Low resolution data are only trades.
Update and fix tests
Add missing minute sample files
Update Regression algorithms statistics
Crypto and Equities will consolidate trades by default
Daily and hourly resolution will return only trades for crypto and equities.
Update equity TAQ regression test
Add minute sample files
Updating regression algorithms
Update SpotMarket test cases
Add regression test for equity trades and quotes
- History request.
- Trades and quotes pumped into OnData.
- Subscriptions are added correctly.
Add sample data
Checks low resolution only subscribes to trade bars
Invert logic, start from map files and look for data in daily folders
Also generate all coarse files in memory and write them all at once
Remove obsolete code
Parallelize reading and writing
* Adds a new JsonConverter implementation that handles null
values in the ChartPoint JSON representation. It skips any
value that is found to be "null" or unparsable
- MeanVarianceOptimizationPortfolioConstructionModel tests will use
`SingleEntryDataCacheProvider`, instead of `ZipDataCacheProvider`, so
that zip files are not cached into memory and block other usages
- Address review improve regression and unit test
- Fix same time scheduled event order
- Adding comments in unit test
- Increasing unit test sleep time
- Update logic in `Run` method will be executed by an internal Task
handled by the `BaseResultHandler`
- Call `DataFeed.Exit` even if algorithm initialization failed
- Removing this margin precheck entirely to avoid generating unrequired errors.
When the PortfolioConstructionModel is rebalancing and generating new targets,
the check won't allow targets that increase margin usage which should be able to execute,
since the execution model (the next step) will first execute orders which reduce margin usage.
Also, note that this check will later be performed by the brokerage when processing the order.
Implements `PortfolioBias` in EWPCM, CWPCM and IWPCM. With this new feature, these PCMs will ignore insights that do not respect the desired bias. E.g. for `PortfolioBias.Long`, only insights with `InsightDirection.Up` will be converted into a `PortfolioTarget.Quantity` greater than zero; any other `InsightDirection` will result in a `PortfolioTarget.Quantity` of zero.
- Adding new unit test for python PCM implementations, asserting each
method is correctly called
- Reverting some unrequired changes in the
`MeanVarianceOptimizationFrameworkAlgorithm`
- Refactor shared logic from `EqualWeightingPortfolioConstructionModel`
into base `PortfolioConstructionModel` implementation
- `MeanVarianceOptimizationPortfolioConstructionModel` will respect
rebalancing period and will use all active insights, not just the last
- Refactor AccumulativeInsightPortfolioConstructionModel to inherit from
the EWPCM, reducing code duplication and adding support for rebalancing
period
- Fixing bug where only 1 new insight per symbol was processed per loop
- Adding unit tests
- Adding new `Func<DateTime, DateTime?>` that allows PCM to return null
if the next rebalance time is null, in which case the function will be
called again in the next loop.
- Adjusting PCM next rebalance time check to perform rebalance once the
time is reached
- Adding new regression test. Updating existing
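The nullable next-rebalance-time contract described above can be sketched in plain Python (function and state names are illustrative, not the Lean API):

```python
from datetime import datetime, timedelta


def make_rebalance_func(period):
    """Returns func(time) -> next rebalance time, or None to be asked again."""
    def next_rebalance(time):
        # e.g. no answer on weekends: signal "ask me again" by returning None
        if time.weekday() >= 5:
            return None
        return time + period
    return next_rebalance


def is_rebalance_due(func, time, state):
    """Toy version of the PCM check: once the stored time is reached, rebalance."""
    if state.get("next") is None:
        state["next"] = func(time)  # None means: call again in the next loop
        return False
    if time >= state["next"]:
        state["next"] = None
        return True
    return False


state = {}
func = make_rebalance_func(timedelta(days=1))
assert is_rebalance_due(func, datetime(2020, 1, 4), state) is False  # Saturday: None
assert state["next"] is None                                         # asked again later
assert is_rebalance_due(func, datetime(2020, 1, 6), state) is False  # Monday: schedules
assert is_rebalance_due(func, datetime(2020, 1, 7), state) is True   # time reached
```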
- Updates `DropboxUniverseSelectionAlgorithm` and `DropboxBaseDataUniverseSelectionAlgorithm` with new links to Dropbox files and date range to match the dates in the files.
- Adds copy of files in `TestData` folder.
Adds the following method overloads:
`Future.SetFilter(int, int)`
`Option.SetFilter(int, int, int, int)`
`FutureFilterUniverse.Expiration(int, int)`
`OptionFilterUniverse.Expiration(int, int)`
to simplify the API, since the original methods accept `TimeSpan` arguments that are most often used with the `AddDays(int)` method.
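A toy sketch of the convenience overload (names approximate the Lean API but are snake_cased for plain Python; the int overload just wraps the timespan-based one):

```python
from datetime import timedelta


class FutureFilterUniverse:
    """Toy filter showing the int-days convenience overload."""
    def __init__(self):
        self.min_expiry = None
        self.max_expiry = None

    def expiration_span(self, min_span, max_span):
        # the original TimeSpan-based method
        self.min_expiry = min_span
        self.max_expiry = max_span
        return self

    def expiration(self, min_days, max_days):
        # int overload: wraps the timespan version for convenience
        return self.expiration_span(timedelta(days=min_days),
                                    timedelta(days=max_days))


u = FutureFilterUniverse().expiration(0, 182)
assert u.min_expiry == timedelta(days=0)
assert u.max_expiry == timedelta(days=182)
```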
- Moving InsightCollection into base `PortfolioConstructionModel`
- Will call `InsightCollection.GetNextExpiryTime()` on each check, and
for performance `InsightCollection` will keep track of next insight
expiry time
- Removing need for PCM base classes having to call `RefreshRebalance`
- Some refactor clean up at base
PortfolioConstructionModel.IsRebalanceDue()
- Refactoring some PCM methods to be `protected` since they are not required
to be public
- Adding new `PortfolioConstructionModel.RebalanceOnInsightChanges`
flag, that will allow avoiding new insights or insight expirations to
trigger a rebalance
- Updating unit tests
- Fix for the MeanVarianceOptimizationPortfolioConstructionModel that
was skipping, in some cases, 0 magnitude insights
- Improving OrderSizing.Value and Volume to include code in consumers
- OrderSizing.GetUnorderedQuantity() will adjust result by lot size
- ImmediateExecutionModels will use OrderSizing.GetUnorderedQuantity()
- OrderSizing.Value() will take into account ContractMultiplier
- Adding unit tests
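The lot-size adjustment can be sketched as rounding toward zero to a whole number of lots (a simplified stand-in, not the actual `OrderSizing` code):

```python
import math


def adjust_by_lot_size(quantity, lot_size):
    """Round the unordered quantity toward zero to a whole number of lots."""
    lots = math.floor(abs(quantity) / lot_size)
    return math.copysign(lots * lot_size, quantity)


assert adjust_by_lot_size(103.7, 10) == 100.0    # 10 whole lots
assert adjust_by_lot_size(-103.7, 10) == -100.0  # sign preserved
assert adjust_by_lot_size(7, 10) == 0.0          # less than one lot: no order
```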
- Add missing PCM constructor methods for the different supported
rebalancing periods overloads
- Normalize rebalance behavior in the base `PortfolioConstructionModel`
- Adding new `PortfolioConstructionModel.RebalanceOnSecurityChanges`
that will allow disabling rebalance on security changes
- Adding unit tests
Sets the `Security` object `BuyingPowerModel` before its `SetLeverage` method is called in the `BrokerageModelSecurityInitializer`.
Updates `BitfinexBrokerageModelTests` to test the new behavior.
- `FutureMarginModel` and `PatternDayTradingMarginModel` will adjust
margin requirements before market closes using the new `Exchange.ClosingSoon` property
- Adding unit tests and regression algorithm
* Fixes bug in ToCsvData() where an empty final value would not be
parsed
* Removes GetNextCsv()
* Reworked `Reader` logic in TradingEconomicsCalendar
* Use delimiter var as separator in TradingEconomics.Calendar
* Convert country names to uppercase in TradingEconomics.Calendar
* Updates TradingEconomics algorithms to use new event definition
* Separated Calendar and Indicator definitions into partial class
* Refactors portions of TradingEconomicsCalendar
* Makes TradingEconomicsCalendar.GetSource return RemoteFile for live
* Fixes bugs in TradingEconomicsEventFilter
* Fixes bugs in StreamReaderExtensions (thanks Martin :))
* Adds new unit tests to cover changes
* Adds support for live algorithms using TE calendar events
* Modifies TradingEconomics tickers to include country
* Fixes bug where `TECal.Clone(...)` could result in non-deterministic
results
* Fixes bugs in TradingEconomicsEventFilter
* Adds new CSV parsing methods in StreamReaderExtensions
* Moves ToolBox/TradingEconomicsCalendarTests to Common tests folder
* Updates TradingEconomicsCalendar tests to test CSV parsing
* TradingEconomicsCalendarDownloader now writes files to disk as JSON
- Remove unrequired `GetBuyingPower`
- Making `BuyingPowerModel.GetMaintenanceMarginRequirement` protected
instead of public
- Adding `GetMaximumOrderQuantityForDeltaBuyingPower` to replace
public `GetMaintenanceMarginRequirement` and improve API experience for
consumers like the `DefaultMarginCallModel`
- Adding new unit tests
- Unit test started failing because it was using a future end time at
the time of the merge and once it passed the end time it started failing
because it performed 1 extra selection
Changes `GetLastKnownPrice` logic to retry getting non-null data after a first failed attempt. Previously, it would return null on the first attempt and illiquid securities would not have valid data to set their market price. On the second attempt, we increase the look-back period to the equivalent of three trading days' worth of data.
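A toy sketch of the retry idea (the real logic issues history requests; here `history` is a plain dict mapping day offsets to prices):

```python
def get_last_known_price(history, lookback_days=1, max_attempts=2):
    """Toy retry: widen the look-back window if the first attempt finds nothing.

    `history` maps day offsets (1 = yesterday, 2 = two days back, ...) to a
    price, or None when no trade happened that day.
    """
    for attempt in range(max_attempts):
        # second attempt: three trading days' worth of data
        days = lookback_days if attempt == 0 else 3 * lookback_days
        for offset in range(1, days + 1):
            price = history.get(offset)
            if price is not None:
                return price
    return None


# illiquid security: no data yesterday, last trade three days back
assert get_last_known_price({1: None, 2: None, 3: 9.5}) == 9.5
assert get_last_known_price({}) is None
```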
- Remove unused `OptionChainUniverseDataCollectionAggregatorEnumerator`
- Refactor `BaseDataCollectionAggregatorEnumerator` and
`OptionChainUniverseDataCollectionEnumerator` to avoid emitting invalid
data points
- Updating regression tests statistics
- Adding unit test
* Normalizes Trading Economics calendar event names
* Adds unit test for new filter method
NOTE: This is a breaking change for Trading Economics calendar events
property. Any previous string matching might result in a mismatch if
results were not previously being normalized.
- After https://github.com/QuantConnect/Lean/pull/4000
CoarseFineFundamentalRegressionAlgorithm started using `MarketCap`, but
this value was always 0 in existing data, so it caused non-deterministic
results. Adding new data and updating expected results.
- `AccumulativeInsightFrameworkAlgorithm` expected statistic were not
correct, updating.
* OnEndOfDayRegressionAlgorithm - Since the EndTime of the hourly benchmark is during the day,
the OnEndOfDay method gets called one less time than usual. Updates statistics
* CustomUniverseWithBenchmarkRegressionAlgorithm.cs - modified algorithm so
that it works with hourly benchmark. Previously only tested for Daily benchmark
* BasicTemplateAlgorithm.py - Modified resolution to be
Resolution.Minute, just like it is in C#
* CustomDataRegressionAlgorithm.py - Remove warmup call from Initialize
* IndicatorSuiteAlgorithm.py - Adds PythonQuandl import to fix import error
* Changed variable names of protected members in BaseResultHandler to
match existing variable naming convention
* Changed AlgorithmRunner return type
* Remove AlgorithmResults dictionary from AlgorithmRunner
* Create AlgorithmRunnerResults container class
* Modify Relative Sampling test to accept failure cases
* Misc. updates as a result of changing AlgorithmRunner return type
* Get rid of `previousTime` and use `time` instead in AlgorithmManager
* Refactor variable names in Backtesting and Live IResultHandler impls
* Moves shared variables to BaseResultHandler
* Modifies BacktestNodePacketTests statistics to get tests passing
* Adds new StatisticsBuilder tests
* Modifies BacktestingResultHandler tests to make them pass
- Regarding these tests, the decision was made to get them
passing so that if any behavior changes, we will know immediately.
Next commit will contain regression test changes for easy rollback.
* Removed Sample[a-zA-Z]+ methods from IResultHandler definition
* Converted Sample[a-zA-Z]+ methods from public to protected
* Updated inheritors of BaseResultHandler to use new accessibility
modifiers
* Removes useless code in ResolutionSwitchingAlgorithm
* Refactors AlgorithmManager loop
* Refactors StatisticsBuilder methods and strategy for series alignment
* Move sampling logic to the corresponding IResultHandler
* Changes benchmark resolution to Resolution.Hour
* Modifies IResultHandler to enable external sampling
* Adds BacktestResultHandler unit tests
* Adds ResolutionSwitchingAlgorithm to test misalignment
* Adds support to AlgorithmRunner to store algorithm IResultHandler
Warning: this commit breaks accurate calculations for algorithms that
only make use of `Daily` resolution data. Previously, because
the benchmark was added in Daily resolution in backtesting, any
algorithm that only made use of daily data would have an accurate
calculation for beta and various other statistics.
These changes serve to fix the statistics calculations of non-daily
resolution algorithms, with daily resolution to be revisited at a later
time.
- BacktestingResultHandler will send a maximum of 50 orders per update
packet and will check `LastFillTime` and `LastUpdateTime` too
- Fix invalid linked file
`AccumulativeInsightPortfolioRegressionAlgorithm`
All these flags are now being reset for consistency before restarting but the only one considered a bug fix is for `_previouslyInResetTime`.
This flag is normally set to `true` in `TryWaitForReconnect` after a disconnect but it was never being cleared after an IBGateway restart, causing an unnecessary reboot outside of the IB reset period on the next `TryWaitForReconnect` call (which in turn could also cause algorithm termination later on at cash sync time).
- `AddUniverse` call will add new Universe to the pending collection which
will be consumed at the `OnEndOfTimeStep` where it will be added to the
data feed, same as we do for the `UserDefinedUniverses`. This is
required since the start and end date, during initialize, is consumed by
these universe subscriptions.
- Adding unit test.
- Will sort the List in reverse order and remove items from the end.
- Add newWorkEvent so that workers are monitoring for new work while sleeping
- Add WorkAvailableEvent for each work queue so that workers can sleep
longer or until the sorting thread notifies them.
- The new scheduler will create a dedicated thread pool
- Work will be prioritized based on their weight and new items will have
highest priority. The weight will be determined by the size of the
subscription enqueueable.
- The work queue will be sorted and the weights updated by a dedicated
thread
- There will be a maximum weight value that will 'disable' a work item
until its weight comes down
- Consumer will not know about workers or anything alike anymore.
- Applying a general max work queue size of 400 items, this will reduce
CPU and RAM usage. Can be set by config.
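The scheduling rules above can be sketched with a toy queue (names and the max-weight cutoff mirror the description, not the actual implementation; no threading shown):

```python
MAX_WEIGHT = 400  # items heavier than this are temporarily 'disabled'


class WeightedWorkQueue:
    """Toy scheduler queue: new items first, then lowest weight first.

    Weight models the size of the subscription's enqueued output; items
    over MAX_WEIGHT are skipped until their weight comes back down.
    """
    def __init__(self):
        self._items = []  # each item: [name, weight, is_new]

    def add(self, name, weight):
        self._items.append([name, weight, True])

    def sort(self):
        # in the real scheduler a dedicated thread re-sorts and re-weighs
        self._items.sort(key=lambda item: (not item[2], item[1]))

    def take(self):
        for item in self._items:
            if item[1] > MAX_WEIGHT:
                continue  # disabled until its weight comes down
            item[2] = False
            self._items.remove(item)
            return item[0]
        return None


q = WeightedWorkQueue()
q.add("heavy", 900)
q.add("light", 10)
q.sort()
assert q.take() == "light"  # lightest runnable item first
assert q.take() is None     # 'heavy' is over the max weight, so disabled
```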
The class field that tracks the current month is updated only if there are securities that passed the selection criteria. This prevents division by zero and allows universe selection a new attempt on the next trading day while keeping the universe unchanged.
Uses `MarketCap = Value * EarningReports.BasicAverageShares.ThreeMonths`. Since the previous calculation used `BasicEPS`, which can be negative, market cap could take negative values, which is not realistic.
Moves `MarketCap` code from the generated file to another one that shares the partial class.
- The AlpacaBrokerageModel requires a call to GetOpenOrders() to determine if a new order can be submitted and was not taking the current order into account (which is added to the open orders before CanSubmitOrder() is called).
Adds `MarketCap` member to `FineFundamental` class that represents the aggregate market value of a company represented in dollar amount.
Changes `CoarseFineFundamentalRegressionAlgorithm` (C# and Python) to select securities based in its market capitalization. Same result as selecting by P/E ratio.
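The calculation is simply price times average basic shares outstanding; a trivial sketch (toy function, not the generated `FineFundamental` member):

```python
def market_cap(price, basic_average_shares_three_months):
    """MarketCap = Value * EarningReports.BasicAverageShares.ThreeMonths (toy)."""
    return price * basic_average_shares_three_months


assert market_cap(100.0, 5_000_000) == 500_000_000.0
# shares outstanding are non-negative, so a positive price gives a
# non-negative cap, unlike the previous EPS-based math
assert market_cap(100.0, 5_000_000) > 0
```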
- Reduce code duplication between live and backtesting
- `LiveTradingResultHandler` will store logs progressively, appending to
the file
- `BacktestingResultHandler` stores logs at end
Since extension methods don't play well with pythonnet, this change converts
the extensions class into a decorator class. Additionally, this ObjectStore
type is the type that gets exposed via QCAlgorithm so users can access these
methods directly without requiring the use of extension methods.
This approach has many good properties. For one, it doesn't force implementors
of IObjectStore to use a base class. Second, it maintains healthy separation of
API level concerns (such as convenience methods) from the abstraction level concerns
of IObjectStore. Setting it up in this way ensures ANY implementation of IObjectStore
will still get access to these additional methods. Another thing to note is this
prevents using a base class on QCAlgorithm's public interface. Instead, we have a
specific type that is dedicated to fulfilling API level requirements, which also
provides us flexibility in the event the API needs to be updated. If it were a subclass,
you run the risk of breaking the implementors of the subclass.
- When BacktestNodePacket has the initial `CashAmount` set we will clear
all existing cash amounts and set the account currency
- Adding more unit tests
* Deletes NullAlphaHandler, NullLeanManager, NullSynchronizer
* Calculate the backtest and live PointInTimePortfolios only once now
* Refactor Metrics calculations
* Add missing license headers to some files
* Reverts accessibility of AddToUserDefinedAlgorithm to private
* Other misc. fixes and cleanup
- Fix for `DataQueueOptionChainUniverseDataCollectionEnumerator`. Last
emit time should rely on the actual last option chain symbol lookup, not
on the underlying end time.
- Adjust tests so that they fail in `master`
- Only add OnEndOfDay ScheduledEvent if the algorithm implements the
method. Adding unit tests
- Avoid creating a new baseData instance at
`SubscriptionDataSourceReader`
- Adding static `FineFundamental` instance since creating new ones is
expensive
* Can handle null Result packets
* Created utility files
* Added various helper methods to PortfolioLooper
* Fixes build issue by removing System.Collections.Immutable
* Updates plots to show "Insufficient data" when it can't be created
* Hides empty crisis page
* Fixes wkhtmltopdf display bug
* Removes Calculations.cs
* Modifies accessibility of AddToUserDefinedUniverse in QCAlgorithm
* Add null value handling in OrderJsonConverter
* Various bug fixes
* Fixes broken ReportChartTests.py
* Adds leverage to PointInTimePortfolio
We now will calculate all of the metrics needed to directly plot the
data in C# instead of doing calculations in Python.
This has added benefits in that the code can be easily reused to generate
plots from Result packets and makes the underlying plotting library we
use more extensible and replaceable.
* Adds Calculations.cs
* Refactors some of ReportCharts.py
* Adds Drawdown classes
* Adds Deedle and MathNet.Numerics packages
* Completes calculations on the C# side instead of Python
Side note: I think it's a little funny that we were previously
calculating the plots via Pandas, which uses a C backend. Effectively, we
were transferring data between at least 2 separate FFI
boundaries: C# -> Py -> C -> ? (whatever BLAS is written in)
Shows how to read/save object store entries. In this case, it shows a
use case where a potentially time intensive operation's result is saved
in the object store and on subsequent runs the result is pulled directly
from the object store to enable faster run times
Errors raised during persistence aren't able to be handled by user code,
and in fact, are swallowed by the implementation after being logged. By
exposing these errors as events we allow the algorithm to be notified of
such an error and take any step necessary to handle the persistence error.
There's no reason for algorithms to have direct access to this implementation.
Moving this into the engine prevents algorithms from directly accessing LocalObjectStore,
so they can only reference it through the IObjectStore abstraction
persistenceIntervalSeconds defines the number of seconds between
each save operation. For the local object store, this dictates
how often the contents of the object store is packaged and written
to disk. The PersistData virtual method is provided for subclasses
to provide a different implementation of how/where to persist the
data. The change to be in-memory aims at keeping the object store
performant with reasonable persistence guarantees.
Invoking FileInfo.Length when the file does not exist on disk throws an
error. Additionally, it was clunky to use the object store when always
required to perform ContainsKey first.
The job packet is now able to specify a maximum file count as well
as a maximum storage size limit. These controls are enforced in the
LocalObjectStorage implementation.
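A toy sketch of enforcing the two limits on save (names illustrative, not the LocalObjectStorage code):

```python
class LimitedObjectStore:
    """Toy store enforcing a max entry count and total byte size on save."""
    def __init__(self, max_entries, max_bytes):
        self.max_entries = max_entries
        self.max_bytes = max_bytes
        self._data = {}  # key -> bytes

    def save(self, key, value):
        # validate against the limits before committing the write
        data = dict(self._data)
        data[key] = value
        if len(data) > self.max_entries:
            return False  # too many files
        if sum(len(v) for v in data.values()) > self.max_bytes:
            return False  # over the storage size limit
        self._data = data
        return True


store = LimitedObjectStore(max_entries=2, max_bytes=10)
assert store.save("a", b"12345") is True
assert store.save("b", b"123456") is False  # 11 bytes total > 10
assert store.save("b", b"123") is True
assert store.save("c", b"1") is False       # third entry > max_entries
```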
This commit is squashed from iterative development:
- More consistent method naming
- Storage root path updated to be absolute and include algorithm name
- Storage root path created only if object store is actually used
- Implemented XML save/load
- Added missing unit tests
- Replaced Log.Trace with Log.Error calls
- Added the object store name logging in Engine.Main
- Read storage root from config
- Create algorithm storage root folder in Initialize
- Remove empty folder in Dispose
- Added null checks in all methods
- Added missing XML parameter docs
- make Initialize and Dispose virtual
- make AlgorithmStorageRoot protected
The IObjectStore abstraction provides algorithms with a persistent
storage mechanism. While the algorithm is running, data is maintained
in memory as a dictionary of raw bytes (string -> byte[]). This ensures
we avoid any reference type shenanigans. Periodically, the data in the
object store is persisted and additionally, when the algorithm shuts
down, the object store's data will again be persisted. This ensures that
when the algorithm starts up again, it will have access to any state
that has been saved into the object store.
A great use case for IObjectStore is saving a compute heavy model.
For example, computing the weights of a deep neural network is very
CPU intensive, but after the weights are computed, evaluation is fairly
quick. An initial backtest can be used to solve for the network's weights
and then subsequent backtests or even in live mode, the weights will be
available to the algorithm provided they were saved into the object store.
Also, some libraries require a file path to load model data. The object
store provides a `GetFilePath(key)` method which will copy the data for
the provided key to the disk and return that path so the library can load
the model data.
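A toy in-memory sketch of the `GetFilePath(key)` idea (snake_cased, not the Lean API): the entry is copied to disk so libraries that need a path can load it.

```python
import os
import tempfile


class ObjectStore:
    """Toy in-memory store with a get_file_path helper for file-based loaders."""
    def __init__(self):
        self._data = {}  # key -> bytes

    def save_bytes(self, key, value):
        self._data[key] = value

    def read_bytes(self, key):
        return self._data[key]

    def get_file_path(self, key):
        # copy the entry to disk and return the path for path-based loaders
        path = os.path.join(tempfile.gettempdir(), key)
        with open(path, "wb") as f:
            f.write(self._data[key])
        return path


store = ObjectStore()
store.save_bytes("model.bin", b"\x00\x01weights")
path = store.get_file_path("model.bin")
with open(path, "rb") as f:
    assert f.read() == b"\x00\x01weights"
os.remove(path)
```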
- `AlgorithmManager` will search for `SubscriptionDataConfigs` using the
`SubscriptionDataConfigService` versus directly checking active `Subscriptions`.
In the case of warmup, subscriptions have not been added yet. Also will
include internal subscriptions.
- Adding unit tests
In the previous solution, I basically tried to pass the first row as a split of 1.
That is not a correct solution; the correct one is to treat the first row as a different _kind_ of row. In fact, the main usage of the first row is as a date reference for the factors defined in the second row
- `UserDefinedUniverse` will no longer be removed as a data subscription.
- When `algorithm.AddData()` is called a universe selection data point
will be added to the `UserDefinedUniverse` subscription to trigger
selection and add the requested data.
- `DataManager` will make sure an active subscriptions
`SubscriptionDataConfig` will be present in the configuration collection
- Adding unit and regression tests
If any request returns `{'success': False}` or throws an exception, `Api.Execute` will not abort execution; it will log the error and return the result. This change enables re-try logic.
- Refactor to simplify the `quantconnect.api.Api` class. Adds an `Execute` method that accepts a boolean to distinguish a `POST` from a `GET` request.
In the previous version, some methods were using the `params` argument instead of `data` in the `POST` case which threw an exception for big json objects.
- Adds option to save logs and backtest report to disk.
- Adds `Result` class to optionally convert the json objects into `pandas.DataFrame` when getting backtest or live results
- Subversion bump
Previously, if there was an `inf` or exponential notation in a factor file row, the `FactorFile.Row` method would add a new line with a date equal to the original row date less one day. This is a bug because the dates in factor file rows should always be a trading day, and the previous logic didn't ensure that.
Even more, that new line is useless; without it, the parser still parses the latest row with valid factors and uses it as the first row.
This commit removes the addition of a new line. It also updates the test cases: the FactorFile minimum date will now be the date of the latest row without inf or exponential notation, less one day.
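A toy parser sketch of the new behavior (the real factor file format has more columns and semantics; this is illustrative only): parsing stops at the first malformed row and nothing synthetic is appended.

```python
import re


def parse_factor_rows(lines):
    """Toy parser: stop at the first row with 'inf' or exponential notation.

    The latest valid rows are kept as-is; no synthetic extra row is added.
    """
    bad = re.compile(r"inf|[eE][+-]?\d+")
    rows = []
    for line in lines:
        date, split, dividend = line.split(",")
        if bad.search(split) or bad.search(dividend):
            break  # stop here; do not append anything
        rows.append((date, float(split), float(dividend)))
    return rows


rows = parse_factor_rows([
    "20200101,1.0,1.0",
    "20190101,0.5,1.0",
    "20180101,inf,1e-8",   # malformed: parsing stops, nothing is appended
])
assert rows == [("20200101", 1.0, 1.0), ("20190101", 0.5, 1.0)]
```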
This helper is a favorite of mine to use in models to handle OnSecuritiesChanged
event to maintain internal state in light of additions/removals. Sometimes, the
internal state objects implement IDisposable, and without this code, they would
never be disposed. A good use of IDisposable here is to clean up items
that are only used by that single security's internal state, such as a consolidator
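The helper's pattern can be sketched like this (toy types, not the actual Lean helper): keep one state object per symbol and dispose it when the security is removed.

```python
class Consolidator:
    """Toy per-security state object that needs cleanup."""
    def __init__(self):
        self.disposed = False

    def dispose(self):
        self.disposed = True


def on_securities_changed(state, added, removed):
    """Keep one state object per symbol; dispose state for removed symbols."""
    for symbol in added:
        state.setdefault(symbol, Consolidator())
    for symbol in removed:
        item = state.pop(symbol, None)
        if item is not None:
            item.dispose()  # without this, the state would never be disposed


state = {}
on_securities_changed(state, added=["SPY", "AAPL"], removed=[])
spy = state["SPY"]
on_securities_changed(state, added=[], removed=["SPY"])
assert spy.disposed is True
assert "SPY" not in state and "AAPL" in state
```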
In `PeriodCountConsolidatorBase`, differentiate the type of the `PyObject` and create an `IPeriodSpecification` accordingly. If the `PyObject` is a C# `Func<DateTime, CalendarInfo>` or a convertible Python method, we create a `FuncPeriodSpecification` with it as a parameter. If the `PyObject` is a `datetime.timedelta`, we create a `TimeSpanPeriodSpecification` with it as a parameter after converting it to `TimeSpan`.
- Adds example in `DataConsolidationAlgorithm`
- Adds unit test for `timedelta` overload.
- Adding `CancellationTokenSource` to fix a race condition where the worker could stop
after the consumer had already checked on it, see TriggerProducer,
leaving the consumer waiting forever (GH issue 3885). This cancellation token
works as a flag for this particular case, waking up the consumer.
- Adding unit tests
- Renamed CanLookupSymbols to CanAdvanceTime
- Updated options/futures chain universe PredicateTimeProvider to use IDataQueueUniverseProvider.CanAdvanceTime
In the Coarse Universe Selection of the following algorithms
- ContingentClaimsAnalysisDefaultPredictionAlpha
- GreenblattMagicFormulaAlpha
- PriceGapMeanReversionAlpha
- SykesShortMicroCapAlpha
Universe.Unchanged is now used when the universe is not changed, instead of saving a list of symbols and returning it.
Other minor refactoring.
- Subscription workers will stop after adding 50 data points in the
first loop. Note that if consumer needs more data points it will trigger
a new worker
* Write files by `UpdatedAt` date instead of the `CreatedAt` date
* Added ability to append to compressed files if file does not exist
* Cleaned up code as per review
* misc. documentation changes
* RSS converter now lives at: https://gist.github.com/gsalaz98/b87992cd5a5a214d01c63dbbefbbf12c
* Removed BenzingaNewsFactory as it is no longer needed
* Cleaned up code as per review to make this more maintainable
* Included new documentation
* Filter articles with no Symbols from being written
* General code improvements (duplication reduction, etc.)
- Fix bug in live mode custom universe,
`BaseDataCollectionAggregatorEnumerator` returning false was causing the
subscription to be removed completely
- Adding unit test
- Renaming `PreSelected` to `Constituents`
- Adding base `ConstituentsUniverse`
- Adding Py and C# regression algorithm
- Fixing bug in `UniverseSelection`, it wasn't removing pending to be
removed securities unless the universe selection changed
- Adding test data
- Add support for Symbol key access for pandas ix and iloc results
- Wrap pandas merge, join and concat method results
- Wrap pandas.concat method result with `Remapper`
- Adding unit tests
- Added missing check on the requested tick type (reducing number of requests with futures and options)
- Updated the error handler to signal request completion for any error
* Avoids boxing of time values (performance)
This fix addresses Martin's review and concerns that the EndTime would
be rounded down before being emitted. Local testing shows that EndTime
and _algorithm.Time are aligned when the data is included in the Slice
at HH:mm:15s.
Regarding the bug fix, previously when Clone was called, we would set
`Time` equal to `EndTime`, advancing the time by one minute and fifteen
seconds. This would make the time show up as HH:mm:30s and a minute into
the future.
Additional note: the only place where EndTime is being rounded down is in
AlgorithmManager in the method EndTimeIsNativeResolution which is only
used for Consolidators.
This is done to accurately model live trading. Because PsychSignal
takes approximately 10-15s to aggregate their data for the past minute,
we need to take into account the live delay. Otherwise, we'll have data
emitted before it was actually available.
* Make Benzinga News Converter more efficient (and cleaner!)
* Implement JSON converter for serialization/deserialization process
* Fix bug where duplicate symbols would be added to news.Symbols
* Modify algorithm strategies
* Cleaned up code inside BenzingaDataConverter
* Added more documentation to potentially confusing bits around the code
Previously, we would be storing Benzinga time in local time (i.e. with
about a 7hr offset from UTC since I'm in Pacific Time). But now, we
instruct Json.NET to serialize and deserialize dates as UTC.
Additionally, because the data resolution was not set to `Second`,
EndTime would be rounded down with `Time` to the closest minute. We
fixed that by specifying that the data is in `Second` resolution.
- Add `BaseData.Reader` implementation which consumes the `StreamReader`
directly, avoiding in between substrings and parsing improving
performance and reducing resource consumption
- Adding unit tests, including performance unit test showing a >50%
improvement
- Remove chart subscription logic. Will stream all chart updates if any
(won't stream empty updates)
- Only serialize properties which are not null
- Adding Chart and Series `IsEmpty()` extension. Adding unit tests
* Fixes issue where tickers with no exchange would be skipped
* Fixes issue where tickers that were ETFs were not processed
* Added additional logging
* Removed code duplication
* Reworked Clone method to be more efficient
- Use `/usr/share/nltk_data` instead of `/root/nltk_data`.
- Adds test for NLTK.
- Tidies the root directory of the docker image
- Adds support to mlfinlab
- 10147 - OrderId <OrderId> that needs to be cancelled is not found.
- 10148 - OrderId <OrderId> that needs to be cancelled can not be cancelled.
- 10149 - Invalid order id: <OrderId>.
The error code 202 has been removed from the Warning codes, because it is a notification when an order is cancelled.
- Enable parallel workers for history requests, improving performance.
- Adding SubscriptionUtils to be used by backtesting through the FileSystemDataFeed and
SubscriptionHistoryProvider and live deployments which use the
SubscriptionHistoryProvider in the core
- Fix some timezone issues when using local
start/end time instead of UTC
- Adding missing `Enumerator.Dispose()` to be called by the worker
This change prevents the security from being initialized with a quote tick, causing the bid price and ask price to be equal (and potentially remain so -- as can happen with OTM options).
- EventProvider will receive start date during initialization, this will
be used by the `MappingEventProvider` to correctly set current mapped
symbol
- Adding unit test, updating existing regression test
- `SetHoldings` will take `OnMarketOpen` orders into account when
determining order quantity
- Adding new regression test. Updating existing algorithms which
suffered of the issue
- Adding a performance improvement, will avoid margin and portfolio
calculations for MarketOnOpen orders that won't be able to fill
- `ConcurrentSet` will use an `OrderedDictionary` internally so that items
are ordered deterministically, respecting insertion order.
- Adding unit tests
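The deterministic-ordering idea can be sketched in Python, where plain `dict` preserves insertion order. This `OrderedSet` is a hypothetical stand-in, with the thread-safety of the real `ConcurrentSet` omitted:

```python
class OrderedSet:
    """A set that iterates in deterministic insertion order,
    mirroring the ConcurrentSet-over-OrderedDictionary idea
    (thread-safety omitted for brevity)."""
    def __init__(self):
        self._items = {}  # dict preserves insertion order

    def add(self, item):
        # Only the first insertion defines the item's position
        self._items.setdefault(item, None)

    def remove(self, item):
        self._items.pop(item, None)

    def __iter__(self):
        return iter(self._items)

    def __len__(self):
        return len(self._items)

s = OrderedSet()
for x in ("b", "a", "c", "a"):
    s.add(x)
print(list(s))  # insertion order preserved: ['b', 'a', 'c']
```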
The CreateSubscription method has been moved from the BrokerageHistoryProvider to the SynchronizingHistoryProvider base class, to enable reuse and avoid code duplication.
- Replacing `BaseData.AdjustResolution` with `DefaultResolution` and
`SupportedResolutions`
- Making `Resolution` nullable for `Algorithm.AddData` methods
- The `ISubscriptionDataConfigService` will set the default resolution
if none was provided and assert it is supported
- Fix bug with `PythonData` `IsSparseData` and `RequiresMapping`
resolution
- Adding `BaseData.AdjustResolution()` that should return a valid
resolution for the given data and security type.
This allows us to set a limitation which is useful to avoid invalid data
requests or unnecessary fill forward situations. The user will be
notified through a console message.
- Adding unit and regression test
- Updating example algorithms custom data resolution
- Some performance improvements. Won't change console color if
`SelectedOptimization` is defined
- `ITimeRules` are expected to yield times in UTC, fixing `Noon`,
`Midnight` and `Every`
- `ScheduledUniverseSelectionModel` will use UTC time zone by default
since that is the default expected time zone `ITimeRule` provides
- Adding regression test
- Adding `IDataChannelProvider` that will be used to determine if
streaming should be used
- Adding `data-channel-provider` for `config.json`
- Adding unit tests
- Adding `Security.NullLeverage` value to determine when the
`SecurityInitializer` leverage should be used or not
- Adding regression algorithm which reproduces the issue
- Adds Python version of `TrainingInitializeRegressionAlgorithm`;
- Adds C# version of `TrainingExampleAlgorithm`;
- Removes `TrainingScheduledRegressionAlgorithm`.
- `SubscriptionDataReader` will check map file first data and adjust
start date based on it
- Adding unit test
- Reducing code duplication
- Setting up `HistoryProvider` event handling
- Adding `FreePortfolioValue` to be set after algorithm initialize based
on the `TotalPortfolioValue` and the `FreePortfolioValuePercentage`
- Updating regression tests
- Adding new regression test
- Adding check for minimum order value at `BuyingPowerModel`
Adds constructor overloads to `EqualWeightingPortfolioConstructionModel` (`EWPCM`) to allow different rebalancing definitions.
It is possible to define rebalancing period with `Resolution`, `TimeSpan` (`timedelta` for Python) or a `Func<DateTime, DateTime>` (`lambda x: x+timedelta(y)`). The last option lets the model use `Expiry` helper class with the members such as `EndOfWeek` and `EndOfMonth`.
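As a rough illustration of the `Func<DateTime, DateTime>` option, a rebalancing function maps a time to the next rebalance time. The helper names below are hypothetical sketches, not Lean API:

```python
from datetime import datetime, timedelta

# A rebalancing function maps the last rebalance time to the next one,
# mirroring the Func<DateTime, DateTime> overload described above.
def weekly_rebalance(time):
    return time + timedelta(days=7)

def should_rebalance(time, last_rebalance, rebalance_func):
    # Rebalance once the computed next-rebalance time has been reached
    return time >= rebalance_func(last_rebalance)

last = datetime(2024, 1, 1)
print(should_rebalance(datetime(2024, 1, 5), last, weekly_rebalance))  # False
print(should_rebalance(datetime(2024, 1, 8), last, weekly_rebalance))  # True
```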
- Adding `SecurityCacheProvider`: this class allows two different
`Security` instances to share the same data type cache through different
instances of `SecurityCache`. This is used to directly access custom data
types through their underlying in a performant manner
- Some small improvements
- `DynamicSecurityData` will be a view into the `SecurityCache` instance
- Custom data which has an underlying will use the underlying
`SecurityCache` data type cache instance
- Refactors for `Security` and `SecurityCache` to avoid storing twice
the same data points in the data type cache
- Covers another level of inheritance of Market Data by using `Type.IsAssignableFrom`
- Caches the list of `MethodInfo` for custom data types to avoid redefining that list.
When custom data classes inherited from market data classes such as `TradeBar`, it created duplicate entries. Therefore, we need to exclude the common properties in the private field `PandasData._members`.
- Set `CanRunLocally => false` for training regression algorithms
- Reverting sorting changes at `BacktestingRealTimeHandler` due to
performance degradation of current scheduled event benchmark algorithm
- Add `ProbabilisticSharpeRatio` to `PortfolioStatistics`
- `Probabilistic Sharpe Ratio` will be added to the `RunTimeStatistics`
sent by the `ResultHandlers`
- Making `TradeBuilder.ClosedTrades` thread safe since it's accessed by
the `ResultHandlers`
- Removing `:` from live runtime statistics
- Adding unit tests
Per review comments provided in #3743 regarding the
training/long-running scheduled events and the leaky
bucket algorithm. See the PR for more information.
Deducts the standard time step limit from the additional minutes.
We start counting minutes immediately, even before the standard limit
has been exceeded.
This change ensures that the message shown to the user is not confusing.
- Adds PyObject overload to `ScheduleManager.TrainingNow` and `ScheduleManager.Training`
- Adds `QCAlgorithm.Train` helper method
- Adds Python algorithm showing how to use the helper method.
In order to continue to provide debugging support in the QC cloud, the
scheduled events were moved from inside of a task to the algorithm's
main execution thread. This necessitated a different methodology for
managing timeouts. Instead of raising an exception when attempting to
request additional time when none is remaining, we're now simply allowing
the isolator's limit to be reached by virtue of not incrementing the
additional minutes in the time manager. This uncovered a bug in the LEAN
engine where, if the isolator terminates an algorithm, the status
of the algorithm (on the algorithm manager instance) isn't properly
updated to indicate RuntimeError. This is in direct conflict with the
status update that is provided to the api, which is RuntimeError, so
this change remedies that issue as well. One of the regression algorithms
depends on this status value being properly flipped to RuntimeError in
the event that the isolator limit is reached.
See #3319
We restrict each algorithm time loop to a pre-determined amount of time.
Exceeding this limit will cause the algorithm to immediately terminate.
This quickly becomes an issue when considering users running trainable
models that have a long initialization period that exceeds the time loop
maximum.
This change provides a mechanism through which a long-running scheduled
event is permitted to keep running, avoiding the time loop limit by
requesting additional time. Requests for additional time are
limited according to a leaky bucket implementation whose parameters are
set via the job's controls structure. The fundamental time unit for the
algorithm is a single minute.
Here's how it works. If a scheduled event takes longer than one full wall
clock second then a request is made to the leaky bucket for one more minute.
If the scheduled event continues to take more time, it will continue to
request additional minutes. Each requested minute will prevent the algorithm's
time loop check from terminating the algorithm. When the bucket is empty and
no more minutes are available to be requested, a TimeoutException is thrown
causing a cascade that ends in the algorithm's termination and status being
flipped to RuntimeError.
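A minimal Python sketch of the minute-granting bucket described above. The class and parameter names, and the refill policy, are illustrative assumptions, not Lean's actual implementation:

```python
import time

class LeakyBucket:
    """Grants extra minutes up to a capacity; tokens refill at a fixed rate."""
    def __init__(self, capacity, refill_amount, refill_interval,
                 time_func=time.monotonic):
        self.capacity = capacity
        self.available = capacity
        self.refill_amount = refill_amount
        self.refill_interval = refill_interval
        self._time = time_func
        self._last_refill = self._time()

    def _refill(self):
        # Add refill_amount tokens per whole interval elapsed, capped at capacity
        elapsed = self._time() - self._last_refill
        intervals = int(elapsed // self.refill_interval)
        if intervals:
            self.available = min(self.capacity,
                                 self.available + intervals * self.refill_amount)
            self._last_refill += intervals * self.refill_interval

    def try_consume(self, tokens=1):
        # Each token corresponds to one extra minute of scheduled-event runtime
        self._refill()
        if self.available >= tokens:
            self.available -= tokens
            return True
        return False  # bucket empty: the real engine terminates the algorithm
```

When `try_consume` returns `False`, the engine described above stops granting time and the algorithm's termination cascade begins.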
Additionally, this applies equally to ALL scheduled events. While some helpers
were added with the naming of Train and TrainNow to the ScheduleManager, these
methods don't do anything special and the infrastructure doesn't otherwise
flag them as different, so this feature becomes part of the core Scheduled
Event feature set.
Further, the live scheduled events were not touched and are still pending
further discussion regarding the value added by enforcing a time restriction
when simulation time and wall clock time are equivalent.
Fixes #3319
These are simple, easy-to-use convenience properties for creating short-term
scheduled events, producing a cleaner, easier-to-read syntax than the
other various methods that produce the same result, for example:
Schedule.On(DateRules.Tomorrow, TimeRules.Noon, MyScheduledEventMethod);
Also fixes typo in FuncDateRule constructor parameter.
When adding a new ScheduledEvent with exactly one event time set
to the algorithm's current time, the SkipEventsUntil function would
skip the only event. It appears that this bug was 'fixed' via commit:
33a4db4559
This 'fix' is more of a bandaid and I'm forgoing undoing the bandaid
applied in ScheduleManager and ScheduledEventBuilder where we start
calculating the event schedule a full day before the algorithm's
current time in an effort to reduce additional potential risks from
creeping into this PR focused on adding support for long-running
training functions.
The inspiration for this implementation design comes directly from:
https://github.com/mxplusb/TokenBucket
Minor modifications were made, including a prominent behavior change
that prevents the Refill method from executing the initial refill
before any time has passed.
We'll use the leaky bucket algorithm to track consumption of CPU
resources by the soon-to-be-implemented training feature.
These are fairly generic and useful types that have zero dependencies.
We'll reuse the ITimeProvider for the leaky bucket implementation
to permit for better unit testing as well as potentially alternate modes
of managing the time that the leaky bucket sees, such as simulation time.
Extracting this behavior into its own class. We'll later extend
the implementation to give a training event a mechanism for extending
the current time loop maximum and/or for disabling it outright while
the training is running and the leaky bucket has capacity.
- Adding new `ConfidenceWeightedPortfolioConstructionModel` (C# / Py) that will
generate percent `Targets` based on the latest active `Insight` `Confidence` per
`Symbol`.
- Will ignore `Insights` that have no `Confidence`. (unit tested)
- If the sum of the confidences of the last active `Insight` per `Symbol` is bigger than 1, it
will proportionally factor down each target percent holdings so the sum is 1. (unit tested)
- Adding unit tests
- Adding a new regression test framework algorithm (C#/Py)
- **Note**: `ConfidenceWeightedPortfolioConstructionModel` inherits from the `InsightWeightingPortfolioConstructionModel`. The protected method `GetValue` was implemented in `IWPCM` to enable the choice of `Insight` member.
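The two unit-tested rules (ignore insights without confidence; scale down when the sum exceeds 1) can be sketched as a standalone helper. This is hypothetical illustration code, not the actual model:

```python
def confidence_weights(insights):
    """insights: dict mapping symbol -> confidence (None when not provided)."""
    # Ignore Insights that have no Confidence
    weights = {s: c for s, c in insights.items() if c is not None}
    total = sum(weights.values())
    if total > 1:
        # Factor down each target percent proportionally so the sum is 1
        weights = {s: c / total for s, c in weights.items()}
    return weights

print(confidence_weights({"SPY": 0.8, "BND": 0.8, "GLD": None}))
# -> {'SPY': 0.5, 'BND': 0.5}
```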
- `PortfolioTargetCollection` avoid calling `Count` on
ConcurrentDictionary directly -> has to take all locks
- `SecurityChanges` change Union for Concat since constructor will call
HashSet
- Make `DynamicSecurityData` hold lazy data objects
- `RegisteredSecurityDataTypesProvider` avoid looping over all
registered types, adding `TryGetType`
- `Security.Update()` will not call group-by on the data since it is
already grouped by type. Adding `ContainsFillForwardData` allows us to
be lazy and not re-loop through the data unless necessary
- `DefaultAlphaHandler` will use the `static`
`Enumerable.Empty<Insight>` instance when possible
- `SubscriptionSynchronizer` will lazily construct the `universeData`
dictionary, which is unused most of the time. Will use `Count` vs `Any`
since `Count` is already known by the dictionary
- For python algorithms `JobQueue` will respect `AlgorithmLocation`; it was
using the nonexistent `"algorithm-path-python"`
1. Only emit quote ticks when both the price and size messages have been received
Market data for trades and quotes is received by the IB API callbacks as separate messages and price is always received before the size. Previously we were emitting ticks on both price and size messages.
2. Do not emit half-quotes
Even in very liquid instruments, we were previously sending quote ticks with e.g. only bid price and size but zero ask price and size. For consistency with other IDataQueueHandler implementations we now always emit complete quote ticks (for cases where there is no bid price available, such as far OTM options, we'll emit the half-quote with zero for bid price and size).
3. The tick Quantity field is now set only for Trade ticks
Previously it was also being set for Quote ticks.
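The first two rules (wait for both price and size; no half-quotes) can be sketched as a small quote assembler. The message and method names are hypothetical, not the IB API:

```python
class QuoteAssembler:
    """Buffers bid/ask price and size; emits a tick only when the quote
    is complete (no half-quotes)."""
    def __init__(self):
        self.bid_price = self.bid_size = None
        self.ask_price = self.ask_size = None

    def on_price(self, side, price):
        if side == "bid":
            self.bid_price = price
        else:
            self.ask_price = price
        return None  # price always arrives before size; never emit here

    def on_size(self, side, size):
        if side == "bid":
            self.bid_size = size
        else:
            self.ask_size = size
        return self._try_emit()

    def _try_emit(self):
        # Emit only once both sides have a price and a size
        fields = (self.bid_price, self.bid_size, self.ask_price, self.ask_size)
        if all(f is not None for f in fields):
            return {"bid": (self.bid_price, self.bid_size),
                    "ask": (self.ask_price, self.ask_size)}
        return None
```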
- Adding check for `FillForwardEnumerator` not to emit twice the same
end time. Adding unit tests
- Fix issue with `FileLogHandlerTests` conflicting with existing logging
instance, started by `QuantBook`
* Add new fields to Smart Insider enums (EventType, ExecutionHolding)
* Reroutes Error variant to SatisfyStockVesting in ExecutionHolding
* Adds additional documentation clarifying unknown fields
* Adds unit tests for Intentions and Transactions
* Enables SEC test that was disabled
* Self review - Fix typo in SmartInsiderEventType
* Self-review: Add additional documentation to missing SmartInsiderEvent fields
- Only define `DEBUG` if `SelectedOptimization` is not defined
- `DynamicSecurityData` will keep a cache of the generic types
- Avoid using so much linq at `Security.Update()`
- `Slice` will also keep a cache of the generic types
* Adds and removes various fields from enums to represent data accurately
* Renames various variables
* Changes algorithms to work with new changes
* SmartInsider transaction/intention docs updated
* Added new enum values to represent pieces of data
* Updates Smart Insider demo algorithm end date for presentation
purposes of the equity curve
* Adds liquidation logic to SEC and Smart Insider demo algorithms
- Creates `CustomUniverseSelectionModel` that mimics `QCAlgorithm.AddUniverse(String, Func<DateTime, IEnumerable<string>>)`
- Replaces `BaseETFUniverse` with `InceptionDateUniverseSelectionModel`, which inherits from `CustomUniverseSelectionModel`
- ETF Basket USMs inherit from `InceptionDateUniverseSelectionModel`
- Add `TiingoNews.HistoricalCrawlOffset`, timespan to add for
backtesting
- Rename: remove `Data` from `TiingoNewsData` and rename `TiingoDailyData` to `TiingoPrice`
Adds `BaseETFUniverseSelectionModel` that handles the common universe selection logic for all ETF Baskets.
Adds the following ETF Baskets:
- Energy
- Precious Metals
- S&P500 Sectors
- Technology
- US Treasuries
- Volatility
- Add Tiingo news data
- Add `IndexSubscriptionDataSourceReader` that will handle data sources
which use an index file
- Add `BaseSubscriptionDataSourceReader` to avoid code duplication
- Adjustments at `LiveCustomDataSubscriptionEnumeratorFactory` so that
custom data in a collection format does not emit old data
- Adding `TiingoNewsJsonConverter`
- Adding unit tests
Adds IRegisteredSecurityDataTypesProvider to track all the data types
registered in the algorithm. Using this data, we can detect if it's
possible that we'll eventually have a property of a certain type name.
For example, consider I wish to use security.Data.TradeBar but we haven't
received any trade bars yet. Before this change a KeyNotFoundException
would be raised, but since we can determine that we expect to have trade
bars, we can detect this and return an empty list when we haven't received
any data yet. This also removes the need to constantly do a HasData<T>()
check before accessing the dynamic members.
Closes #3620
Revert some Brokerage Model changes to ensure that security types other than equities are handled. Deleted unnecessary files and sub-directory in the Lean/Brokerages directory.
Implements basic AS brokerage model, slippage model, and updates the fee model to be sufficient for the competition and provide a skeleton for future additions.
Provides dynamic access to cached security data keyed by the type's name.
For example, `security.Data.GetAll<Tick>()` would yield a list of ticks.
Likewise, using the dynamic accessors, `((dynamic)security.Data).Tick`
would return the same list. In C# you'll need to cast security.Data to
a dynamic. In python, all C# objects are viewed as dynamic, so python can
simply access `security.Data.Tick` directly.
See #3620
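A rough Python analogue of this type-name-keyed cache, combined with the registered-types check described earlier (all names below are hypothetical stand-ins, not Lean code):

```python
from collections import defaultdict

class SecurityDataCache:
    """Stores data points keyed by their type name, returning an empty
    list for registered types that have no data yet instead of raising."""
    def __init__(self, registered_types=()):
        self._data = defaultdict(list)
        self._registered = set(registered_types)

    def store(self, point):
        self._data[type(point).__name__].append(point)

    def get_all(self, type_name):
        if type_name in self._data or type_name in self._registered:
            return list(self._data.get(type_name, []))
        # Unknown and unregistered type: nothing we could ever return
        raise KeyError(type_name)

class TradeBar:
    pass

cache = SecurityDataCache(registered_types=["TradeBar"])
print(cache.get_all("TradeBar"))  # [] instead of KeyError: type is registered
```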
Custom derivative data is now being saved into the underlying security's
cache. This makes the custom derivative data available on the underlying's
security object via underlying.Cache.Get&lt;T&gt;, where T is the custom data type.
I noticed some deltas in regression statistics after a change. Before this
change, we log all orders into an {AlgorithmName}.{Language}.orders.log.
This information turned out to be insufficient to identify the regression.
This change also adds daily logging of portfolio value and each security's
holding quantity/value, as well as the full cash book. This should help to
more quickly identify the root cause behind system regression test failures.
RoundToSignificantDigits is implemented for both double and decimal
types, so there's no need to cast this to a double here, which can only
introduce rounding errors. Now, it's unlikely that those rounding
errors would show themselves in the first 7 digits, unless of course
it's a special case such as 0.1, which is notoriously non-representable
as a binary floating point number.
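Python's `decimal` module makes the same point: rounding through a binary double can drift where exact decimal arithmetic does not:

```python
from decimal import Decimal

# 0.1 cannot be represented exactly as a binary floating point number;
# constructing a Decimal from the float exposes the true stored value
print(Decimal(0.1))    # a long value slightly above 0.1
print(Decimal("0.1"))  # exactly 0.1

# The float 2.675 is actually stored slightly below 2.675,
# so rounding through the double drifts down
print(round(2.675, 2))           # 2.67
print(round(Decimal("2.675"), 2))  # Decimal('2.68')
```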
Looks like some copy/pasta when initially implementing invariant culture
enforcement. This change simply removes an extra Invariant(...) call
since the one line has them doubled up.
All of these types were relying on BaseData.ToString() and therefore
were only providing minimal amounts of data. I'm using these to improve
our ability to diff regression runs to help identify why regression
statistics may have changed.
- `SubscriptionSynchronizer` will call `OnSubscriptionFinished` for
subscriptions that added a data point to the packet after creating the
`TimeSlice` to avoid dropping the last data point.
Updates CollectionSubscriptionDataSourceReader to raise InvalidSource event
when a created reader is empty. The most common case for this to happen is
when the file is not found.
See #3618
Moves the FNF logging out of the DefaultDataProvider and into the
CreateStreamReaderError event handler. This move was required since
we don't have the required data in scope to perform this conditional
logging.
See #3618
Updates usages of GetBaseDataInstance that have a config in scope to use this
new method. This method properly assigns the symbol to the newly created base
data instance. This enables some of the new flag methods to work as expected.
Updated other poor usages where we weren't setting the symbol property.
See #3618
Flags a particular data type as having a sparse data set, meaning,
we don't expect it to have data for every single day. We'll use
this to prevent spamming log messages
All changes in this commit were based on the actual data.
For early closes, check the last available data on the previous date to define the close time. Also checked Boxing Day for years where New Year's Eve fell on weekend days.
For holiday dates, I checked the raw data and our processed data for observations, and in the cases where I found ticks, I considered it was *not* a holiday. Worth noticing that in many cases the ticks start later.
This is somewhat expected in FX markets, given their global market hours. E.g., when Sydney opens its markets 20xx-01-02 at 9:30, in a good part of the world it is still January 1st.
Although the Algorithm.FSharp project is excluded from the default Lean build, the reference to StringInterpolationBridgeStrong is required to build the project.
* Added better documentation for AddData methods
* Added new regression algorithms for adding in OnSecuritiesChanged
* Changed regression algorithms to add data that exists
* Styling and logging fixes
* Deleted regression algorithms because they tested behavior similar to
other existing regression algorithms
* Fixed new bug in regression algorithm due to AddData changes
* Added unit tests for wrapt version and package existence
* Fix issue where data would be set to raw normalization mode
* Implement method precedence hack on two additional python methods
* Changed behavior of SymbolCache loading with a ticker
- Previously, we would search the cache with ticker for types that did not
require mapping.
* Added new documentation
* Code cleanup
* Change first date to be SID.DefaultDate if SecurityType isn't expected
* Modified AddData tests to meet expectations. AddData unit tests are passing
* Added new method for Symbol to allow creation with underlying
* Added new unit tests
* Alter SecurityIdentifier method signature for BaseData
* AddData changes to accept underlying Symbol
* Added AddDataImpl
- Moving mapper from C# to Python since some cases did not work when
implemented in C#
- Small changes to `PandasDataFrameHistoryAlgorithm` which runs till the
end with no errors
- Adding more backwards compatible unit tests
- Add Pandas backwards compatibility shim
- Adding `MappingExtensions` which will remove data type from the
`Symbol.ID.Symbol` value to resolve the `MapFile`
- `SecurityIdentifier.TryParse()` will throw when given an invalid
`SecurityType`
This is being done in an effort to prevent symbol collisions within the
custom data (SecurityType.Base) namespace. The custom data type's name
is used for disambiguation. As written, this change will break several
user algorithms that still rely on the implicit string -> Symbol
lift. Providing this type information is optional and currently only being
used by AddData&lt;T&gt; methods. Other consumers of SecurityType.Base symbols
aren't at risk for collision, such as the UserDefinedUniverse, ScheduledUniverse
and others that are LEAN controlled. In order to maintain backwards compatibility,
the SymbolCache was updated to do a hard search when the requested ticker was
not found, looking for the prefix ('ticker.').
Fixes #3332
Moves the map file provider instance to a static readonly Lazy to ensure
single initialization and guarantee thread-safety without requiring an
additional locking mechanism. This change also simplifies the resolution
of the first ticker/date into its own function vs embedding the logic
in the existing GenerateWithFirstDate method.
Many places in the code were updated from using Convert.ToXXX to ConvertInvariant<T>.
The existing ConvertInvariant<T> implementation would throw an error when attempting
to convert a null value into a value type, such as null -> decimal. This change
preserves the null conversion behavior by delegating to the appropriate Convert.ToXXX
method under the hood.
Fixes#3584
Sets `GetMaximumOrderQuantityForTargetValueResult.IsError` to false to silence the logging for order quantity rounding to zero in `BuyingPowerModel.GetMaximumOrderQuantityForTargetValue`.
1) Additional warning when the end time is not a whole multiple of the history request resolution
2) Logic to drop the last candle that starts at the end TimeKeeper time
3) Test that the now time is rounded to the resolution
4) Minute resolution test case period reduced to 1 min
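Rounding a request's end time down to a whole multiple of the resolution can be sketched as follows (hypothetical helper, not the actual Lean code):

```python
from datetime import datetime, timedelta

def round_down(time, resolution):
    """Round a datetime down to a whole multiple of the resolution period."""
    excess = (time - datetime.min) % resolution
    return time - excess

end = datetime(2024, 3, 1, 10, 37, 42)
print(round_down(end, timedelta(minutes=1)))  # 2024-03-01 10:37:00
print(round_down(end, timedelta(hours=1)))    # 2024-03-01 10:00:00
```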
This will cause the build to fail if overloads are available that accept
an IFormatProvider/CultureInfo that are not used. In most cases, simply
invoking the appropriate method in `StringExtensions` will do the trick.
For parsing, use `Parse&lt;Type&gt;Invariant` and associated methods.
For ToString-ing, use `ToStringInvariant` and `ToStringInvariant(format)`
Other culture-specific string methods are also provided, including
`StartsWithInvariant`, `EndsWithInvariant` and `IndexOfInvariant`. We
can continue to add methods to `StringExtensions` as new cases arise.
See #3045
Updates all occurrences of parsing/ToString-ing to use CultureInfo if
possible. This project can not reference QuantConnect.Common, so it's
unable to use the new StringExtensions.
See #3045
Updates all occurrences of parsing/ToString-ing to use CultureInfo if
possible. This project can not reference QuantConnect.Common, so it's
unable to use the new StringExtensions.
See #3045
We assumed the CoinAPI symbol list has no duplicates, but those cases are expected, as we can read in their docs https://docs.coinapi.io/#list-all-symbols
> In the unlikely event when the "symbol_id" for more than one market is the same. We will append the additional term (prefixed with the "_") at the end of the duplicated identifiers to differentiate them.
Ignore files if they have the extra suffix.
After asking CoinAPI support, they informed us that symbols with the extra suffix are for edge cases.
> There could be corner-cases for other data types like FUTURES or OPTIONS, e.g., exchange lists same option but one with American and other with European execution-style
- Adding new `SubscriptionDataEnumerator` to fix broken enumerator Dispose chain
- Adding `Dispose` calls on enumerators in unit tests
- Fix unit test HandlesCoarseFundamentalData
- Custom data types will know whether or not Lean should use map files
- Updating regression test with sample custom data using map files,
which can run locally
- Adding unit tests for the `SubscriptionDataReaderHistoryProvider`,
checking it maps equities and options correctly
- added config setting for detailed WebSocketSharp logging
- added caller method name in WebSocketSharp log output
- added additional information in WebSocket Close events
Adds static methods of the form Parse.<TypeName>(string str) that use
CultureInfo.InvariantCulture. These are to be used when parsing strings.
It's still safe (from the CA1304/CA1305 perspective) to use the ToDecimal
extension method for decimals.
Adds string extension methods for common operations that will now require
CultureInfo.InvariantCulture. These are to be used when converting values
to strings, such as ToStringInvariant()/ToStringInvariant(format), but also
useful for searching within strings, StartsWithInvariant, EndsWithInvariant
and IndexOfInvariant.
FxCop has various rulesets for enforcing things within our codebase.
For this particular issue, we'll be enforcing CA1304 and CA1305 to
ensure we're always using an IFormatProvider or a CultureInfo where
applicable.
Linked Issue: #3045
The `NullReferenceException` type is intended to only be thrown by the CLR.
In most cases, it should be converted to an `ArgumentException` or an
`InvalidOperationException`, depending on if the null value is a parameter
to the current method or not.
The `Exception` type should never really be thrown as it doesn't provide any
additional information or hints as to the issue. It also forces users that
would like to handle expected exceptions to catch all exceptions. These are
converted to an exception type that more accurately describes the reason for
raising the exception: `KeyNotFoundException`, `InvalidOperationException`
- `SubscriptionFrontierTimeProvider` will advance new subscriptions to find the current emit time of the enumerator. This will make enumeration more deterministic, since we will not skip any time step of the new enumerators.
- Adding a just in case flag to the backtesting `Synchronizer` to retry creating a new time slice when the time has not advanced but there is data in the slice, this could happen with subscriptions added after initialize using algorithm.AddSecurity() API, where the subscription start time is the current time loop (but should just happen once)
The engine will no longer try to look for fine fundamental files that do not exist.
Notifies the user that the algorithm should handle the fine fundamental data filtering.
When there is no recorded portfolio equity for a given period, we don't need to compute the algorithm performance (it is all zeroes). This procedure prevents the unrepresented `DateTime` error in `CreateBenchmarkDifferences`.
- `SubscriptionSynchronizer` will emit a `TimeSlice.TimePulse` before
performing any universe selection on each time loop. This will advance
`Algorithm.Time` which will allow universe selection data time and
`Algorithm.Time` to be aligned.
- Updating Regression algorithms that were using `algorithm.Time` in the
selection method.
- Coarse selection will start from the algorithm's start date (not on the
next day)
- Adding regression algorithm
- Setting SPY as the default security benchmark
- The security benchmark subscription will be added at `UniverseSelection`
as an internal subscription, using its own dedicated Security instance
which doesn't live in the algorithm's Securities collection.
- Reducing algorithms exposure to internal subscriptions
- `TimeSliceFactory` will prioritize higher resolution bars when the same
symbol is present twice (for non-internal subscriptions)
- Adding regression test `CustomUniverseWithBenchmarkRegressionAlgorithm`
- Reducing the filesystem upper data count limit for coarse files to 1GB
of RAM. Currently master as-is will, for long backtests, load all
data, which is roughly 9GB of RAM
- For backtesting, adding new private static fine cache that will hold the available dates
for which there is a fine file, reducing the amount of times we
enumerate the files in the directory. Updating unit tests.
- Will use an array as the store, the most memory-efficient collection,
holding a custom internal `struct PeriodField`
- Will store period as a byte number, not a string
- By default the collection will be null and created on demand
- Adding more unit tests
The current setting of Encoding.Unicode was preventing LEAN from receiving IBAutomater output and error events when running local LEAN under Windows.
Changing to Encoding.UTF8 solves the issue.
Discrepancies were noted between the plotted asset holdings and the actual holdings weights. Additionally, the plot was changing between HTML renderings using the same backtest data.
- We will now check if python selection method returned `Universe.Unchanged`
- Removing `ToList()` call on fine and coarse data before sending it to
the python algorithm
- Adding regression algorithms
Updated the monthly returns matrix text labeling. Force the top row of labels to align bottom and the bottom row of labels to align top. As originally done, the local report was fine, but the report in the cloud still had overlapping label/axis issues. This fixes that problem.
Add try-except controls to ensure successful report creation when backtest result is incomplete. Prevents issues popping up with None types for Total Performance and Portfolio Statistics.
A new IBrokerageCashSynchronizer interface has been added to allow brokerage implementations to handle the cash sync process. Most of the cash sync logic has been moved from the BrokerageTransactionHandler to the base Brokerage class.
The IB brokerage will not attempt to perform cash sync during server reset times, respecting the different regional schedules (see also #3413).
Adds `DuplicateKeyPortfolioConstructionModelTests` to evaluate duplicate keys exception on `ReturnsSymbolData.FormReturnsMatrix` method.
Removes `DuplicateKeyExceptionAlgorithm` as it is replaced by the unit test.
Fix issue in Consensus downloader where an attempt to select non-duplicates would cause a chunk
of data to be dropped and never processed
Update output path to be CSV files instead of JSON, non-zipped
Fix documentation
Fix bug where we would not properly output files
- New `KellyCriterionManager` will be used by the
`StatisticsInsightManagerExtension` to calculate and update the
`AlphaRuntimeStatistics` with the new Kelly Criterion values, on a daily
basis.
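As a hedged illustration of the statistic involved: one common estimator of the Kelly fraction from a series of periodic returns is the mean return divided by the variance of returns. Lean's `KellyCriterionManager` may use a different estimator; this sketch only shows the general idea.

```python
# Hedged sketch: estimate the Kelly fraction f* as mean(returns) / var(returns).
# This is one common continuous-returns approximation, not necessarily the
# exact computation performed by Lean's KellyCriterionManager.
def kelly_fraction(returns):
    n = len(returns)
    mean = sum(returns) / n
    variance = sum((r - mean) ** 2 for r in returns) / n
    return mean / variance if variance > 0 else 0.0

daily_returns = [0.01, -0.005, 0.02, 0.0, 0.015]
f = kelly_fraction(daily_returns)  # fraction of capital suggested by the estimator
```

Note that a raw mean/variance estimate can be very large when the sample variance is tiny; production implementations typically cap or scale the resulting fraction.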
In order to provide full Lean indicator functionality to Python custom indicators, they need to inherit from a C# class; `PythonIndicator` serves this purpose.
Algorithms can still use the former version (no inheritance).
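As a sketch of the pattern: a Python custom indicator implements `Update` and exposes `Value`. The stand-in base class below is defined locally so the example runs on its own; in Lean the real base class is QuantConnect's `PythonIndicator`.

```python
class PythonIndicator:
    """Stand-in for QuantConnect's PythonIndicator base class."""
    def __init__(self):
        self.Value = 0.0


class CustomSimpleMovingAverage(PythonIndicator):
    """A custom SMA: Update consumes one value and returns True once warmed up."""
    def __init__(self, period):
        super().__init__()
        self.period = period
        self.window = []

    def Update(self, value):
        self.window.append(value)
        if len(self.window) > self.period:
            self.window.pop(0)
        self.Value = sum(self.window) / len(self.window)
        return len(self.window) == self.period  # acts as IsReady


sma = CustomSimpleMovingAverage(3)
for price in [1.0, 2.0, 3.0, 4.0]:
    ready = sma.Update(price)
# After four updates with period 3, Value is (2 + 3 + 4) / 3 = 3.0
```

In a real algorithm, `Update` receives a data point (e.g. a trade bar) from Lean rather than a raw float; the float is used here only to keep the sketch self-contained.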
<h1>Develop Algorithms Locally, Run in Container</h1>
We have set up a relatively easy way to develop algorithms in your local IDE and push them into the container to be run and debugged.
Before we can use this method with Windows or Mac OS we need to share the Lean directory with Docker.
<br />
<h2>Activate File Sharing for Docker:</h2>
* Windows:
* [Guide to sharing](https://docs.docker.com/docker-for-windows/#file-sharing)
* Share the LEAN root directory with docker
* Mac:
* [Guide to sharing](https://docs.docker.com/docker-for-mac/#file-sharing)
* Share the LEAN root directory with docker
* Linux:
* (No setup required)
<br />
<h2>Lean Configuration</h2>
Next we need to be sure that our Lean configuration at **.\Launcher\config.json** is properly set. Just like running lean locally the config must reflect what we want Lean to run.
Your configuration file should look something like this:
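As a sketch of the relevant keys (these key names are the ones Lean's Launcher config.json uses; the values are illustrative):

```json
{
  "algorithm-type-name": "BasicTemplateAlgorithm",
  "algorithm-language": "Python",
  "algorithm-location": "../../../Algorithm.Python/BasicTemplateAlgorithm.py"
}
```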
Our specific configuration binds the Algorithm.Python directory to the container by default, so any algorithm you would like to run should be in that directory. Please ensure your algorithm location looks just the same as the example above. If you want to use a different location, refer to the section below on setting that argument for the container, and make sure your config.json also reflects this.
<br />
<h2>Running Lean in the Container</h2>
This section will cover how to actually launch Lean in the container with your desired configuration.
From a terminal (PyCharm has a built-in terminal on the bottom taskbar labeled **Terminal**), launch the run_docker.bat/.sh script. There are a few choices for how to launch it:
1. Launch with no parameters and answer the questions regarding configuration (Press enter for defaults)
* Would you like to debug C#? (Requires mono debugger attachment) [default: N]:
2. Using the **run_docker.cfg** to store args for repeated use; any blank entries will resort to default values! example: **_./run_docker.bat run_docker.cfg_**
```
IMAGE=quantconnect/lean:latest
CONFIG_FILE=
DATA_DIR=
RESULTS_DIR=
DEBUGGING=
PYTHON_DIR=
```
3. Inline arguments; anything you don't enter will use the default args! example: **_./run_docker.bat DEBUGGING=y_**
* Accepted args for inline include all listed in the file in #2; must follow the **key=value** format
<br />
<h1>Debugging Python</h1>
Debugging your Python algorithms requires an extra step within your configuration and inside of PyCharm. Thankfully we were able to configure the PyCharm launch configurations to take care of most of the work for you!
<br />
<h2>Modifying the Configuration</h2>
First in order to debug a Python algorithm in Pycharm we must make the following change to our configuration (Launcher\config.json) under the comment debugging configuration:
```json
"debugging": true,
"debugging-method": "PyCharm",
```
In setting this we are telling Lean to reach out and create a debugger connection using PyCharm’s PyDevd debugger server. Once this is set Lean will **always** attempt to connect to a debugger server on launch. **If you are no longer debugging set “debugging” to false.**
<br />
<h2>Using PyCharm Launch Options</h2>
Now that Lean is configured for the debugger we can make use of the programmed launch options to connect.
**<h3>Container (Recommended)</h3>**
To debug inside of the container we must first start the debugger server in PyCharm; to do this, use the drop-down configuration “Debug in Container” and launch the debugger. Be sure to set some breakpoints in your algorithms!
Then we will need to launch the container, follow the steps described in the section “[Running Lean in the Container](#Running-Lean-in-the-Container)”. After launching the container the debugging configuration will take effect and it will connect to the debug server where you can begin debugging your algorithm.
**<h3>Local</h3>**
To debug locally we must run the program locally. First, just as the container setup, start the PyCharm debugger server by running the “Debug Local” configuration.
Then start the program locally by whatever means you typically use, such as Mono, directly running the program at **QuantConnect.Lean.Launcher.exe**, etc. Once the program is running it will make the connection to your PyCharm debugger server where you can begin debugging your algorithm.
<h1>Local Development & Docker Integration with Visual Studio</h1>
This document contains information regarding ways to use Visual Studio to work with Lean's Docker image.
<br />
<h1>Getting Setup</h1>
Before anything we need to ensure a few things have been done:
1. Get [Visual Studio](https://visualstudio.microsoft.com/downloads/)
* Get the Extension [VSMonoDebugger](https://marketplace.visualstudio.com/items?itemName=GordianDotNet.VSMonoDebugger0d62) for C# Debugging
2. Get [Docker](https://docs.docker.com/get-docker/):
* Follow the instructions for your Operating System
* New to Docker? Try docker getting-started
3. Pull Lean’s latest image from a terminal
* _docker pull quantconnect/lean_
4. Get Lean into Visual Studio
* Download the repo or clone it using: _git clone [https://github.com/QuantConnect/Lean](https://github.com/QuantConnect/Lean)_
* Open the solution **QuantConnect.Lean.sln** using Visual Studio
<br />
<h1>Develop Algorithms Locally, Run in Container</h1>
We have set up a relatively easy way to develop algorithms in your local IDE and push them into the container to be run and debugged.
Before we can use this method with Windows or Mac OS we need to share the Lean directory with Docker.
<br />
<h2>Activate File Sharing for Docker:</h2>
* Windows:
* [Guide to sharing](https://docs.docker.com/docker-for-windows/#file-sharing)
* Share the LEAN root directory with docker
* Mac:
* [Guide to sharing](https://docs.docker.com/docker-for-mac/#file-sharing)
* Share the LEAN root directory with docker
* Linux:
* (No setup required)
<br />
<h2>Lean Configuration</h2>
Next we need to be sure that our Lean configuration at **.\Launcher\config.json** is properly set. Just like running lean locally the config must reflect what we want Lean to run.
Your configuration file should look something like this for the following languages:
In order to use a custom C# algorithm, the C# file must be compiled before running in the Docker container, as it is compiled into the file **"QuantConnect.Algorithm.CSharp.dll"**. Any new C# files will need to be added to the csproj compile list before they will compile; check **Algorithm.CSharp/QuantConnect.Algorithm.CSharp.csproj** for all algorithms that are compiled. Once there is an entry for your algorithm, the project can be compiled by using **Build > Build Solution**.
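As a sketch, an entry in the csproj compile list takes roughly this form (the file name `MyAlgorithm.cs` is a placeholder):

```xml
<ItemGroup>
  <!-- "MyAlgorithm.cs" is a placeholder for your algorithm file -->
  <Compile Include="MyAlgorithm.cs" />
</ItemGroup>
```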
If you would like to debug this file in the docker container one small change to the solutions target build is required.
1. Right click on the solution **QuantConnect.Lean** in the _Solution Explorer_
2. Select **Properties**
3. For project entry **QuantConnect.Algorithm.CSharp** change the configuration to **DebugDocker**
4. Select **Apply** and close out of the window.
5. Build the project at least once before running the docker.
<br />
<h2>Running Lean in the Container</h2>
This section will cover how to actually launch Lean in the container with your desired configuration.
From a terminal launch the run_docker.bat/.sh script; there are a few choices on how to launch this:
1. Launch with no parameters and answer the questions regarding configuration (Press enter for defaults)
* Would you like to debug C#? (Requires mono debugger attachment) [default: N]:
2. Using the **run_docker.cfg** to store args for repeated use; any blank entries will resort to default values! example: **_./run_docker.bat run_docker.cfg_**
```
IMAGE=quantconnect/lean:latest
CONFIG_FILE=
DATA_DIR=
RESULTS_DIR=
DEBUGGING=
PYTHON_DIR=
```
3. Inline arguments; anything you don't enter will use the default args! example: **_./run_docker.bat DEBUGGING=y_**
* Accepted args for inline include all listed in the file in #2
<br />
<h1>Connecting to Mono Debugger</h1>
If you launch the script with debugging set to **yes** (y), then you will need to connect to the debugging server with the mono extension that you installed in the setup stage.
To setup the extension do the following:
* Go to **Extensions > Mono > Settings...**
* Enter the following for the settings:
* Remote Host IP: 127.0.0.1
* Remote Host Port: 55555
* Mono Debug Port: 55555
* Click **Save** and then close the extension settings
Now that the extension is setup use it to connect to the Docker container by using:
**Extensions > Mono > Attach to mono debugger**
The program should then launch and trigger any breakpoints you have set in your C# Algorithm.
<h1>Local Development & Docker Integration with Visual Studio Code</h1>
This document contains information regarding ways to use Visual Studio Code to work with the Lean engine; this includes using Lean’s Docker image in conjunction with local development as well as running Lean locally.
<br />
<h1>Getting Setup</h1>
Before anything we need to ensure a few things have been done:
1. Get [Visual Studio Code](https://code.visualstudio.com/download)
* Get the Extension [Mono Debug **15.8**](https://marketplace.visualstudio.com/items?itemName=ms-vscode.mono-debug) for C# Debugging
* Get the Extension [Python](https://marketplace.visualstudio.com/items?itemName=ms-python.python) for Python Debugging
2. Get [Docker](https://docs.docker.com/get-docker/):
* Follow the instructions for your Operating System
* New to Docker? Try docker getting-started
3. Install a compiler for the project **(Only needed for C# Debugging or Running Locally)**
* Visual Studio comes packed with msbuild or download without VS [here](https://visualstudio.microsoft.com/downloads/?q=build+tools)
* Put msbuild on your system path and test with command: _msbuild -version_
4. Pull Lean’s latest image from a terminal
* _docker pull quantconnect/lean_
5. Get Lean into VS Code
* Download the repo or clone it using: _git clone [https://github.com/QuantConnect/Lean](https://github.com/QuantConnect/Lean)_
* Open the folder using VS Code
**NOTES**:
- Mono Extension Version 16 and greater fails to debug the docker container remotely, please install **Version 15.8**. To install an older version from within VS Code go to the extensions tab, search "Mono Debug", and select "Install Another Version...".
<br />
<h1>Develop Algorithms Locally, Run in Container</h1>
We have set up a relatively easy way to develop algorithms in your local IDE and push them into the container to be run and debugged.
Before we can use this method with Windows or Mac OS we need to share the Lean directory with Docker.
<br />
<h2>Activate File Sharing for Docker:</h2>
* Windows:
* [Guide to sharing](https://docs.docker.com/docker-for-windows/#file-sharing)
* Share the LEAN root directory with docker
* Mac:
* [Guide to sharing](https://docs.docker.com/docker-for-mac/#file-sharing)
* Share the LEAN root directory with docker
* Linux:
* (No setup required)
<br />
<h2>Lean Configuration</h2>
Next we need to be sure that our Lean configuration at **.\Launcher\config.json** is properly set. Just like running lean locally the config must reflect what we want Lean to run.
Your configuration file should look something like this for the following languages:
In order to use a custom C# algorithm, the C# file must be compiled before running in the Docker container, as it is compiled into the file "QuantConnect.Algorithm.CSharp.dll". Any new C# files will need to be added to the csproj compile list before they will compile; check Algorithm.CSharp/QuantConnect.Algorithm.CSharp.csproj for all algorithms that are compiled. Once there is an entry for your algorithm, the project can be compiled by using the “build” task under _“Terminal” > “Run Build Task”._
Python **does not** have this requirement as the engine will compile it on the fly.
<br />
<h2>Running Lean in the Container</h2>
This section will cover how to actually launch Lean in the container with your desired configuration.
<br />
<h3>Option 1 (Recommended)</h3>
In VS Code click on the debug/run icon on the left toolbar; at the top you should see a drop-down menu with launch options. Be sure to select **Debug in Container**. This option will kick off a launch script that will start the Docker container. With this specific launch option the parameters are already configured in VS Code's **tasks.json** under the **run-docker** task args. These set arguments are:
As defaults these are all great! Feel free to change them as needed for your setup.
**NOTE:** VSCode may throw errors when launching this way regarding builds of `QuantConnect.csx` and `Config.json`; these errors can be ignored by selecting "*Debug Anyway*". To stop this error message in the future, select "*Remember my choice in user settings*".
If using C# algorithms ensure that msbuild can build them successfully.
<br />
<h3>Option 2</h3>
From a terminal launch the run_docker.bat/.sh script; there are a few choices on how to launch this:
1. Launch with no parameters and answer the questions regarding configuration (Press enter for defaults)
* Would you like to debug C#? (Requires mono debugger attachment) [default: N]:
2. Using the **run_docker.cfg** to store args for repeated use; any blank entries will resort to default values! example: **_./run_docker.bat run_docker.cfg_**
```
IMAGE=quantconnect/lean:latest
CONFIG_FILE=
DATA_DIR=
RESULTS_DIR=
DEBUGGING=
PYTHON_DIR=
```
3. Inline arguments; anything you don't enter will use the default args! example: **_./run_docker.bat DEBUGGING=y_**
* Accepted args for inline include all listed in the file in #2
<br />
<h1>Debugging Python</h1>
Python algorithms require a little extra work in order to be able to debug them locally or in the container. Thankfully we were able to configure VS code tasks to take care of the work for you! Follow the steps below to get Python debugging working.
<br />
<h2>Modifying the Configuration</h2>
First in order to debug a Python algorithm in VS Code we must make the following change to our configuration (Launcher\config.json) under the comment debugging configuration:
```json
"debugging": true,
"debugging-method": "PTVSD",
```
In setting this we are telling Lean to expect a debugger connection using ‘Python Tools for Visual Studio Debugger’. Once this is set Lean will stop upon initialization and await a connection to the debugger via port 5678.
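For reference, a VS Code attach configuration of the kind used by the repo's launch options is of this general shape; the values shown are illustrative, and the `remoteRoot` path in particular is an assumption:

```json
{
  "name": "Attach to Python (Container)",
  "type": "python",
  "request": "attach",
  "host": "localhost",
  "port": 5678,
  "pathMappings": [
    { "localRoot": "${workspaceFolder}", "remoteRoot": "/Lean" }
  ]
}
```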
<br />
<h2>Using VS Code Launch Options to Connect</h2>
Now that Lean is configured for the python debugger we can make use of the programmed launch options to connect.
<br />
<h3>Container</h3>
To debug inside of the container we must first start the container, follow the steps described in the section “[Running Lean in the Container](#Running-Lean-in-the-Container)”. Once the container is started you should see the messages in Figure 2.
If the message is displayed, use the same drop-down used for “Debug in Container” and select “Attach to Python (Container)”. Then press run; VS Code will attach and stop at any breakpoints you have set in your Python algorithm.
<br />
<h3>Local</h3>
To debug locally we must run the program locally using the programmed task found under Terminal > Run Task > “Run Application”. Once Lean is started you should see the messages in Figure 2.
If the message is displayed, use the launch option “Attach to Python (Local)”. Then press run; VS Code will attach and stop at any breakpoints you have set in your Python algorithm.
Figure 2:
```
20200715 17:12:06.548 Trace:: DebuggerHelper.Initialize(): waiting for debugger to attach at localhost:5678...
```
<br />
<h1>Common Issues</h1>
Here we will cover some common issues with setting this up. This section will expand as we get user feedback!
* Any error messages about building in VSCode that point to comments in JSON: either select **ignore** or follow the steps described [here](https://stackoverflow.com/questions/47834825/in-vs-code-disable-error-comments-are-not-permitted-in-json) to remove the errors entirely.
* `Errors exist after running preLaunchTask 'run-docker'`: this VSCode error appears to warn you of C# errors when trying to use `Debug in Container`. Select "*Debug Anyway*", as the errors are false flags for JSON comments as well as for `QuantConnect.csx` not finding references. Neither of these will impact your debugging.
* `The container name "/LeanEngine" is already in use by container "****"` This Docker error implies that another instance of lean is already running under the container name /LeanEngine. If this error appears either use Docker Desktop to delete the container or use `docker kill LeanEngine` from the command line.
    throw new Exception($"Underlying symbol for {customTwxSymbol} is not equal to TWX equity. Expected {twxEquity} got {customTwxSymbol.Underlying}");
}
if (customGooglSymbol.Underlying != _googlEquity)
{
    throw new Exception($"Underlying symbol for {customGooglSymbol} is not equal to GOOGL equity. Expected {_googlEquity} got {customGooglSymbol.Underlying}");
}
if (usTreasury.HasUnderlying)
{
    throw new Exception($"US Treasury yield curve (no underlying) has underlying when it shouldn't. Found {usTreasury.Underlying}");
}
if (!usTreasuryUnderlying.HasUnderlying)
{
    throw new Exception("US Treasury yield curve (with underlying) has no underlying Symbol even though we added with Symbol");
}
if (usTreasuryUnderlying.Underlying != usTreasuryUnderlyingEquity)
{
    throw new Exception($"US Treasury yield curve underlying does not equal equity Symbol added. Expected {usTreasuryUnderlyingEquity} got {usTreasuryUnderlying.Underlying}");
}
if (customOptionSymbol.Underlying != optionSymbol)
{
    throw new Exception($"Option symbol not equal to custom underlying symbol. Expected {optionSymbol} got {customOptionSymbol.Underlying}");