Compare commits


32 Commits

Author SHA1 Message Date
ValueRaider
c399feabe6 Version 0.2.56 2025-04-23 22:20:32 +01:00
ValueRaider
e0fdd5823f Merge pull request #2410 from ranaroussi/dev
sync dev -> main
2025-04-23 22:18:43 +01:00
ValueRaider
1ed7c39e94 Prices fix: improve handling empty data 2025-04-23 21:48:40 +01:00
ValueRaider
fee9af07ce Fix tests and a typo 2025-04-23 21:46:36 +01:00
ValueRaider
d64777c6e7 Add new config to RST docs 2025-04-23 19:15:35 +01:00
ValueRaider
cfc7142c56 Merge pull request #2391 from ranaroussi/feature/config
Config
2025-04-01 19:36:44 +01:00
ValueRaider
89f1934bd6 Config 1st version: just proxy 2025-03-31 19:06:43 +01:00
ValueRaider
07651ed2f4 Merge pull request #2389 from ranaroussi/fix/prices-live-combine
Fix fix_Yahoo_returning_live_separate()
2025-03-31 19:06:06 +01:00
ValueRaider
b1bb75113c Merge pull request #2364 from dhruvan2006/feature/lookup
Feature: Ticker lookups
2025-03-30 21:26:57 +01:00
ValueRaider
a358c8848f Merge pull request #2388 from JanMkl/fix-issue-2301
Fixes issues 2301 and 2383 AttributeError: module 'requests.cookies' has no attribute 'update'
2025-03-30 21:20:05 +01:00
ValueRaider
fbe9f8f119 Fix fix_Yahoo_returning_live_separate()
Was not handling the live row on/just after New Year.
Also fixed prepost intervals being merged incorrectly.
2025-03-30 20:38:11 +01:00
Jan Melen
00fec80f63 Merge branch 'dev' into fix-issue-2301 2025-03-30 09:02:59 +03:00
Jan Melen
81c6b2d2e6 Fixes issues 2301 and 2383 AttributeError
- In utils.py line 165, if the session is None then _requests is
  mistakenly passed as the session to data.py, which causes
  AttributeError: module 'requests.cookies' has no attribute 'update'
- The above condition happens only when searching tickers with ISIN
  numbers
- Added tests with ISIN numbers
- Added tickerBase to raise a ValueError if an empty ticker name is
  passed or an ISIN search results in an empty ticker. This gives
  consistent behaviour with utils.get_all_by_isin()
- Corrected ruff check failures
- Rebased to latest upstream/dev
- Updated the ValueError to contain the ISIN number that wasn't found
2025-03-26 23:57:00 +02:00
ValueRaider
29162bdd01 Merge pull request #2382 from JanMkl/fix-issue-2343
Fixes issues 2343 and 2363: Empty result and QuoteResponse
2025-03-26 21:03:02 +00:00
Jan Melen
587fdd032b Fixes issues 2343 and 2363: Empty result and QuoteResponse
- Check whether the result from _fetch was None, and either update the
  result with the additional info or set the result to the additional
  info.
- Added test_empty_info test case to protect against the Yahoo behavior
  changing.
- Fixed typo on line 1024
2025-03-26 06:26:28 +02:00
ValueRaider
8cc002ecbf Merge pull request #2378 from ranaroussi/fix/prices-end
Fix converting end epoch to localized dt
2025-03-23 16:08:16 +00:00
ValueRaider
092a0c8193 Fix converting end epoch to localized dt 2025-03-23 16:07:22 +00:00
Dhruvan Gnanadhandayuthapani
8908724f3c Add Lookup module documentation and examples 2025-03-23 15:06:17 +01:00
Ran Aroussi
4ecd6a67cf Updated link 2025-03-22 13:21:44 +00:00
ValueRaider
f3f6739153 README: fix docs link 2025-03-22 11:09:55 +00:00
ValueRaider
9c9f305b0a Merge pull request #2375 from ranaroussi/main
sync main -> dev
2025-03-20 20:44:18 +00:00
ValueRaider
509a33b71d Merge pull request #2374 from ranaroussi/dev-documented
sync Sphinx docs -> dev
2025-03-20 20:42:20 +00:00
ValueRaider
6af17db77d Merge pull request #2373 from ranaroussi/dev
sync dev -> Sphinx docs
2025-03-20 20:39:07 +00:00
Dhruvan Gnanadhandayuthapani
3a7b802b20 Add unit tests for Lookup functionality 2025-03-16 03:07:57 +01:00
Dhruvan Gnanadhandayuthapani
287536ad15 Add Lookup class for Yahoo Finance ticker lookups 2025-03-16 02:57:27 +01:00
Ran Aroussi
f09bc07ecc Update conf.py with logo 2025-02-22 22:35:52 +00:00
Ran Aroussi
dba0cba00b yfinance logos 2025-02-22 22:34:35 +00:00
ValueRaider
90053ab7c3 Merge pull request #2265 from ranaroussi/dev
sync dev -> Sphinx docs
2025-02-15 18:14:33 +00:00
ValueRaider
92c3b9066a Merge pull request #2229 from ranaroussi/dev
sync dev -> Sphinx docs #3
2025-01-18 16:29:12 +00:00
ValueRaider
94327906df Merge pull request #2228 from ranaroussi/dev
sync dev -> Sphinx docs #2
2025-01-18 16:09:40 +00:00
ValueRaider
07474123a2 Merge pull request #2227 from ranaroussi/dev
sync dev -> Sphinx docs
2025-01-18 16:01:19 +00:00
ValueRaider
2fc0907217 Merge pull request #2170 from ranaroussi/dev
trigger doc update
2024-12-08 11:13:28 +00:00
37 changed files with 1174 additions and 436 deletions


@@ -1,6 +1,17 @@
Change Log
===========
0.2.56
------
Features:
- Ticker lookups #2364
- Config #2391
Fixes:
- converting end epoch to localized dt #2378
- info IndexError #2382
- AttributeError: module 'requests.cookies' has no attribute 'update' #2388
- fix_Yahoo_returning_live_separate() #2389
0.2.55
------
Features:


@@ -30,7 +30,7 @@
> [!TIP]
> THE NEW DOCUMENTATION WEBSITE IS NOW LIVE! 🤘
>
> Visit [**yfinance-python.org**](https://yfinance-python.org/)
> Visit [**yfinance-python.org**](https://yfinance-python.org)
---


@@ -0,0 +1,15 @@
******
Config
******

`yfinance` has a new global config for sharing common values.

Proxy
-----

Set the proxy once in the config; it affects all yfinance data fetches.

.. code-block:: python

   import yfinance as yf

   yf.set_config(proxy="PROXY_SERVER")


@@ -6,6 +6,6 @@ Advanced
   :maxdepth: 2

   logging
   proxy
   config
   caching
   multi_level_columns


@@ -1,11 +0,0 @@
************
Proxy Server
************

You can download data via a proxy:

.. code-block:: python

   msft = yf.Ticker("MSFT")
   msft.history(..., proxy="PROXY_SERVER")


@@ -0,0 +1,33 @@
import yfinance as yf

# Get All
all = yf.Lookup("AAPL").all
all = yf.Lookup("AAPL").get_all(count=100)

# Get Stocks
stock = yf.Lookup("AAPL").stock
stock = yf.Lookup("AAPL").get_stock(count=100)

# Get Mutual Funds
mutualfund = yf.Lookup("AAPL").mutualfund
mutualfund = yf.Lookup("AAPL").get_mutualfund(count=100)

# Get ETFs
etf = yf.Lookup("AAPL").etf
etf = yf.Lookup("AAPL").get_etf(count=100)

# Get Indices
index = yf.Lookup("AAPL").index
index = yf.Lookup("AAPL").get_index(count=100)

# Get Futures
future = yf.Lookup("AAPL").future
future = yf.Lookup("AAPL").get_future(count=100)

# Get Currencies
currency = yf.Lookup("AAPL").currency
currency = yf.Lookup("AAPL").get_currency(count=100)

# Get Cryptocurrencies
cryptocurrency = yf.Lookup("AAPL").cryptocurrency
cryptocurrency = yf.Lookup("AAPL").get_cryptocurrency(count=100)
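The example file pairs each `.stock`-style property with a `get_stock(count=...)` method. A minimal stand-alone sketch of that property-plus-getter pattern, with fabricated data (the class, default count, and row format here are illustrative, not yfinance's actual implementation, which returns pandas DataFrames fetched from Yahoo):

```python
class LookupSketch:
    """Toy stand-in for yf.Lookup: properties delegate to get_* methods."""

    def __init__(self, query):
        self.query = query

    def get_stock(self, count=25):
        # The real class queries Yahoo and returns a DataFrame;
        # here we fabricate rows just to show the delegation.
        return [f"{self.query}-stock-{i}" for i in range(count)]

    @property
    def stock(self):
        # The property form simply uses the getter's default count
        return self.get_stock()


lookup = LookupSketch("AAPL")
assert len(lookup.stock) == 25
assert len(lookup.get_stock(count=5)) == 5
```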


@@ -18,6 +18,7 @@ The following are the publicly available classes, and functions exposed by the `
- :attr:`Market <yfinance.Market>`: Class for accessing market summary.
- :attr:`download <yfinance.download>`: Function to download market data for multiple tickers.
- :attr:`Search <yfinance.Search>`: Class for accessing search results.
- :attr:`Lookup <yfinance.Lookup>`: Class for looking up tickers.
- :attr:`Sector <yfinance.Sector>`: Domain class for accessing sector information.
- :attr:`Industry <yfinance.Industry>`: Domain class for accessing industry information.
- :attr:`Market <yfinance.Market>`: Class for accessing market status & summary.
@@ -39,6 +40,7 @@ The following are the publicly available classes, and functions exposed by the `
yfinance.analysis
yfinance.market
yfinance.search
yfinance.lookup
yfinance.sector_industry
yfinance.screener
yfinance.functions


@@ -1,5 +1,5 @@
=====================
Search & News
Search & Lookup
=====================
.. currentmodule:: yfinance
@@ -14,9 +14,21 @@ The `Search` module allows you to access search data in a Pythonic way.
Search
Search Sample Code
The `Lookup` module allows you to look up tickers in a Pythonic way.
.. autosummary::
   :toctree: api/

   Lookup
Sample Code
------------------
The `Search` module allows you to access search data in a Pythonic way.
.. literalinclude:: examples/search.py
   :language: python
The `Lookup` module allows you to look up tickers in a Pythonic way.
.. literalinclude:: examples/lookup.py
   :language: python


@@ -1,5 +1,5 @@
{% set name = "yfinance" %}
{% set version = "0.2.55" %}
{% set version = "0.2.56" %}
package:
name: "{{ name|lower }}"
@@ -27,7 +27,6 @@ requirements:
- beautifulsoup4 >=4.11.1
- html5lib >=1.1
- peewee >=3.16.2
# - pycryptodome >=3.6.6
- pip
- python
@@ -43,7 +42,6 @@ requirements:
- beautifulsoup4 >=4.11.1
- html5lib >=1.1
- peewee >=3.16.2
# - pycryptodome >=3.6.6
- python
test:


@@ -8,8 +8,6 @@ Specific test class:
python -m unittest tests.cache.TestCache
"""
from unittest import TestSuite
from tests.context import yfinance as yf
import unittest
@@ -48,46 +46,5 @@ class TestCache(unittest.TestCase):
self.assertTrue(os.path.exists(os.path.join(self.tempCacheDir.name, "tkr-tz.db")))
class TestCacheNoPermission(unittest.TestCase):
@classmethod
def setUpClass(cls):
if os.name == "nt": # Windows
cls.cache_path = "C:\\Windows\\System32\\yf-cache"
else: # Unix/Linux/MacOS
# Use a non-writable root directory
cls.cache_path = "/yf-cache"
yf.set_tz_cache_location(cls.cache_path)
def test_tzCacheRootStore(self):
# Test that if cache path in read-only filesystem, no exception.
tkr = 'AMZN'
tz1 = "America/New_York"
# During attempt to store, will discover cannot write
yf.cache.get_tz_cache().store(tkr, tz1)
# Handling the store failure replaces cache with a dummy
cache = yf.cache.get_tz_cache()
self.assertTrue(cache.dummy)
cache.store(tkr, tz1)
def test_tzCacheRootLookup(self):
# Test that if cache path in read-only filesystem, no exception.
tkr = 'AMZN'
# During attempt to lookup, will discover cannot write
yf.cache.get_tz_cache().lookup(tkr)
# Handling the lookup failure replaces cache with a dummy
cache = yf.cache.get_tz_cache()
self.assertTrue(cache.dummy)
cache.lookup(tkr)
def suite():
ts: TestSuite = unittest.TestSuite()
ts.addTest(TestCache('Test cache'))
ts.addTest(TestCacheNoPermission('Test cache no permission'))
return ts
if __name__ == '__main__':
unittest.main()


@@ -0,0 +1,52 @@
"""
Tests for cache
To run all tests in suite from commandline:
python -m unittest tests.cache
Specific test class:
python -m unittest tests.cache.TestCache
"""
from tests.context import yfinance as yf
import unittest
import os
class TestCacheNoPermission(unittest.TestCase):
@classmethod
def setUpClass(cls):
if os.name == "nt": # Windows
cls.cache_path = "C:\\Windows\\System32\\yf-cache"
else: # Unix/Linux/MacOS
# Use a non-writable root directory
cls.cache_path = "/yf-cache"
yf.set_tz_cache_location(cls.cache_path)
def test_tzCacheRootStore(self):
# Test that if cache path in read-only filesystem, no exception.
tkr = 'AMZN'
tz1 = "America/New_York"
# During attempt to store, will discover cannot write
yf.cache.get_tz_cache().store(tkr, tz1)
# Handling the store failure replaces cache with a dummy
cache = yf.cache.get_tz_cache()
self.assertTrue(cache.dummy)
cache.store(tkr, tz1)
def test_tzCacheRootLookup(self):
# Test that if cache path in read-only filesystem, no exception.
tkr = 'AMZN'
# During attempt to lookup, will discover cannot write
yf.cache.get_tz_cache().lookup(tkr)
# Handling the lookup failure replaces cache with a dummy
cache = yf.cache.get_tz_cache()
self.assertTrue(cache.dummy)
cache.lookup(tkr)
if __name__ == '__main__':
unittest.main()

tests/test_lookup.py Normal file

@@ -0,0 +1,69 @@
import unittest
import pandas as pd
from tests.context import yfinance as yf, session_gbl
class TestLookup(unittest.TestCase):
def setUp(self):
self.query = "A" # Generic query to make sure all lookup types are returned
self.lookup = yf.Lookup(query=self.query, session=session_gbl)
def test_get_all(self):
result = self.lookup.get_all(count=5)
self.assertIsInstance(result, pd.DataFrame)
self.assertEqual(len(result), 5)
def test_get_stock(self):
result = self.lookup.get_stock(count=5)
self.assertIsInstance(result, pd.DataFrame)
self.assertEqual(len(result), 5)
def test_get_mutualfund(self):
result = self.lookup.get_mutualfund(count=5)
self.assertIsInstance(result, pd.DataFrame)
self.assertEqual(len(result), 5)
def test_get_etf(self):
result = self.lookup.get_etf(count=5)
self.assertIsInstance(result, pd.DataFrame)
self.assertEqual(len(result), 5)
def test_get_index(self):
result = self.lookup.get_index(count=5)
self.assertIsInstance(result, pd.DataFrame)
self.assertEqual(len(result), 5)
def test_get_future(self):
result = self.lookup.get_future(count=5)
self.assertIsInstance(result, pd.DataFrame)
self.assertEqual(len(result), 5)
def test_get_currency(self):
result = self.lookup.get_currency(count=5)
self.assertIsInstance(result, pd.DataFrame)
self.assertEqual(len(result), 5)
def test_get_cryptocurrency(self):
result = self.lookup.get_cryptocurrency(count=5)
self.assertIsInstance(result, pd.DataFrame)
self.assertEqual(len(result), 5)
def test_large_all(self):
result = self.lookup.get_all(count=1000)
self.assertIsInstance(result, pd.DataFrame)
self.assertEqual(len(result), 1000)
if __name__ == "__main__":
unittest.main()


@@ -66,7 +66,7 @@ class TestPriceHistory(unittest.TestCase):
tkrs = ["IMP.JO", "BHG.JO", "SSW.JO", "BP.L", "INTC"]
for tkr in tkrs:
dat = yf.Ticker(tkr, session=self.session)
tz = dat._get_ticker_tz(proxy=None, timeout=None)
tz = dat._get_ticker_tz(timeout=None)
dt_utc = _pd.Timestamp.utcnow()
dt = dt_utc.astimezone(_tz.timezone(tz))
@@ -86,7 +86,7 @@ class TestPriceHistory(unittest.TestCase):
test_run = False
for tkr in tkrs:
dat = yf.Ticker(tkr, session=self.session)
tz = dat._get_ticker_tz(proxy=None, timeout=None)
tz = dat._get_ticker_tz(timeout=None)
dt_utc = _pd.Timestamp.utcnow()
dt = dt_utc.astimezone(_tz.timezone(tz))
@@ -112,7 +112,7 @@ class TestPriceHistory(unittest.TestCase):
test_run = False
for tkr in tkrs:
dat = yf.Ticker(tkr, session=self.session)
tz = dat._get_ticker_tz(proxy=None, timeout=None)
tz = dat._get_ticker_tz(timeout=None)
dt = _tz.timezone(tz).localize(_dt.datetime.now())
if dt.date().weekday() not in [1, 2, 3, 4]:


@@ -80,8 +80,6 @@ class TestTicker(unittest.TestCase):
def setUpClass(cls):
cls.session = session_gbl
cls.proxy = None
@classmethod
def tearDownClass(cls):
if cls.session is not None:
@@ -95,7 +93,7 @@ class TestTicker(unittest.TestCase):
# Test:
dat = yf.Ticker(tkr, session=self.session)
tz = dat._get_ticker_tz(proxy=None, timeout=5)
tz = dat._get_ticker_tz(timeout=5)
self.assertIsNotNone(tz)
@@ -227,10 +225,10 @@ class TestTicker(unittest.TestCase):
def test_goodTicker_withProxy(self):
tkr = "IBM"
dat = yf.Ticker(tkr, session=self.session, proxy=self.proxy)
dat = yf.Ticker(tkr, session=self.session)
dat._fetch_ticker_tz(proxy=None, timeout=5)
dat._get_ticker_tz(proxy=None, timeout=5)
dat._fetch_ticker_tz(timeout=5)
dat._get_ticker_tz(timeout=5)
dat.history(period="5d")
for attribute_name, attribute_type in ticker_attributes:
@@ -986,6 +984,7 @@ class TestTickerInfo(unittest.TestCase):
self.symbols.append("QCSTIX") # good for testing, doesn't trade
self.symbols += ["BTC-USD", "IWO", "VFINX", "^GSPC"]
self.symbols += ["SOKE.IS", "ADS.DE"] # detected bugs
self.symbols += ["EXTO", "NEPT" ] # Issues 2343 and 2363
self.tickers = [yf.Ticker(s, session=self.session) for s in self.symbols]
def tearDown(self):
@@ -1016,6 +1015,37 @@ class TestTickerInfo(unittest.TestCase):
data2 = self.tickers[2].info
self.assertIsInstance(data2['trailingPegRatio'], float)
def test_isin_info(self):
isin_list = {"ES0137650018": True,
"does_not_exist": True, # Nonexistent but doesn't raise an error
"INF209K01EN2": True,
"INX846K01K35": False, # Nonexistent and raises an error
"INF846K01K35": True
}
for isin in isin_list:
if not isin_list[isin]:
with self.assertRaises(ValueError) as context:
ticker = yf.Ticker(isin)
self.assertIn(str(context.exception), [ f"Invalid ISIN number: {isin}", "Empty tickername" ])
else:
ticker = yf.Ticker(isin)
ticker.info
def test_empty_info(self):
# Test issue 2343 (Empty result _fetch)
data = self.tickers[10].info
self.assertCountEqual(['quoteType', 'symbol', 'underlyingSymbol', 'uuid', 'maxAge', 'trailingPegRatio'], data.keys())
self.assertIn("trailingPegRatio", data.keys(), "Did not find expected key 'trailingPegRatio' in info dict")
# Test issue 2363 (Empty QuoteResponse)
data = self.tickers[11].info
expected_keys = ['maxAge', 'priceHint', 'previousClose', 'open', 'dayLow', 'dayHigh', 'regularMarketPreviousClose',
'regularMarketOpen', 'regularMarketDayLow', 'regularMarketDayHigh', 'volume', 'regularMarketVolume',
'bid', 'ask', 'bidSize', 'askSize', 'fiftyTwoWeekLow', 'fiftyTwoWeekHigh', 'currency', 'tradeable',
'exchange', 'quoteType', 'symbol', 'underlyingSymbol', 'shortName', 'timeZoneFullName', 'timeZoneShortName',
'uuid', 'gmtOffSetMilliseconds', 'trailingPegRatio']
self.assertCountEqual(expected_keys, data.keys())
# def test_fast_info_matches_info(self):
# fast_info_keys = set()
# for ticker in self.tickers:
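The ISIN handling above depends on detecting whether a ticker string even looks like an ISIN before attempting the lookup. A rough format-only sketch of such a check (regex shape only, no checksum validation; this is an illustration, not yfinance's `utils.is_isin`):

```python
import re


def looks_like_isin(s):
    # ISIN format: 2-letter country code, 9 alphanumeric characters,
    # 1 numeric check digit. The Luhn-style checksum is NOT verified here.
    return bool(re.fullmatch(r"[A-Z]{2}[A-Z0-9]{9}[0-9]", s))


assert looks_like_isin("ES0137650018")
assert not looks_like_isin("AAPL")
```

Note the distinction the tests draw: "does_not_exist" fails this shape check outright, while "INX846K01K35" is well-formed but resolves to no ticker, which is why it raises only after the lookup.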


@@ -15,7 +15,7 @@ import pandas as pd
import unittest
from yfinance.utils import is_valid_period_format
from yfinance.utils import is_valid_period_format, _dts_in_same_interval
class TestPandas(unittest.TestCase):
@@ -61,6 +61,118 @@ class TestUtils(unittest.TestCase):
self.assertTrue(is_valid_period_format("999mo")) # Large number valid
class TestDateIntervalCheck(unittest.TestCase):
def test_same_day(self):
dt1 = pd.Timestamp("2024-10-15 10:00:00")
dt2 = pd.Timestamp("2024-10-15 14:30:00")
self.assertTrue(_dts_in_same_interval(dt1, dt2, "1d"))
def test_different_days(self):
dt1 = pd.Timestamp("2024-10-15 10:00:00")
dt2 = pd.Timestamp("2024-10-16 09:00:00")
self.assertFalse(_dts_in_same_interval(dt1, dt2, "1d"))
def test_same_week_mid_week(self):
# Wednesday and Friday in same week
dt1 = pd.Timestamp("2024-10-16") # Wednesday
dt2 = pd.Timestamp("2024-10-18") # Friday
self.assertTrue(_dts_in_same_interval(dt1, dt2, "1wk"))
def test_different_weeks(self):
dt1 = pd.Timestamp("2024-10-14") # Monday week 42
dt2 = pd.Timestamp("2024-10-21") # Monday week 43
self.assertFalse(_dts_in_same_interval(dt1, dt2, "1wk"))
def test_week_year_boundary(self):
# Week 52 of 2024 spans into 2025
dt1 = pd.Timestamp("2024-12-30") # Monday in week 1 (ISO calendar)
dt2 = pd.Timestamp("2025-01-03") # Friday in week 1 (ISO calendar)
self.assertTrue(_dts_in_same_interval(dt1, dt2, "1wk"))
def test_same_month(self):
dt1 = pd.Timestamp("2024-10-01")
dt2 = pd.Timestamp("2024-10-31")
self.assertTrue(_dts_in_same_interval(dt1, dt2, "1mo"))
def test_different_months(self):
dt1 = pd.Timestamp("2024-10-31")
dt2 = pd.Timestamp("2024-11-01")
self.assertFalse(_dts_in_same_interval(dt1, dt2, "1mo"))
def test_month_year_boundary(self):
dt1 = pd.Timestamp("2024-12-15")
dt2 = pd.Timestamp("2025-01-15")
self.assertFalse(_dts_in_same_interval(dt1, dt2, "1mo"))
def test_standard_quarters(self):
q1_start = datetime(2023, 1, 1)
self.assertTrue(_dts_in_same_interval(q1_start, datetime(2023, 1, 15), '3mo'))
self.assertTrue(_dts_in_same_interval(q1_start, datetime(2023, 3, 31), '3mo'))
self.assertFalse(_dts_in_same_interval(q1_start, datetime(2023, 4, 1), '3mo'))
self.assertFalse(_dts_in_same_interval(q1_start, datetime(2022, 1, 15), '3mo')) # Previous year
self.assertFalse(_dts_in_same_interval(q1_start, datetime(2024, 1, 15), '3mo')) # Next year
q2_start = datetime(2023, 4, 1)
self.assertTrue(_dts_in_same_interval(q2_start, datetime(2023, 5, 15), '3mo'))
self.assertTrue(_dts_in_same_interval(q2_start, datetime(2023, 6, 30), '3mo'))
self.assertFalse(_dts_in_same_interval(q2_start, datetime(2023, 7, 1), '3mo'))
def test_nonstandard_quarters(self):
q1_start = datetime(2023, 2, 1)
# Same quarter
self.assertTrue(_dts_in_same_interval(q1_start, datetime(2023, 3, 1), '3mo'))
self.assertTrue(_dts_in_same_interval(q1_start, datetime(2023, 4, 25), '3mo'))
# Different quarters
self.assertFalse(_dts_in_same_interval(q1_start, datetime(2023, 1, 25), '3mo')) # Before quarter start
self.assertFalse(_dts_in_same_interval(q1_start, datetime(2023, 6, 1), '3mo')) # Start of next quarter
self.assertFalse(_dts_in_same_interval(q1_start, datetime(2023, 9, 1), '3mo')) # Start of Q3
q2_start = datetime(2023, 5, 1)
self.assertTrue(_dts_in_same_interval(q2_start, datetime(2023, 6, 1), '3mo'))
self.assertTrue(_dts_in_same_interval(q2_start, datetime(2023, 7, 25), '3mo'))
self.assertFalse(_dts_in_same_interval(q2_start, datetime(2023, 8, 1), '3mo'))
def test_cross_year_quarters(self):
q4_start = datetime(2023, 11, 1)
# Same quarter, different year
self.assertTrue(_dts_in_same_interval(q4_start, datetime(2023, 11, 15), '3mo'))
self.assertTrue(_dts_in_same_interval(q4_start, datetime(2024, 1, 15), '3mo'))
self.assertTrue(_dts_in_same_interval(q4_start, datetime(2024, 1, 25), '3mo'))
# Different quarters
self.assertFalse(_dts_in_same_interval(q4_start, datetime(2024, 2, 1), '3mo')) # Start of next quarter
self.assertFalse(_dts_in_same_interval(q4_start, datetime(2023, 10, 14), '3mo')) # Before quarter start
def test_hourly_interval(self):
dt1 = pd.Timestamp("2024-10-15 14:00:00")
dt2 = pd.Timestamp("2024-10-15 14:59:59")
self.assertTrue(_dts_in_same_interval(dt1, dt2, "1h"))
dt3 = pd.Timestamp("2024-10-15 15:00:00")
self.assertFalse(_dts_in_same_interval(dt1, dt3, "1h"))
def test_custom_intervals(self):
# Test 4 hour interval
dt1 = pd.Timestamp("2024-10-15 10:00:00")
dt2 = pd.Timestamp("2024-10-15 13:59:59")
self.assertTrue(_dts_in_same_interval(dt1, dt2, "4h"))
dt3 = pd.Timestamp("2024-10-15 14:00:00")
self.assertFalse(_dts_in_same_interval(dt1, dt3, "4h"))
def test_minute_intervals(self):
dt1 = pd.Timestamp("2024-10-15 10:30:00")
dt2 = pd.Timestamp("2024-10-15 10:30:45")
self.assertTrue(_dts_in_same_interval(dt1, dt2, "1min"))
dt3 = pd.Timestamp("2024-10-15 10:31:00")
self.assertFalse(_dts_in_same_interval(dt1, dt3, "1min"))
if __name__ == "__main__":
unittest.main()
def suite():
ts: TestSuite = unittest.TestSuite()
ts.addTest(TestPandas("Test pandas"))
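The `_dts_in_same_interval` tests above suggest a bucket-comparison semantic for calendar intervals: two datetimes match when they map to the same day, ISO week, or month. A simplified stdlib-only sketch consistent with those cases (yfinance's actual helper also handles intraday and quarter intervals, with different anchoring, so this is an assumption-laden illustration):

```python
from datetime import datetime


def dts_in_same_interval(dt1, dt2, interval):
    # Map each datetime to its calendar bucket and compare buckets.
    # Hypothetical sketch -- not yfinance's actual implementation.
    if interval == "1d":
        return dt1.date() == dt2.date()
    if interval == "1wk":
        # ISO calendar handles the year boundary: 2024-12-30 and
        # 2025-01-03 both fall in ISO week 1 of 2025
        return dt1.isocalendar()[:2] == dt2.isocalendar()[:2]
    if interval == "1mo":
        return (dt1.year, dt1.month) == (dt2.year, dt2.month)
    raise ValueError(f"unsupported interval: {interval}")


assert dts_in_same_interval(datetime(2024, 12, 30), datetime(2025, 1, 3), "1wk")
```

Using the ISO calendar for "1wk" is what makes the year-boundary test pass without special-casing late December.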


@@ -21,6 +21,7 @@
from . import version
from .search import Search
from .lookup import Lookup
from .ticker import Ticker
from .tickers import Tickers
from .multi import download
@@ -29,6 +30,7 @@ from .cache import set_tz_cache_location
from .domain.sector import Sector
from .domain.industry import Industry
from .domain.market import Market
from .data import YfData
from .screener.query import EquityQuery, FundQuery
from .screener.screener import screen, PREDEFINED_SCREENER_QUERIES
@@ -39,6 +41,13 @@ __author__ = "Ran Aroussi"
import warnings
warnings.filterwarnings('default', category=DeprecationWarning, module='^yfinance')
__all__ = ['download', 'Market', 'Search', 'Ticker', 'Tickers', 'enable_debug_mode', 'set_tz_cache_location', 'Sector', 'Industry']
__all__ = ['download', 'Market', 'Search', 'Lookup', 'Ticker', 'Tickers', 'enable_debug_mode', 'set_tz_cache_location', 'Sector', 'Industry']
# screener stuff:
__all__ += ['EquityQuery', 'FundQuery', 'screen', 'PREDEFINED_SCREENER_QUERIES']
# Config stuff:
_NOTSET=object()
def set_config(proxy=_NOTSET):
if proxy is not _NOTSET:
YfData(proxy=proxy)
__all__ += ["set_config"]


@@ -40,15 +40,14 @@ from .scrapers.quote import Quote, FastInfo
from .scrapers.history import PriceHistory
from .scrapers.funds import FundsData
from .const import _BASE_URL_, _ROOT_URL_, _QUERY1_URL_
from .const import _BASE_URL_, _ROOT_URL_, _QUERY1_URL_, _SENTINEL_
_tz_info_fetch_ctr = 0
class TickerBase:
def __init__(self, ticker, session=None, proxy=None):
def __init__(self, ticker, session=None, proxy=_SENTINEL_):
self.ticker = ticker.upper()
self.proxy = proxy
self.session = session
self._tz = None
@@ -61,11 +60,21 @@ class TickerBase:
self._earnings = None
self._financials = None
# raise an error if user tries to give empty ticker
if self.ticker == "":
raise ValueError("Empty ticker name")
# accept isin as ticker
if utils.is_isin(self.ticker):
isin = self.ticker
self.ticker = utils.get_ticker_by_isin(self.ticker, None, session)
if self.ticker == "":
raise ValueError(f"Invalid ISIN number: {isin}")
self._data: YfData = YfData(session=session)
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
# self._price_history = PriceHistory(self._data, self.ticker)
self._price_history = None # lazy-load
@@ -85,11 +94,10 @@ class TickerBase:
def _lazy_load_price_history(self):
if self._price_history is None:
self._price_history = PriceHistory(self._data, self.ticker, self._get_ticker_tz(self.proxy, timeout=10))
self._price_history = PriceHistory(self._data, self.ticker, self._get_ticker_tz(timeout=10))
return self._price_history
def _get_ticker_tz(self, proxy, timeout):
proxy = proxy or self.proxy
def _get_ticker_tz(self, timeout):
if self._tz is not None:
return self._tz
c = cache.get_tz_cache()
@@ -101,7 +109,7 @@ class TickerBase:
tz = None
if tz is None:
tz = self._fetch_ticker_tz(proxy, timeout)
tz = self._fetch_ticker_tz(timeout)
if tz is None:
# _fetch_ticker_tz works in 99.999% of cases.
# For rare fail get from info.
@@ -123,9 +131,8 @@ class TickerBase:
return tz
@utils.log_indent_decorator
def _fetch_ticker_tz(self, proxy, timeout):
def _fetch_ticker_tz(self, timeout):
# Query Yahoo for fast price data just to get returned timezone
proxy = proxy or self.proxy
logger = utils.get_yf_logger()
params = {"range": "1d", "interval": "1d"}
@@ -134,7 +141,7 @@ class TickerBase:
url = f"{_BASE_URL_}/v8/finance/chart/{self.ticker}"
try:
data = self._data.cache_get(url=url, params=params, proxy=proxy, timeout=timeout)
data = self._data.cache_get(url=url, params=params, timeout=timeout)
data = data.json()
except YFRateLimitError:
# Must propagate this
@@ -158,95 +165,136 @@ class TickerBase:
logger.debug("-------------")
return None
def get_recommendations(self, proxy=None, as_dict=False):
def get_recommendations(self, proxy=_SENTINEL_, as_dict=False):
"""
Returns a DataFrame with the recommendations
Columns: period strongBuy buy hold sell strongSell
"""
self._quote.proxy = proxy or self.proxy
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
data = self._quote.recommendations
if as_dict:
return data.to_dict()
return data
def get_recommendations_summary(self, proxy=None, as_dict=False):
return self.get_recommendations(proxy=proxy, as_dict=as_dict)
def get_recommendations_summary(self, proxy=_SENTINEL_, as_dict=False):
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
def get_upgrades_downgrades(self, proxy=None, as_dict=False):
return self.get_recommendations(as_dict=as_dict)
def get_upgrades_downgrades(self, proxy=_SENTINEL_, as_dict=False):
"""
Returns a DataFrame with the recommendations changes (upgrades/downgrades)
Index: date of grade
Columns: firm toGrade fromGrade action
"""
self._quote.proxy = proxy or self.proxy
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
data = self._quote.upgrades_downgrades
if as_dict:
return data.to_dict()
return data
def get_calendar(self, proxy=None) -> dict:
self._quote.proxy = proxy or self.proxy
def get_calendar(self, proxy=_SENTINEL_) -> dict:
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
return self._quote.calendar
def get_sec_filings(self, proxy=None) -> dict:
self._quote.proxy = proxy or self.proxy
def get_sec_filings(self, proxy=_SENTINEL_) -> dict:
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
return self._quote.sec_filings
def get_major_holders(self, proxy=None, as_dict=False):
self._holders.proxy = proxy or self.proxy
def get_major_holders(self, proxy=_SENTINEL_, as_dict=False):
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
data = self._holders.major
if as_dict:
return data.to_dict()
return data
def get_institutional_holders(self, proxy=None, as_dict=False):
self._holders.proxy = proxy or self.proxy
def get_institutional_holders(self, proxy=_SENTINEL_, as_dict=False):
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
data = self._holders.institutional
if data is not None:
if as_dict:
return data.to_dict()
return data
def get_mutualfund_holders(self, proxy=None, as_dict=False):
self._holders.proxy = proxy or self.proxy
def get_mutualfund_holders(self, proxy=_SENTINEL_, as_dict=False):
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
data = self._holders.mutualfund
if data is not None:
if as_dict:
return data.to_dict()
return data
def get_insider_purchases(self, proxy=None, as_dict=False):
self._holders.proxy = proxy or self.proxy
def get_insider_purchases(self, proxy=_SENTINEL_, as_dict=False):
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
data = self._holders.insider_purchases
if data is not None:
if as_dict:
return data.to_dict()
return data
def get_insider_transactions(self, proxy=None, as_dict=False):
self._holders.proxy = proxy or self.proxy
def get_insider_transactions(self, proxy=_SENTINEL_, as_dict=False):
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
data = self._holders.insider_transactions
if data is not None:
if as_dict:
return data.to_dict()
return data
def get_insider_roster_holders(self, proxy=None, as_dict=False):
self._holders.proxy = proxy or self.proxy
def get_insider_roster_holders(self, proxy=_SENTINEL_, as_dict=False):
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
data = self._holders.insider_roster
if data is not None:
if as_dict:
return data.to_dict()
return data
def get_info(self, proxy=None) -> dict:
self._quote.proxy = proxy or self.proxy
def get_info(self, proxy=_SENTINEL_) -> dict:
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
data = self._quote.info
return data
def get_fast_info(self, proxy=None):
def get_fast_info(self, proxy=_SENTINEL_):
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
if self._fast_info is None:
self._fast_info = FastInfo(self, proxy=proxy)
self._fast_info = FastInfo(self)
return self._fast_info
@property
@@ -254,76 +302,100 @@ class TickerBase:
warnings.warn("'Ticker.basic_info' is deprecated and will be removed in future, Switch to 'Ticker.fast_info'", DeprecationWarning)
return self.fast_info
def get_sustainability(self, proxy=None, as_dict=False):
self._quote.proxy = proxy or self.proxy
def get_sustainability(self, proxy=_SENTINEL_, as_dict=False):
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
data = self._quote.sustainability
if as_dict:
return data.to_dict()
return data
def get_analyst_price_targets(self, proxy=None) -> dict:
def get_analyst_price_targets(self, proxy=_SENTINEL_) -> dict:
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
"""
Keys: current low high mean median
"""
self._analysis.proxy = proxy or self.proxy
data = self._analysis.analyst_price_targets
return data
def get_earnings_estimate(self, proxy=None, as_dict=False):
def get_earnings_estimate(self, proxy=_SENTINEL_, as_dict=False):
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
"""
Index: 0q +1q 0y +1y
Columns: numberOfAnalysts avg low high yearAgoEps growth
"""
self._analysis.proxy = proxy or self.proxy
data = self._analysis.earnings_estimate
return data.to_dict() if as_dict else data
def get_revenue_estimate(self, proxy=None, as_dict=False):
def get_revenue_estimate(self, proxy=_SENTINEL_, as_dict=False):
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
"""
Index: 0q +1q 0y +1y
Columns: numberOfAnalysts avg low high yearAgoRevenue growth
"""
self._analysis.proxy = proxy or self.proxy
data = self._analysis.revenue_estimate
return data.to_dict() if as_dict else data
def get_earnings_history(self, proxy=None, as_dict=False):
def get_earnings_history(self, proxy=_SENTINEL_, as_dict=False):
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
"""
Index: pd.DatetimeIndex
Columns: epsEstimate epsActual epsDifference surprisePercent
"""
self._analysis.proxy = proxy or self.proxy
data = self._analysis.earnings_history
return data.to_dict() if as_dict else data
def get_eps_trend(self, proxy=None, as_dict=False):
def get_eps_trend(self, proxy=_SENTINEL_, as_dict=False):
"""
Index: 0q +1q 0y +1y
Columns: current 7daysAgo 30daysAgo 60daysAgo 90daysAgo
"""
self._analysis.proxy = proxy or self.proxy
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
data = self._analysis.eps_trend
return data.to_dict() if as_dict else data
def get_eps_revisions(self, proxy=None, as_dict=False):
def get_eps_revisions(self, proxy=_SENTINEL_, as_dict=False):
"""
Index: 0q +1q 0y +1y
Columns: upLast7days upLast30days downLast7days downLast30days
"""
self._analysis.proxy = proxy or self.proxy
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
data = self._analysis.eps_revisions
return data.to_dict() if as_dict else data
def get_growth_estimates(self, proxy=None, as_dict=False):
def get_growth_estimates(self, proxy=_SENTINEL_, as_dict=False):
"""
Index: 0q +1q 0y +1y +5y -5y
Columns: stock industry sector index
"""
self._analysis.proxy = proxy or self.proxy
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
data = self._analysis.growth_estimates
return data.to_dict() if as_dict else data
def get_earnings(self, proxy=None, as_dict=False, freq="yearly"):
def get_earnings(self, proxy=_SENTINEL_, as_dict=False, freq="yearly"):
"""
:Parameters:
as_dict: bool
@@ -332,11 +404,11 @@ class TickerBase:
freq: str
"yearly" or "quarterly" or "trailing"
Default is "yearly"
proxy: str
Optional. Proxy server URL scheme
Default is None
"""
self._fundamentals.proxy = proxy or self.proxy
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
if self._fundamentals.earnings is None:
return None
data = self._fundamentals.earnings[freq]
@@ -347,7 +419,7 @@ class TickerBase:
return dict_data
return data
def get_income_stmt(self, proxy=None, as_dict=False, pretty=False, freq="yearly"):
def get_income_stmt(self, proxy=_SENTINEL_, as_dict=False, pretty=False, freq="yearly"):
"""
:Parameters:
as_dict: bool
@@ -359,13 +431,12 @@ class TickerBase:
freq: str
"yearly" or "quarterly" or "trailing"
Default is "yearly"
proxy: str
Optional. Proxy server URL scheme
Default is None
"""
self._fundamentals.proxy = proxy or self.proxy
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
data = self._fundamentals.financials.get_income_time_series(freq=freq, proxy=proxy)
data = self._fundamentals.financials.get_income_time_series(freq=freq)
if pretty:
data = data.copy()
@@ -374,13 +445,21 @@ class TickerBase:
return data.to_dict()
return data
def get_incomestmt(self, proxy=None, as_dict=False, pretty=False, freq="yearly"):
def get_incomestmt(self, proxy=_SENTINEL_, as_dict=False, pretty=False, freq="yearly"):
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
return self.get_income_stmt(proxy, as_dict, pretty, freq)
def get_financials(self, proxy=None, as_dict=False, pretty=False, freq="yearly"):
def get_financials(self, proxy=_SENTINEL_, as_dict=False, pretty=False, freq="yearly"):
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
return self.get_income_stmt(proxy, as_dict, pretty, freq)
def get_balance_sheet(self, proxy=None, as_dict=False, pretty=False, freq="yearly"):
def get_balance_sheet(self, proxy=_SENTINEL_, as_dict=False, pretty=False, freq="yearly"):
"""
:Parameters:
as_dict: bool
@@ -392,13 +471,13 @@ class TickerBase:
freq: str
"yearly" or "quarterly"
Default is "yearly"
proxy: str
Optional. Proxy server URL scheme
Default is None
"""
self._fundamentals.proxy = proxy or self.proxy
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
data = self._fundamentals.financials.get_balance_sheet_time_series(freq=freq, proxy=proxy)
data = self._fundamentals.financials.get_balance_sheet_time_series(freq=freq)
if pretty:
data = data.copy()
@@ -407,10 +486,14 @@ class TickerBase:
return data.to_dict()
return data
def get_balancesheet(self, proxy=None, as_dict=False, pretty=False, freq="yearly"):
def get_balancesheet(self, proxy=_SENTINEL_, as_dict=False, pretty=False, freq="yearly"):
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
return self.get_balance_sheet(proxy, as_dict, pretty, freq)
def get_cash_flow(self, proxy=None, as_dict=False, pretty=False, freq="yearly") -> Union[pd.DataFrame, dict]:
def get_cash_flow(self, proxy=_SENTINEL_, as_dict=False, pretty=False, freq="yearly") -> Union[pd.DataFrame, dict]:
"""
:Parameters:
as_dict: bool
@@ -422,13 +505,13 @@ class TickerBase:
freq: str
"yearly" or "quarterly"
Default is "yearly"
proxy: str
Optional. Proxy server URL scheme
Default is None
"""
self._fundamentals.proxy = proxy or self.proxy
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
data = self._fundamentals.financials.get_cash_flow_time_series(freq=freq, proxy=proxy)
data = self._fundamentals.financials.get_cash_flow_time_series(freq=freq)
if pretty:
data = data.copy()
@@ -437,34 +520,56 @@ class TickerBase:
return data.to_dict()
return data
def get_cashflow(self, proxy=None, as_dict=False, pretty=False, freq="yearly"):
def get_cashflow(self, proxy=_SENTINEL_, as_dict=False, pretty=False, freq="yearly"):
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
return self.get_cash_flow(proxy, as_dict, pretty, freq)
def get_dividends(self, proxy=None, period="max") -> pd.Series:
return self._lazy_load_price_history().get_dividends(period=period, proxy=proxy)
def get_dividends(self, proxy=_SENTINEL_, period="max") -> pd.Series:
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
return self._lazy_load_price_history().get_dividends(period=period)
def get_capital_gains(self, proxy=None, period="max") -> pd.Series:
return self._lazy_load_price_history().get_capital_gains(period=period, proxy=proxy)
def get_capital_gains(self, proxy=_SENTINEL_, period="max") -> pd.Series:
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
return self._lazy_load_price_history().get_capital_gains(period=period)
def get_splits(self, proxy=None, period="max") -> pd.Series:
return self._lazy_load_price_history().get_splits(period=period, proxy=proxy)
def get_splits(self, proxy=_SENTINEL_, period="max") -> pd.Series:
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
return self._lazy_load_price_history().get_splits(period=period)
def get_actions(self, proxy=None, period="max") -> pd.Series:
return self._lazy_load_price_history().get_actions(period=period, proxy=proxy)
def get_actions(self, proxy=_SENTINEL_, period="max") -> pd.Series:
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
return self._lazy_load_price_history().get_actions(period=period)
def get_shares(self, proxy=None, as_dict=False) -> Union[pd.DataFrame, dict]:
self._fundamentals.proxy = proxy or self.proxy
def get_shares(self, proxy=_SENTINEL_, as_dict=False) -> Union[pd.DataFrame, dict]:
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
data = self._fundamentals.shares
if as_dict:
return data.to_dict()
return data
@utils.log_indent_decorator
def get_shares_full(self, start=None, end=None, proxy=None):
def get_shares_full(self, start=None, end=None, proxy=_SENTINEL_):
logger = utils.get_yf_logger()
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
# Process dates
tz = self._get_ticker_tz(proxy=proxy, timeout=10)
tz = self._get_ticker_tz(timeout=10)
dt_now = pd.Timestamp.utcnow().tz_convert(tz)
if start is not None:
start_ts = utils._parse_user_dt(start, tz)
@@ -486,7 +591,7 @@ class TickerBase:
ts_url_base = f"https://query2.finance.yahoo.com/ws/fundamentals-timeseries/v1/finance/timeseries/{self.ticker}?symbol={self.ticker}"
shares_url = f"{ts_url_base}&period1={int(start.timestamp())}&period2={int(end.timestamp())}"
try:
json_data = self._data.cache_get(url=shares_url, proxy=proxy)
json_data = self._data.cache_get(url=shares_url)
json_data = json_data.json()
except (_json.JSONDecodeError, requests.exceptions.RequestException):
logger.error(f"{self.ticker}: Yahoo web request for share count failed")
@@ -512,7 +617,11 @@ class TickerBase:
df = df.sort_index()
return df
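The URL construction above can be exercised in isolation; `shares_url` below is a hypothetical helper that mirrors how `get_shares_full` builds the timeseries URL, the key detail being that Yahoo expects `period1`/`period2` as integer epoch seconds:

```python
import pandas as pd

def shares_url(ticker: str, start: pd.Timestamp, end: pd.Timestamp) -> str:
    # Mirrors get_shares_full: the fundamentals-timeseries endpoint
    # takes period1/period2 as integer epoch seconds.
    base = (f"https://query2.finance.yahoo.com/ws/fundamentals-timeseries/v1/"
            f"finance/timeseries/{ticker}?symbol={ticker}")
    return f"{base}&period1={int(start.timestamp())}&period2={int(end.timestamp())}"
```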
def get_isin(self, proxy=None) -> Optional[str]:
def get_isin(self, proxy=_SENTINEL_) -> Optional[str]:
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
# *** experimental ***
if self._isin is not None:
return self._isin
@@ -525,7 +634,6 @@ class TickerBase:
q = ticker
self._quote.proxy = proxy or self.proxy
if self._quote.info is None:
# Don't print error message cause self._quote.info will print one
return None
@@ -533,7 +641,7 @@ class TickerBase:
q = self._quote.info['shortName']
url = f'https://markets.businessinsider.com/ajax/SearchController_Suggest?max_results=25&query={urlencode(q)}'
data = self._data.cache_get(url=url, proxy=proxy).text
data = self._data.cache_get(url=url).text
search_str = f'"{ticker}|'
if search_str not in data:
@@ -549,13 +657,17 @@ class TickerBase:
self._isin = data.split(search_str)[1].split('"')[0].split('|')[0]
return self._isin
def get_news(self, count=10, tab="news", proxy=None) -> list:
def get_news(self, count=10, tab="news", proxy=_SENTINEL_) -> list:
"""Allowed options for tab: "news", "all", "press releases"""
if self._news:
return self._news
logger = utils.get_yf_logger()
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
tab_queryrefs = {
"all": "newsAll",
"news": "latestNews",
@@ -574,7 +686,7 @@ class TickerBase:
}
}
data = self._data.post(url, body=payload, proxy=proxy)
data = self._data.post(url, body=payload)
if data is None or "Will be right back" in data.text:
raise RuntimeError("*** YAHOO! FINANCE IS CURRENTLY DOWN! ***\n"
"Our engineers are working quickly to resolve "
@@ -591,7 +703,7 @@ class TickerBase:
return self._news
@utils.log_indent_decorator
def get_earnings_dates(self, limit=12, proxy=None) -> Optional[pd.DataFrame]:
def get_earnings_dates(self, limit=12, proxy=_SENTINEL_) -> Optional[pd.DataFrame]:
"""
Get earnings dates (future and historic)
@@ -599,12 +711,15 @@ class TickerBase:
limit (int): max amount of upcoming and recent earnings dates to return.
Default value 12 should return next 4 quarters and last 8 quarters.
Increase if more history is needed.
proxy: requests proxy to use.
Returns:
pd.DataFrame
"""
logger = utils.get_yf_logger()
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
clamped_limit = min(limit, 100) # YF caps at 100, don't go higher
if self._earnings_dates and clamped_limit in self._earnings_dates:
@@ -627,7 +742,7 @@ class TickerBase:
"entityIdType": "earnings",
"includeFields": ["startdatetime", "timeZoneShortName", "epsestimate", "epsactual", "epssurprisepct"]
}
response = self._data.post(url, params=params, body=body, proxy=proxy)
response = self._data.post(url, params=params, body=body)
json_data = response.json()
# Extract data
@@ -643,7 +758,7 @@ class TickerBase:
# Calculate earnings date
df['Earnings Date'] = pd.to_datetime(df['Event Start Date'])
tz = self._get_ticker_tz(proxy=proxy, timeout=30)
tz = self._get_ticker_tz(timeout=30)
if df['Earnings Date'].dt.tz is None:
df['Earnings Date'] = df['Earnings Date'].dt.tz_localize(tz)
else:
@@ -661,10 +776,18 @@ class TickerBase:
self._earnings_dates[clamped_limit] = df
return df
def get_history_metadata(self, proxy=None) -> dict:
def get_history_metadata(self, proxy=_SENTINEL_) -> dict:
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
return self._lazy_load_price_history().get_history_metadata()
def get_funds_data(self, proxy=None) -> Optional[FundsData]:
def get_funds_data(self, proxy=_SENTINEL_) -> Optional[FundsData]:
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
if not self._funds_data:
self._funds_data = FundsData(self._data, self.ticker)

View File

@@ -2,6 +2,8 @@ _QUERY1_URL_ = 'https://query1.finance.yahoo.com'
_BASE_URL_ = 'https://query2.finance.yahoo.com'
_ROOT_URL_ = 'https://finance.yahoo.com'
_SENTINEL_ = object()
fundamentals_keys = {
'financials': ["TaxEffectOfUnusualItems", "TaxRateForCalcs", "NormalizedEBITDA", "NormalizedDilutedEPS",
"NormalizedBasicEPS", "TotalUnusualItems", "TotalUnusualItemsExcludingGoodwill",

View File

@@ -51,7 +51,13 @@ class SingletonMeta(type):
instance = super().__call__(*args, **kwargs)
cls._instances[cls] = instance
else:
cls._instances[cls]._set_session(*args, **kwargs)
# Update the existing instance
if 'session' in kwargs or (args and len(args) > 0):
session = kwargs.get('session') if 'session' in kwargs else args[0]
cls._instances[cls]._set_session(session)
if 'proxy' in kwargs or (args and len(args) > 1):
proxy = kwargs.get('proxy') if 'proxy' in kwargs else args[1]
cls._instances[cls]._set_proxy(proxy)
return cls._instances[cls]
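With the metaclass change above, re-calling the constructor now reconfigures the single shared instance rather than silently returning it unchanged. A reduced sketch of that behaviour (class names hypothetical, attribute updates simplified from `_set_session`/`_set_proxy`):

```python
class SingletonMeta(type):
    _instances = {}

    def __call__(cls, *args, **kwargs):
        if cls not in cls._instances:
            cls._instances[cls] = super().__call__(*args, **kwargs)
        else:
            # Repeat construction updates the existing instance in place.
            inst = cls._instances[cls]
            if 'session' in kwargs or len(args) > 0:
                inst.session = kwargs.get('session', args[0] if args else None)
            if 'proxy' in kwargs or len(args) > 1:
                inst.proxy = kwargs.get('proxy', args[1] if len(args) > 1 else None)
        return cls._instances[cls]

class Data(metaclass=SingletonMeta):
    def __init__(self, session=None, proxy=None):
        self.session, self.proxy = session, proxy
```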
@@ -64,7 +70,7 @@ class YfData(metaclass=SingletonMeta):
'User-Agent': random.choice(USER_AGENTS)
}
def __init__(self, session=None):
def __init__(self, session=None, proxy=None):
self._crumb = None
self._cookie = None
@@ -75,7 +81,9 @@ class YfData(metaclass=SingletonMeta):
self._cookie_lock = threading.Lock()
self._session, self._proxy = None, None
self._set_session(session or requests.Session())
self._set_proxy(proxy)
utils.get_yf_logger().debug(f"Using User-Agent: {self.user_agent_headers['User-Agent']}")
@@ -84,6 +92,8 @@ class YfData(metaclass=SingletonMeta):
return
with self._cookie_lock:
self._session = session
if self._proxy is not None:
self._session.proxies = self._proxy
try:
self._session.cache
@@ -98,6 +108,15 @@ class YfData(metaclass=SingletonMeta):
from requests_cache import DO_NOT_CACHE
self._expire_after = DO_NOT_CACHE
def _set_proxy(self, proxy=None):
with self._cookie_lock:
if proxy is not None:
proxy = {'http': proxy, 'https': proxy} if isinstance(proxy, str) else proxy
else:
proxy = {}
self._proxy = proxy
self._session.proxies = proxy
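One detail worth noting in `_set_proxy` above: a bare URL string is expanded into the per-scheme mapping `requests` expects, while `None` resets `session.proxies` to `{}`. The normalization rule on its own (helper name hypothetical):

```python
def normalize_proxy(proxy):
    # Same rule as YfData._set_proxy: str -> per-scheme mapping,
    # None -> empty dict (clears any previously configured proxy),
    # an existing dict passes through unchanged.
    if proxy is None:
        return {}
    return {'http': proxy, 'https': proxy} if isinstance(proxy, str) else proxy
```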
def _set_cookie_strategy(self, strategy, have_lock=False):
if strategy == self._cookie_strategy:
return
@@ -154,7 +173,7 @@ class YfData(metaclass=SingletonMeta):
utils.get_yf_logger().debug('loaded persistent cookie')
return cookie_dict['cookie']
def _get_cookie_basic(self, proxy=None, timeout=30):
def _get_cookie_basic(self, timeout=30):
if self._cookie is not None:
utils.get_yf_logger().debug('reusing cookie')
return self._cookie
@@ -168,7 +187,6 @@ class YfData(metaclass=SingletonMeta):
response = self._session.get(
url='https://fc.yahoo.com',
headers=self.user_agent_headers,
proxies=proxy,
timeout=timeout,
allow_redirects=True)
@@ -183,7 +201,7 @@ class YfData(metaclass=SingletonMeta):
utils.get_yf_logger().debug(f"fetched basic cookie = {self._cookie}")
return self._cookie
def _get_crumb_basic(self, proxy=None, timeout=30):
def _get_crumb_basic(self, timeout=30):
if self._crumb is not None:
utils.get_yf_logger().debug('reusing crumb')
return self._crumb
@@ -197,7 +215,6 @@ class YfData(metaclass=SingletonMeta):
'url': "https://query1.finance.yahoo.com/v1/test/getcrumb",
'headers': self.user_agent_headers,
'cookies': {cookie.name: cookie.value},
'proxies': proxy,
'timeout': timeout,
'allow_redirects': True
}
@@ -215,12 +232,12 @@ class YfData(metaclass=SingletonMeta):
return self._crumb
@utils.log_indent_decorator
def _get_cookie_and_crumb_basic(self, proxy, timeout):
cookie = self._get_cookie_basic(proxy, timeout)
crumb = self._get_crumb_basic(proxy, timeout)
def _get_cookie_and_crumb_basic(self, timeout):
cookie = self._get_cookie_basic(timeout)
crumb = self._get_crumb_basic(timeout)
return cookie, crumb
def _get_cookie_csrf(self, proxy, timeout):
def _get_cookie_csrf(self, timeout):
if self._cookie is not None:
utils.get_yf_logger().debug('reusing cookie')
return True
@@ -232,7 +249,6 @@ class YfData(metaclass=SingletonMeta):
base_args = {
'headers': self.user_agent_headers,
'proxies': proxy,
'timeout': timeout}
get_args = {**base_args, 'url': 'https://guce.yahoo.com/consent'}
@@ -291,21 +307,20 @@ class YfData(metaclass=SingletonMeta):
return True
@utils.log_indent_decorator
def _get_crumb_csrf(self, proxy=None, timeout=30):
def _get_crumb_csrf(self, timeout=30):
# Credit goes to @bot-unit #1729
if self._crumb is not None:
utils.get_yf_logger().debug('reusing crumb')
return self._crumb
if not self._get_cookie_csrf(proxy, timeout):
if not self._get_cookie_csrf(timeout):
# This cookie stored in session
return None
get_args = {
'url': 'https://query2.finance.yahoo.com/v1/test/getcrumb',
'headers': self.user_agent_headers,
'proxies': proxy,
'timeout': timeout}
if self._session_is_caching:
get_args['expire_after'] = self._expire_after
@@ -322,7 +337,7 @@ class YfData(metaclass=SingletonMeta):
return self._crumb
@utils.log_indent_decorator
def _get_cookie_and_crumb(self, proxy=None, timeout=30):
def _get_cookie_and_crumb(self, timeout=30):
cookie, crumb, strategy = None, None, None
utils.get_yf_logger().debug(f"cookie_mode = '{self._cookie_strategy}'")
@@ -333,10 +348,10 @@ class YfData(metaclass=SingletonMeta):
if crumb is None:
# Fail
self._set_cookie_strategy('basic', have_lock=True)
cookie, crumb = self._get_cookie_and_crumb_basic(proxy, timeout)
cookie, crumb = self._get_cookie_and_crumb_basic(timeout)
else:
# Fallback strategy
cookie, crumb = self._get_cookie_and_crumb_basic(proxy, timeout)
cookie, crumb = self._get_cookie_and_crumb_basic(timeout)
if cookie is None or crumb is None:
# Fail
self._set_cookie_strategy('csrf', have_lock=True)
@@ -345,15 +360,15 @@ class YfData(metaclass=SingletonMeta):
return cookie, crumb, strategy
@utils.log_indent_decorator
def get(self, url, user_agent_headers=None, params=None, proxy=None, timeout=30):
return self._make_request(url, request_method = self._session.get, user_agent_headers=user_agent_headers, params=params, proxy=proxy, timeout=timeout)
def get(self, url, user_agent_headers=None, params=None, timeout=30):
return self._make_request(url, request_method = self._session.get, user_agent_headers=user_agent_headers, params=params, timeout=timeout)
@utils.log_indent_decorator
def post(self, url, body, user_agent_headers=None, params=None, proxy=None, timeout=30):
return self._make_request(url, request_method = self._session.post, user_agent_headers=user_agent_headers, body=body, params=params, proxy=proxy, timeout=timeout)
def post(self, url, body, user_agent_headers=None, params=None, timeout=30):
return self._make_request(url, request_method = self._session.post, user_agent_headers=user_agent_headers, body=body, params=params, timeout=timeout)
@utils.log_indent_decorator
def _make_request(self, url, request_method, user_agent_headers=None, body=None, params=None, proxy=None, timeout=30):
def _make_request(self, url, request_method, user_agent_headers=None, body=None, params=None, timeout=30):
# Important: treat input arguments as immutable.
if len(url) > 200:
@@ -361,7 +376,6 @@ class YfData(metaclass=SingletonMeta):
else:
utils.get_yf_logger().debug(f'url={url}')
utils.get_yf_logger().debug(f'params={params}')
proxy = self._get_proxy(proxy)
if params is None:
params = {}
@@ -383,7 +397,6 @@ class YfData(metaclass=SingletonMeta):
'url': url,
'params': {**params, **crumbs},
'cookies': cookies,
'proxies': proxy,
'timeout': timeout,
'headers': user_agent_headers or self.user_agent_headers
}
@@ -399,7 +412,7 @@ class YfData(metaclass=SingletonMeta):
self._set_cookie_strategy('csrf')
else:
self._set_cookie_strategy('basic')
cookie, crumb, strategy = self._get_cookie_and_crumb(proxy, timeout)
cookie, crumb, strategy = self._get_cookie_and_crumb(timeout)
request_args['params']['crumb'] = crumb
if strategy == 'basic':
request_args['cookies'] = {cookie.name: cookie.value}
@@ -414,19 +427,11 @@ class YfData(metaclass=SingletonMeta):
@lru_cache_freezeargs
@lru_cache(maxsize=cache_maxsize)
def cache_get(self, url, user_agent_headers=None, params=None, proxy=None, timeout=30):
return self.get(url, user_agent_headers, params, proxy, timeout)
def cache_get(self, url, user_agent_headers=None, params=None, timeout=30):
return self.get(url, user_agent_headers, params, timeout)
def _get_proxy(self, proxy):
# setup proxy in requests format
if proxy is not None:
if isinstance(proxy, (dict, frozendict)) and "https" in proxy:
proxy = proxy["https"]
proxy = {"https": proxy}
return proxy
def get_raw_json(self, url, user_agent_headers=None, params=None, proxy=None, timeout=30):
def get_raw_json(self, url, user_agent_headers=None, params=None, timeout=30):
utils.get_yf_logger().debug(f'get_raw_json(): {url}')
response = self.get(url, user_agent_headers=user_agent_headers, params=params, proxy=proxy, timeout=timeout)
response = self.get(url, user_agent_headers=user_agent_headers, params=params, timeout=timeout)
response.raise_for_status()
return response.json()
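`cache_get` above remains wrapped in `lru_cache_freezeargs` + `lru_cache`; the idea is that unhashable dict arguments must be frozen before `lru_cache` can memoize the call. A simplified standalone sketch (the real decorator freezes to `frozendict`; sorted tuples are used here for brevity):

```python
from functools import lru_cache, wraps

def freezeargs(func):
    # lru_cache needs hashable arguments; freeze any dict into a sorted tuple.
    @wraps(func)
    def wrapper(*args, **kwargs):
        frozen_args = tuple(tuple(sorted(a.items())) if isinstance(a, dict) else a
                            for a in args)
        frozen_kwargs = {k: tuple(sorted(v.items())) if isinstance(v, dict) else v
                         for k, v in kwargs.items()}
        return func(*frozen_args, **frozen_kwargs)
    return wrapper

@freezeargs
@lru_cache(maxsize=8)
def fetch(url, params=None):
    fetch.calls += 1          # counts real (uncached) invocations
    return (url, params)

fetch.calls = 0
```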

View File

@@ -1,7 +1,8 @@
from abc import ABC, abstractmethod
from ..ticker import Ticker
from ..const import _QUERY1_URL_
from ..const import _QUERY1_URL_, _SENTINEL_
from ..data import YfData
from ..utils import print_once
from typing import Dict, List, Optional
import pandas as _pd
@@ -13,20 +14,21 @@ class Domain(ABC):
and methods for fetching and parsing data. Derived classes must implement the `_fetch_and_parse()` method.
"""
def __init__(self, key: str, session=None, proxy=None):
def __init__(self, key: str, session=None, proxy=_SENTINEL_):
"""
Initializes the Domain object with a key, session, and proxy.
Args:
key (str): Unique key identifying the domain entity.
session (Optional[requests.Session]): Session object for HTTP requests. Defaults to None.
proxy (Optional[Dict]): Proxy settings. Defaults to None.
"""
self._key: str = key
self.proxy = proxy
self.session = session
self._data: YfData = YfData(session=session)
if proxy is not _SENTINEL_:
print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
self._name: Optional[str] = None
self._symbol: Optional[str] = None
self._overview: Optional[Dict] = None
@@ -109,19 +111,18 @@ class Domain(ABC):
self._ensure_fetched(self._research_reports)
return self._research_reports
def _fetch(self, query_url, proxy) -> Dict:
def _fetch(self, query_url) -> Dict:
"""
Fetches data from the given query URL.
Args:
query_url (str): The URL used for the data query.
proxy (Dict): Proxy settings for the request.
Returns:
Dict: The JSON response data from the request.
"""
params_dict = {"formatted": "true", "withReturns": "true", "lang": "en-US", "region": "US"}
result = self._data.get_raw_json(query_url, user_agent_headers=self._data.user_agent_headers, params=params_dict, proxy=proxy)
result = self._data.get_raw_json(query_url, user_agent_headers=self._data.user_agent_headers, params=params_dict)
return result
def _parse_and_assign_common(self, data) -> None:
@@ -194,4 +195,4 @@ class Domain(ABC):
attribute: The attribute to check and potentially fetch.
"""
if attribute is None:
self._fetch_and_parse()
self._fetch_and_parse()

View File

@@ -5,20 +5,24 @@ import pandas as _pd
from .domain import Domain, _QUERY_URL_
from .. import utils
from ..data import YfData
from ..const import _SENTINEL_
class Industry(Domain):
"""
Represents an industry within a sector.
"""
def __init__(self, key, session=None, proxy=None):
def __init__(self, key, session=None, proxy=_SENTINEL_):
"""
Args:
key (str): The key identifier for the industry.
session (optional): The session to use for requests.
proxy (optional): The proxy to use for requests.
"""
super(Industry, self).__init__(key, session, proxy)
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
YfData(session=session, proxy=proxy)
super(Industry, self).__init__(key, session)
self._query_url = f'{_QUERY_URL_}/industries/{self._key}'
self._sector_key = None
@@ -129,7 +133,7 @@ class Industry(Domain):
result = None
try:
result = self._fetch(self._query_url, self.proxy)
result = self._fetch(self._query_url)
data = result['data']
self._parse_and_assign_common(data)
@@ -145,4 +149,4 @@ class Industry(Domain):
logger.debug("Got response: ")
logger.debug("-------------")
logger.debug(f" {result}")
logger.debug("-------------")
logger.debug("-------------")

View File

@@ -2,24 +2,27 @@ import datetime as dt
from ..data import YfData
from .. import utils
from ..const import _QUERY1_URL_
from ..const import _QUERY1_URL_, _SENTINEL_
import json as _json
class Market:
def __init__(self, market:'str', session=None, proxy=None, timeout=30):
def __init__(self, market:'str', session=None, proxy=_SENTINEL_, timeout=30):
self.market = market
self.session = session
self.proxy = proxy
self.timeout = timeout
self._data = YfData(session=self.session)
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
self._logger = utils.get_yf_logger()
self._status = None
self._summary = None
def _fetch_json(self, url, params):
data = self._data.cache_get(url=url, params=params, proxy=self.proxy, timeout=self.timeout)
data = self._data.cache_get(url=url, params=params, timeout=self.timeout)
if data is None or "Will be right back" in data.text:
raise RuntimeError("*** YAHOO! FINANCE IS CURRENTLY DOWN! ***\n"
"Our engineers are working quickly to resolve "

View File

@@ -1,12 +1,13 @@
from __future__ import print_function
from typing import Dict, Optional
from ..utils import dynamic_docstring, generate_list_table_from_dict
from ..const import SECTOR_INDUSTY_MAPPING
from ..const import SECTOR_INDUSTY_MAPPING, _SENTINEL_
import pandas as _pd
from .domain import Domain, _QUERY_URL_
from .. import utils
from ..data import YfData
class Sector(Domain):
"""
@@ -14,7 +15,7 @@ class Sector(Domain):
such as top ETFs, top mutual funds, and industry data.
"""
def __init__(self, key, session=None, proxy=None):
def __init__(self, key, session=None, proxy=_SENTINEL_):
"""
Args:
key (str): The key representing the sector.
@@ -26,7 +27,11 @@ class Sector(Domain):
:attr:`Sector.industries <yfinance.Sector.industries>`
Map of sector and industry
"""
super(Sector, self).__init__(key, session, proxy)
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
YfData(session=session, proxy=proxy)
super(Sector, self).__init__(key, session)
self._query_url: str = f'{_QUERY_URL_}/sectors/{self._key}'
self._top_etfs: Optional[Dict] = None
self._top_mutual_funds: Optional[Dict] = None
@@ -133,7 +138,7 @@ class Sector(Domain):
result = None
try:
result = self._fetch(self._query_url, self.proxy)
result = self._fetch(self._query_url)
data = result['data']
self._parse_and_assign_common(data)
@@ -147,4 +152,4 @@ class Sector(Domain):
logger.debug("Got response: ")
logger.debug("-------------")
logger.debug(f" {result}")
logger.debug("-------------")
logger.debug("-------------")

yfinance/lookup.py Normal file
View File

@@ -0,0 +1,224 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
#
# yfinance - market data downloader
# https://github.com/ranaroussi/yfinance
#
# Copyright 2017-2019 Ran Aroussi
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
import json as _json
import pandas as pd
from . import utils
from .const import _QUERY1_URL_, _SENTINEL_
from .data import YfData
from .exceptions import YFException
LOOKUP_TYPES = ["all", "equity", "mutualfund", "etf", "index", "future", "currency", "cryptocurrency"]
class Lookup:
"""
Fetches quote (ticker) lookups from Yahoo Finance.
:param query: The search query for financial data lookup.
:type query: str
:param session: Custom HTTP session for requests (default None).
:param proxy: Proxy settings for requests (default None).
:param timeout: Request timeout in seconds (default 30).
:param raise_errors: Raise exceptions on error (default True).
"""
def __init__(self, query: str, session=None, proxy=_SENTINEL_, timeout=30, raise_errors=True):
self.session = session
self._data = YfData(session=self.session)
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
self.query = query
self.timeout = timeout
self.raise_errors = raise_errors
self._logger = utils.get_yf_logger()
self._cache = {}
def _fetch_lookup(self, lookup_type="all", count=25) -> dict:
cache_key = (lookup_type, count)
if cache_key in self._cache:
return self._cache[cache_key]
url = f"{_QUERY1_URL_}/v1/finance/lookup"
params = {
"query": self.query,
"type": lookup_type,
"start": 0,
"count": count,
"formatted": False,
"fetchPricingData": True,
"lang": "en-US",
"region": "US"
}
self._logger.debug(f'GET Lookup for ticker ({self.query}) with parameters: {str(dict(params))}')
data = self._data.get(url=url, params=params, timeout=self.timeout)
if data is None or "Will be right back" in data.text:
raise RuntimeError("*** YAHOO! FINANCE IS CURRENTLY DOWN! ***\n"
"Our engineers are working quickly to resolve "
"the issue. Thank you for your patience.")
try:
data = data.json()
except _json.JSONDecodeError:
            self._logger.error(f"{self.query}: Failed to retrieve lookup results: received a faulty response instead.")
data = {}
# Error returned
if data.get("finance", {}).get("error", {}):
raise YFException(data.get("finance", {}).get("error", {}))
self._cache[cache_key] = data
return data
@staticmethod
def _parse_response(response: dict) -> pd.DataFrame:
finance = response.get("finance", {})
result = finance.get("result", [])
result = result[0] if len(result) > 0 else {}
documents = result.get("documents", [])
df = pd.DataFrame(documents)
if "symbol" not in df.columns:
return pd.DataFrame()
return df.set_index("symbol")
def _get_data(self, lookup_type: str, count: int = 25) -> pd.DataFrame:
return self._parse_response(self._fetch_lookup(lookup_type, count))
def get_all(self, count=25) -> pd.DataFrame:
"""
Returns all available financial instruments.
:param count: The number of results to retrieve.
:type count: int
"""
return self._get_data("all", count)
def get_stock(self, count=25) -> pd.DataFrame:
"""
        Returns stock-related financial instruments.
:param count: The number of results to retrieve.
:type count: int
"""
return self._get_data("equity", count)
def get_mutualfund(self, count=25) -> pd.DataFrame:
"""
        Returns mutual fund-related financial instruments.
:param count: The number of results to retrieve.
:type count: int
"""
return self._get_data("mutualfund", count)
def get_etf(self, count=25) -> pd.DataFrame:
"""
        Returns ETF-related financial instruments.
:param count: The number of results to retrieve.
:type count: int
"""
return self._get_data("etf", count)
def get_index(self, count=25) -> pd.DataFrame:
"""
        Returns index-related financial instruments.
:param count: The number of results to retrieve.
:type count: int
"""
return self._get_data("index", count)
def get_future(self, count=25) -> pd.DataFrame:
"""
        Returns futures-related financial instruments.
:param count: The number of results to retrieve.
:type count: int
"""
return self._get_data("future", count)
def get_currency(self, count=25) -> pd.DataFrame:
"""
        Returns currency-related financial instruments.
:param count: The number of results to retrieve.
:type count: int
"""
return self._get_data("currency", count)
def get_cryptocurrency(self, count=25) -> pd.DataFrame:
"""
        Returns cryptocurrency-related financial instruments.
:param count: The number of results to retrieve.
:type count: int
"""
return self._get_data("cryptocurrency", count)
@property
def all(self) -> pd.DataFrame:
"""Returns all available financial instruments."""
return self._get_data("all")
@property
def stock(self) -> pd.DataFrame:
        """Returns stock-related financial instruments."""
return self._get_data("equity")
@property
def mutualfund(self) -> pd.DataFrame:
        """Returns mutual fund-related financial instruments."""
return self._get_data("mutualfund")
@property
def etf(self) -> pd.DataFrame:
        """Returns ETF-related financial instruments."""
return self._get_data("etf")
@property
def index(self) -> pd.DataFrame:
        """Returns index-related financial instruments."""
return self._get_data("index")
@property
def future(self) -> pd.DataFrame:
        """Returns futures-related financial instruments."""
return self._get_data("future")
@property
def currency(self) -> pd.DataFrame:
        """Returns currency-related financial instruments."""
return self._get_data("currency")
@property
def cryptocurrency(self) -> pd.DataFrame:
        """Returns cryptocurrency-related financial instruments."""
return self._get_data("cryptocurrency")
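The response-parsing logic of `Lookup._parse_response` can be exercised without hitting the network. The sketch below reproduces that method standalone; the sample payload is hypothetical but mirrors the `finance.result[0].documents` shape the code expects from Yahoo's `/v1/finance/lookup` endpoint:

```python
import pandas as pd

def parse_lookup_response(response: dict) -> pd.DataFrame:
    # Mirrors Lookup._parse_response: drill into finance -> result[0] -> documents.
    finance = response.get("finance", {})
    result = finance.get("result", [])
    result = result[0] if len(result) > 0 else {}
    documents = result.get("documents", [])
    df = pd.DataFrame(documents)
    if "symbol" not in df.columns:
        # Empty or malformed payload -> empty frame, same as the real method.
        return pd.DataFrame()
    return df.set_index("symbol")

# Hypothetical payload, shaped like a lookup response
sample = {
    "finance": {
        "result": [{
            "documents": [
                {"symbol": "AAPL", "shortName": "Apple Inc.", "quoteType": "equity"},
                {"symbol": "AAPU", "shortName": "Direxion AAPL Bull", "quoteType": "etf"},
            ]
        }]
    }
}
df = parse_lookup_response(sample)
print(df.loc["AAPL", "shortName"])  # Apple Inc.
```

Indexing by `symbol` means every `get_*` method and property returns a frame keyed the same way, so results from different lookup types can be concatenated or joined directly.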

yfinance/multi.py

@@ -32,13 +32,13 @@ import pandas as _pd
from . import Ticker, utils
from .data import YfData
from . import shared
from .const import _SENTINEL_
@utils.log_indent_decorator
def download(tickers, start=None, end=None, actions=False, threads=True,
ignore_tz=None, group_by='column', auto_adjust=None, back_adjust=False,
repair=False, keepna=False, progress=True, period="max", interval="1d",
prepost=False, proxy=None, rounding=False, timeout=10, session=None,
prepost=False, proxy=_SENTINEL_, rounding=False, timeout=10, session=None,
multi_level_index=True) -> Union[_pd.DataFrame, None]:
"""
Download yahoo tickers
@@ -79,8 +79,6 @@ def download(tickers, start=None, end=None, actions=False, threads=True,
ignore_tz: bool
When combining from different timezones, ignore that part of datetime.
Default depends on interval. Intraday = False. Day+ = True.
proxy: str
Optional. Proxy server URL scheme. Default is None
rounding: bool
Optional. Round values to 2 decimal places?
timeout: None or float
@@ -127,7 +125,7 @@ def download(tickers, start=None, end=None, actions=False, threads=True,
for ticker in tickers:
if utils.is_isin(ticker):
isin = ticker
ticker = utils.get_ticker_by_isin(ticker, proxy, session=session)
ticker = utils.get_ticker_by_isin(ticker, session=session)
shared._ISINS[ticker] = isin
_tickers_.append(ticker)
@@ -144,7 +142,11 @@ def download(tickers, start=None, end=None, actions=False, threads=True,
shared._TRACEBACKS = {}
# Ensure data initialised with session.
YfData(session=session)
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
YfData(session=session, proxy=proxy)
else:
YfData(session=session)
# download using threads
if threads:
@@ -156,7 +158,7 @@ def download(tickers, start=None, end=None, actions=False, threads=True,
start=start, end=end, prepost=prepost,
actions=actions, auto_adjust=auto_adjust,
back_adjust=back_adjust, repair=repair, keepna=keepna,
progress=(progress and i > 0), proxy=proxy,
progress=(progress and i > 0),
rounding=rounding, timeout=timeout)
while len(shared._DFS) < len(tickers):
_time.sleep(0.01)
@@ -167,7 +169,6 @@ def download(tickers, start=None, end=None, actions=False, threads=True,
start=start, end=end, prepost=prepost,
actions=actions, auto_adjust=auto_adjust,
back_adjust=back_adjust, repair=repair, keepna=keepna,
proxy=proxy,
rounding=rounding, timeout=timeout)
if progress:
shared._PROGRESS_BAR.animate()
@@ -258,10 +259,10 @@ def _realign_dfs():
def _download_one_threaded(ticker, start=None, end=None,
auto_adjust=False, back_adjust=False, repair=False,
actions=False, progress=True, period="max",
interval="1d", prepost=False, proxy=None,
interval="1d", prepost=False,
keepna=False, rounding=False, timeout=10):
_download_one(ticker, start, end, auto_adjust, back_adjust, repair,
actions, period, interval, prepost, proxy, rounding,
actions, period, interval, prepost, rounding,
keepna, timeout)
if progress:
shared._PROGRESS_BAR.animate()
@@ -270,7 +271,7 @@ def _download_one_threaded(ticker, start=None, end=None,
def _download_one(ticker, start=None, end=None,
auto_adjust=False, back_adjust=False, repair=False,
actions=False, period="max", interval="1d",
prepost=False, proxy=None, rounding=False,
prepost=False, rounding=False,
keepna=False, timeout=10):
data = None
try:
@@ -278,7 +279,7 @@ def _download_one(ticker, start=None, end=None,
period=period, interval=interval,
start=start, end=end, prepost=prepost,
actions=actions, auto_adjust=auto_adjust,
back_adjust=back_adjust, repair=repair, proxy=proxy,
back_adjust=back_adjust, repair=repair,
rounding=rounding, keepna=keepna, timeout=timeout,
raise_errors=True
)
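The proxy migration throughout this PR relies on a sentinel default so the library can distinguish "argument omitted" from an explicit `proxy=None`. A minimal sketch of that pattern, with stand-in names for `_SENTINEL_` and `utils.print_once` (the real implementations may differ):

```python
_SENTINEL = object()  # unique marker; stands in for yfinance's _SENTINEL_

warned = []

def print_once(msg):
    # Stand-in for utils.print_once: emit each distinct message only once.
    if msg not in warned:
        warned.append(msg)
        print(msg)

def download(tickers, proxy=_SENTINEL):
    if proxy is not _SENTINEL:
        # Caller passed proxy explicitly (even proxy=None) -> warn, then honour it.
        print_once("YF deprecation warning: set proxy via yf.set_proxy(proxy)")
    return tickers

download("AAPL")              # no warning: argument omitted
download("AAPL", proxy=None)  # warns: None was passed explicitly
download("AAPL", proxy=None)  # no second print, thanks to print_once
```

With a plain `proxy=None` default this distinction is impossible, which is why every changed signature swaps `None` for the sentinel rather than simply dropping the parameter.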

yfinance/scrapers/fundamentals.py

@@ -3,17 +3,19 @@ import requests
from yfinance import utils
from yfinance.data import YfData
from yfinance.const import quote_summary_valid_modules
from yfinance.const import quote_summary_valid_modules, _SENTINEL_
from yfinance.scrapers.quote import _QUOTE_SUMMARY_URL_
from yfinance.exceptions import YFException
class Analysis:
def __init__(self, data: YfData, symbol: str, proxy=None):
def __init__(self, data: YfData, symbol: str, proxy=_SENTINEL_):
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
data._set_proxy(proxy)
self._data = data
self._symbol = symbol
self.proxy = proxy
    # In quoteSummary the 'earningsTrend' module contains most of the data below.
    # The format of the data is not optimal, so each function processes its part of the data.
@@ -175,7 +177,7 @@ class Analysis:
raise YFException("No valid modules provided, see available modules using `valid_modules`")
params_dict = {"modules": modules, "corsDomain": "finance.yahoo.com", "formatted": "false", "symbol": self._symbol}
try:
result = self._data.get_raw_json(_QUOTE_SUMMARY_URL_ + f"/{self._symbol}", user_agent_headers=self._data.user_agent_headers, params=params_dict, proxy=self.proxy)
result = self._data.get_raw_json(_QUOTE_SUMMARY_URL_ + f"/{self._symbol}", user_agent_headers=self._data.user_agent_headers, params=params_dict)
except requests.exceptions.HTTPError as e:
utils.get_yf_logger().error(str(e))
return None

yfinance/scrapers/analysis.py

@@ -10,10 +10,13 @@ from yfinance.exceptions import YFException, YFNotImplementedError
class Fundamentals:
def __init__(self, data: YfData, symbol: str, proxy=None):
def __init__(self, data: YfData, symbol: str, proxy=const._SENTINEL_):
if proxy is not const._SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
data._set_proxy(proxy)
self._data = data
self._symbol = symbol
self.proxy = proxy
self._earnings = None
self._financials = None
@@ -48,26 +51,38 @@ class Financials:
self._balance_sheet_time_series = {}
self._cash_flow_time_series = {}
def get_income_time_series(self, freq="yearly", proxy=None) -> pd.DataFrame:
def get_income_time_series(self, freq="yearly", proxy=const._SENTINEL_) -> pd.DataFrame:
if proxy is not const._SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
res = self._income_time_series
if freq not in res:
res[freq] = self._fetch_time_series("income", freq, proxy)
res[freq] = self._fetch_time_series("income", freq)
return res[freq]
def get_balance_sheet_time_series(self, freq="yearly", proxy=None) -> pd.DataFrame:
def get_balance_sheet_time_series(self, freq="yearly", proxy=const._SENTINEL_) -> pd.DataFrame:
if proxy is not const._SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
res = self._balance_sheet_time_series
if freq not in res:
res[freq] = self._fetch_time_series("balance-sheet", freq, proxy)
res[freq] = self._fetch_time_series("balance-sheet", freq)
return res[freq]
def get_cash_flow_time_series(self, freq="yearly", proxy=None) -> pd.DataFrame:
def get_cash_flow_time_series(self, freq="yearly", proxy=const._SENTINEL_) -> pd.DataFrame:
if proxy is not const._SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
res = self._cash_flow_time_series
if freq not in res:
res[freq] = self._fetch_time_series("cash-flow", freq, proxy)
res[freq] = self._fetch_time_series("cash-flow", freq)
return res[freq]
@utils.log_indent_decorator
def _fetch_time_series(self, name, timescale, proxy=None):
def _fetch_time_series(self, name, timescale):
# Fetching time series preferred over scraping 'QuoteSummaryStore',
# because it matches what Yahoo shows. But for some tickers returns nothing,
# despite 'QuoteSummaryStore' containing valid data.
@@ -84,7 +99,7 @@ class Financials:
" only available for cash-flow or income data.")
try:
statement = self._create_financials_table(name, timescale, proxy)
statement = self._create_financials_table(name, timescale)
if statement is not None:
return statement
@@ -92,7 +107,7 @@ class Financials:
utils.get_yf_logger().error(f"{self._symbol}: Failed to create {name} financials table for reason: {e}")
return pd.DataFrame()
def _create_financials_table(self, name, timescale, proxy):
def _create_financials_table(self, name, timescale):
if name == "income":
# Yahoo stores the 'income' table internally under 'financials' key
name = "financials"
@@ -100,11 +115,11 @@ class Financials:
keys = const.fundamentals_keys[name]
try:
return self.get_financials_time_series(timescale, keys, proxy)
return self._get_financials_time_series(timescale, keys)
except Exception:
pass
def get_financials_time_series(self, timescale, keys: list, proxy=None) -> pd.DataFrame:
def _get_financials_time_series(self, timescale, keys: list) -> pd.DataFrame:
timescale_translation = {"yearly": "annual", "quarterly": "quarterly", "trailing": "trailing"}
timescale = timescale_translation[timescale]
@@ -117,7 +132,7 @@ class Financials:
url += f"&period1={int(start_dt.timestamp())}&period2={int(end.timestamp())}"
# Step 3: fetch and reshape data
json_str = self._data.cache_get(url=url, proxy=proxy).text
json_str = self._data.cache_get(url=url).text
json_data = json.loads(json_str)
data_raw = json_data["timeseries"]["result"]
# data_raw = [v for v in data_raw if len(v) > 1] # Discard keys with no data
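Each `get_*_time_series` method above memoises its result per frequency in a plain dict, so repeated calls with the same `freq` hit the network only once. The pattern in isolation, with the fetcher stubbed out (names are illustrative, not yfinance's API):

```python
calls = []

def fetch(name, freq):
    # Stub for Financials._fetch_time_series; records each would-be network hit.
    calls.append((name, freq))
    return f"{name}/{freq} data"

class IncomeCache:
    def __init__(self):
        self._income = {}  # freq -> cached result

    def get_income(self, freq="yearly"):
        if freq not in self._income:
            self._income[freq] = fetch("income", freq)
        return self._income[freq]

c = IncomeCache()
c.get_income("yearly")
c.get_income("yearly")     # served from cache, no second fetch
c.get_income("quarterly")
print(len(calls))  # 2
```

Note the cache key is the frequency alone; after this PR the proxy no longer participates in the call, so a cached frame is valid regardless of how the global proxy config later changes.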

yfinance/scrapers/funds.py

@@ -1,7 +1,7 @@
import pandas as pd
from yfinance.data import YfData
from yfinance.const import _BASE_URL_
from yfinance.const import _BASE_URL_, _SENTINEL_
from yfinance.exceptions import YFDataException
from yfinance import utils
@@ -17,16 +17,17 @@ class FundsData:
Notes:
- fundPerformance module is not implemented as better data is queryable using history
"""
def __init__(self, data: YfData, symbol: str, proxy=None):
def __init__(self, data: YfData, symbol: str, proxy=_SENTINEL_):
"""
Args:
data (YfData): The YfData object for fetching data.
symbol (str): The symbol of the fund.
proxy (optional): Proxy settings for fetching data.
"""
self._data = data
self._symbol = symbol
self.proxy = proxy
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
# quoteType
self._quote_type = None
@@ -165,26 +166,23 @@ class FundsData:
self._fetch_and_parse()
return self._sector_weightings
def _fetch(self, proxy):
def _fetch(self):
"""
Fetches the raw JSON data from the API.
Args:
proxy: Proxy settings for fetching data.
Returns:
dict: The raw JSON data.
"""
modules = ','.join(["quoteType", "summaryProfile", "topHoldings", "fundProfile"])
params_dict = {"modules": modules, "corsDomain": "finance.yahoo.com", "symbol": self._symbol, "formatted": "false"}
result = self._data.get_raw_json(_QUOTE_SUMMARY_URL_+self._symbol, user_agent_headers=self._data.user_agent_headers, params=params_dict, proxy=proxy)
result = self._data.get_raw_json(_QUOTE_SUMMARY_URL_+self._symbol, user_agent_headers=self._data.user_agent_headers, params=params_dict)
return result
def _fetch_and_parse(self) -> None:
"""
Fetches and parses the data from the API.
"""
result = self._fetch(self.proxy)
result = self._fetch()
try:
data = result["quoteSummary"]["result"][0]
# check quote type
@@ -334,4 +332,4 @@ class FundsData:
self._parse_raw_values(_fund_operations_cat.get("annualHoldingsTurnover", pd.NA)),
self._parse_raw_values(_fund_operations_cat.get("totalNetAssets", pd.NA))
]
}).set_index("Attributes")
}).set_index("Attributes")

yfinance/scrapers/history.py

@@ -8,15 +8,17 @@ import time as _time
import bisect
from yfinance import shared, utils
from yfinance.const import _BASE_URL_, _PRICE_COLNAMES_
from yfinance.const import _BASE_URL_, _PRICE_COLNAMES_, _SENTINEL_
from yfinance.exceptions import YFInvalidPeriodError, YFPricesMissingError, YFTzMissingError, YFRateLimitError
class PriceHistory:
def __init__(self, data, ticker, tz, session=None, proxy=None):
def __init__(self, data, ticker, tz, session=None, proxy=_SENTINEL_):
self._data = data
self.ticker = ticker.upper()
self.tz = tz
self.proxy = proxy
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
self.session = session
self._history_cache = {}
@@ -30,7 +32,7 @@ class PriceHistory:
def history(self, period="1mo", interval="1d",
start=None, end=None, prepost=False, actions=True,
auto_adjust=True, back_adjust=False, repair=False, keepna=False,
proxy=None, rounding=False, timeout=10,
proxy=_SENTINEL_, rounding=False, timeout=10,
raise_errors=False) -> pd.DataFrame:
"""
:Parameters:
@@ -61,8 +63,6 @@ class PriceHistory:
keepna: bool
Keep NaN rows returned by Yahoo?
Default is False
proxy: str
Optional. Proxy server URL scheme. Default is None
rounding: bool
Round values to 2 decimal places?
Optional. Default is False = precision suggested by Yahoo!
@@ -74,7 +74,10 @@ class PriceHistory:
If True, then raise errors as Exceptions instead of logging.
"""
logger = utils.get_yf_logger()
proxy = proxy or self.proxy
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
interval_user = interval
period_user = period
@@ -175,7 +178,6 @@ class PriceHistory:
data = get_fn(
url=url,
params=params,
proxy=proxy,
timeout=timeout
)
        if data is None or "Will be right back" in data.text:
@@ -249,9 +251,16 @@ class PriceHistory:
self._reconstruct_start_interval = None
return utils.empty_df()
# Select useful info from metadata
quote_type = self._history_metadata["instrumentType"]
expect_capital_gains = quote_type in ('MUTUALFUND', 'ETF')
tz_exchange = self._history_metadata["exchangeTimezoneName"]
currency = self._history_metadata["currency"]
# Process custom periods
if period and period not in self._history_metadata.get("validRanges", []):
end = int(_time.time())
end_dt = pd.Timestamp(end, unit='s').tz_localize("UTC")
start = _datetime.date.fromtimestamp(end)
start -= utils._interval_to_timedelta(period)
start -= _datetime.timedelta(days=4)
@@ -260,9 +269,8 @@ class PriceHistory:
quotes = utils.parse_quotes(data["chart"]["result"][0])
# Yahoo bug fix - it often appends latest price even if after end date
if end and not quotes.empty:
endDt = pd.to_datetime(end, unit='s')
if quotes.index[quotes.shape[0] - 1] >= endDt:
quotes = quotes.iloc[0:quotes.shape[0] - 1]
if quotes.index[-1] >= end_dt.tz_convert('UTC').tz_localize(None):
quotes = quotes.drop(quotes.index[-1])
if quotes.empty:
msg = f'{self.ticker}: yfinance received OHLC data: EMPTY'
elif len(quotes) == 1:
@@ -289,12 +297,6 @@ class PriceHistory:
except Exception:
pass
# Select useful info from metadata
quote_type = self._history_metadata["instrumentType"]
expect_capital_gains = quote_type in ('MUTUALFUND', 'ETF')
tz_exchange = self._history_metadata["exchangeTimezoneName"]
currency = self._history_metadata["currency"]
# Note: ordering is important. If you change order, run the tests!
quotes = utils.set_df_tz(quotes, params["interval"], tz_exchange)
quotes = utils.fix_Yahoo_dst_issue(quotes, params["interval"])
@@ -327,21 +329,22 @@ class PriceHistory:
capital_gains = utils.set_df_tz(capital_gains, interval, tz_exchange)
if start is not None:
if not quotes.empty:
startDt = quotes.index[0].floor('D')
start_d = quotes.index[0].floor('D')
if dividends is not None:
dividends = dividends.loc[startDt:]
dividends = dividends.loc[start_d:]
if capital_gains is not None:
capital_gains = capital_gains.loc[startDt:]
capital_gains = capital_gains.loc[start_d:]
if splits is not None:
splits = splits.loc[startDt:]
splits = splits.loc[start_d:]
if end is not None:
endDt = pd.Timestamp(end, unit='s').tz_localize(tz)
# -1 because date-slice end is inclusive
end_dt_sub1 = end_dt - pd.Timedelta(1)
if dividends is not None:
dividends = dividends[dividends.index < endDt]
dividends = dividends[:end_dt_sub1]
if capital_gains is not None:
capital_gains = capital_gains[capital_gains.index < endDt]
capital_gains = capital_gains[:end_dt_sub1]
if splits is not None:
splits = splits[splits.index < endDt]
splits = splits[:end_dt_sub1]
# Prepare for combine
intraday = params["interval"][-1] in ("m", 'h')
@@ -383,7 +386,9 @@ class PriceHistory:
msg = f'{self.ticker}: OHLC after combining events: {df.index[0]} -> {df.index[-1]}'
logger.debug(msg)
df = utils.fix_Yahoo_returning_live_separate(df, params["interval"], tz_exchange, repair=repair, currency=currency)
df, last_trade = utils.fix_Yahoo_returning_live_separate(df, params["interval"], tz_exchange, prepost, repair=repair, currency=currency)
if last_trade is not None:
self._history_metadata['lastTrade'] = {'Price':last_trade['Close'], "Time":last_trade.name}
df = df[~df.index.duplicated(keep='first')] # must do before repair
@@ -401,10 +406,11 @@ class PriceHistory:
df = self._fix_bad_div_adjust(df, interval, currency)
# Need the latest/last row to be repaired before 100x/split repair:
df_last = self._fix_zeroes(df.iloc[-1:], interval, tz_exchange, prepost)
if 'Repaired?' not in df.columns:
df['Repaired?'] = False
df = pd.concat([df.drop(df.index[-1]), df_last])
if not df.empty:
df_last = self._fix_zeroes(df.iloc[-1:], interval, tz_exchange, prepost)
if 'Repaired?' not in df.columns:
df['Repaired?'] = False
df = pd.concat([df.drop(df.index[-1]), df_last])
df = self._fix_unit_mixups(df, interval, tz_exchange, prepost)
df = self._fix_bad_stock_splits(df, interval, tz_exchange)
@@ -463,19 +469,23 @@ class PriceHistory:
self._reconstruct_start_interval = None
return df
def _get_history_cache(self, period="max", interval="1d", proxy=None) -> pd.DataFrame:
def _get_history_cache(self, period="max", interval="1d") -> pd.DataFrame:
cache_key = (interval, period)
if cache_key in self._history_cache:
return self._history_cache[cache_key]
df = self.history(period=period, interval=interval, prepost=True, proxy=proxy)
df = self.history(period=period, interval=interval, prepost=True)
self._history_cache[cache_key] = df
return df
def get_history_metadata(self, proxy=None) -> dict:
def get_history_metadata(self, proxy=_SENTINEL_) -> dict:
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
if self._history_metadata is None or 'tradingPeriods' not in self._history_metadata:
# Request intraday data, because then Yahoo returns exchange schedule (tradingPeriods).
self._get_history_cache(period="5d", interval="1h", proxy=proxy)
self._get_history_cache(period="5d", interval="1h")
if self._history_metadata_formatted is False:
self._history_metadata = utils.format_history_metadata(self._history_metadata)
@@ -483,29 +493,45 @@ class PriceHistory:
return self._history_metadata
def get_dividends(self, period="max", proxy=None) -> pd.Series:
df = self._get_history_cache(period=period, proxy=proxy)
def get_dividends(self, period="max", proxy=_SENTINEL_) -> pd.Series:
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
df = self._get_history_cache(period=period)
if "Dividends" in df.columns:
dividends = df["Dividends"]
return dividends[dividends != 0]
return pd.Series()
def get_capital_gains(self, period="max", proxy=None) -> pd.Series:
df = self._get_history_cache(period=period, proxy=proxy)
def get_capital_gains(self, period="max", proxy=_SENTINEL_) -> pd.Series:
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
df = self._get_history_cache(period=period)
if "Capital Gains" in df.columns:
capital_gains = df["Capital Gains"]
return capital_gains[capital_gains != 0]
return pd.Series()
def get_splits(self, period="max", proxy=None) -> pd.Series:
df = self._get_history_cache(period=period, proxy=proxy)
def get_splits(self, period="max", proxy=_SENTINEL_) -> pd.Series:
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
df = self._get_history_cache(period=period)
if "Stock Splits" in df.columns:
splits = df["Stock Splits"]
return splits[splits != 0]
return pd.Series()
def get_actions(self, period="max", proxy=None) -> pd.Series:
df = self._get_history_cache(period=period, proxy=proxy)
def get_actions(self, period="max", proxy=_SENTINEL_) -> pd.Series:
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
df = self._get_history_cache(period=period)
action_columns = []
if "Dividends" in df.columns:

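The end-date fix above (`end_dt_sub1 = end_dt - pd.Timedelta(1)`) works because pandas label slices on a DatetimeIndex are inclusive at both ends; subtracting `pd.Timedelta(1)` (one nanosecond) turns the inclusive end into an effectively exclusive one, replacing the old boolean mask `index < endDt`. A small demonstration:

```python
import pandas as pd

idx = pd.to_datetime(["2025-01-01", "2025-01-02", "2025-01-03"])
dividends = pd.Series([0.1, 0.2, 0.3], index=idx)

end_dt = pd.Timestamp("2025-01-03")

inclusive = dividends[:end_dt]                    # label slice keeps 2025-01-03
exclusive = dividends[:end_dt - pd.Timedelta(1)]  # minus 1 ns drops it

print(len(inclusive), len(exclusive))  # 3 2
```

Both forms exclude the end date; the slice form is simply cheaper than building a boolean mask and applies uniformly to dividends, capital gains, and splits.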
yfinance/scrapers/holders.py

@@ -3,19 +3,20 @@ import requests
from yfinance import utils
from yfinance.data import YfData
from yfinance.const import _BASE_URL_
from yfinance.const import _BASE_URL_, _SENTINEL_
from yfinance.exceptions import YFDataException
_QUOTE_SUMMARY_URL_ = f"{_BASE_URL_}/v10/finance/quoteSummary"
class Holders:
_SCRAPE_URL_ = 'https://finance.yahoo.com/quote'
def __init__(self, data: YfData, symbol: str, proxy=None):
def __init__(self, data: YfData, symbol: str, proxy=_SENTINEL_):
self._data = data
self._symbol = symbol
self.proxy = proxy
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
data._set_proxy(proxy)
self._major = None
self._major_direct_holders = None
@@ -62,16 +63,16 @@ class Holders:
self._fetch_and_parse()
return self._insider_roster
def _fetch(self, proxy):
def _fetch(self):
modules = ','.join(
["institutionOwnership", "fundOwnership", "majorDirectHolders", "majorHoldersBreakdown", "insiderTransactions", "insiderHolders", "netSharePurchaseActivity"])
params_dict = {"modules": modules, "corsDomain": "finance.yahoo.com", "formatted": "false"}
result = self._data.get_raw_json(f"{_QUOTE_SUMMARY_URL_}/{self._symbol}", user_agent_headers=self._data.user_agent_headers, params=params_dict, proxy=proxy)
result = self._data.get_raw_json(f"{_QUOTE_SUMMARY_URL_}/{self._symbol}", user_agent_headers=self._data.user_agent_headers, params=params_dict)
return result
def _fetch_and_parse(self):
try:
result = self._fetch(self.proxy)
result = self._fetch()
except requests.exceptions.HTTPError as e:
utils.get_yf_logger().error(str(e))

yfinance/scrapers/quote.py

@@ -7,7 +7,7 @@ import requests
from yfinance import utils
from yfinance.data import YfData
from yfinance.const import quote_summary_valid_modules, _BASE_URL_, _QUERY1_URL_
from yfinance.const import quote_summary_valid_modules, _BASE_URL_, _QUERY1_URL_, _SENTINEL_
from yfinance.exceptions import YFDataException, YFException
info_retired_keys_price = {"currentPrice", "dayHigh", "dayLow", "open", "previousClose", "volume", "volume24Hr"}
@@ -26,9 +26,11 @@ _QUOTE_SUMMARY_URL_ = f"{_BASE_URL_}/v10/finance/quoteSummary"
class FastInfo:
# Contain small subset of info[] items that can be fetched faster elsewhere.
# Imitates a dict.
def __init__(self, tickerBaseObject, proxy=None):
def __init__(self, tickerBaseObject, proxy=_SENTINEL_):
self._tkr = tickerBaseObject
self.proxy = proxy
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._tkr._data._set_proxy(proxy)
self._prices_1y = None
self._prices_1wk_1h_prepost = None
@@ -128,8 +130,8 @@ class FastInfo:
def _get_1y_prices(self, fullDaysOnly=False):
if self._prices_1y is None:
self._prices_1y = self._tkr.history(period="1y", auto_adjust=False, keepna=True, proxy=self.proxy)
self._md = self._tkr.get_history_metadata(proxy=self.proxy)
self._prices_1y = self._tkr.history(period="1y", auto_adjust=False, keepna=True)
self._md = self._tkr.get_history_metadata()
try:
ctp = self._md["currentTradingPeriod"]
self._today_open = pd.to_datetime(ctp["regular"]["start"], unit='s', utc=True).tz_convert(self.timezone)
@@ -154,12 +156,12 @@ class FastInfo:
def _get_1wk_1h_prepost_prices(self):
if self._prices_1wk_1h_prepost is None:
self._prices_1wk_1h_prepost = self._tkr.history(period="5d", interval="1h", auto_adjust=False, prepost=True, proxy=self.proxy)
self._prices_1wk_1h_prepost = self._tkr.history(period="5d", interval="1h", auto_adjust=False, prepost=True)
return self._prices_1wk_1h_prepost
def _get_1wk_1h_reg_prices(self):
if self._prices_1wk_1h_reg is None:
self._prices_1wk_1h_reg = self._tkr.history(period="5d", interval="1h", auto_adjust=False, prepost=False, proxy=self.proxy)
self._prices_1wk_1h_reg = self._tkr.history(period="5d", interval="1h", auto_adjust=False, prepost=False)
return self._prices_1wk_1h_reg
def _get_exchange_metadata(self):
@@ -167,7 +169,7 @@ class FastInfo:
return self._md
self._get_1y_prices()
self._md = self._tkr.get_history_metadata(proxy=self.proxy)
self._md = self._tkr.get_history_metadata()
return self._md
def _exchange_open_now(self):
@@ -198,7 +200,7 @@ class FastInfo:
if self._currency is not None:
return self._currency
md = self._tkr.get_history_metadata(proxy=self.proxy)
md = self._tkr.get_history_metadata()
self._currency = md["currency"]
return self._currency
@@ -207,7 +209,7 @@ class FastInfo:
if self._quote_type is not None:
return self._quote_type
md = self._tkr.get_history_metadata(proxy=self.proxy)
md = self._tkr.get_history_metadata()
self._quote_type = md["instrumentType"]
return self._quote_type
@@ -232,7 +234,7 @@ class FastInfo:
if self._shares is not None:
return self._shares
shares = self._tkr.get_shares_full(start=pd.Timestamp.utcnow().date()-pd.Timedelta(days=548), proxy=self.proxy)
shares = self._tkr.get_shares_full(start=pd.Timestamp.utcnow().date()-pd.Timedelta(days=548))
# if shares is None:
# # Requesting 18 months failed, so fallback to shares which should include last year
# shares = self._tkr.get_shares()
@@ -484,11 +486,12 @@ class FastInfo:
class Quote:
def __init__(self, data: YfData, symbol: str, proxy=None):
def __init__(self, data: YfData, symbol: str, proxy=_SENTINEL_):
self._data = data
self._symbol = symbol
self.proxy = proxy
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
self._info = None
self._retired_info = None
@@ -505,15 +508,15 @@ class Quote:
@property
def info(self) -> dict:
if self._info is None:
self._fetch_info(self.proxy)
self._fetch_complementary(self.proxy)
self._fetch_info()
self._fetch_complementary()
return self._info
@property
def sustainability(self) -> pd.DataFrame:
if self._sustainability is None:
result = self._fetch(self.proxy, modules=['esgScores'])
result = self._fetch(modules=['esgScores'])
if result is None:
self._sustainability = pd.DataFrame()
else:
@@ -527,7 +530,7 @@ class Quote:
@property
def recommendations(self) -> pd.DataFrame:
if self._recommendations is None:
result = self._fetch(self.proxy, modules=['recommendationTrend'])
result = self._fetch(modules=['recommendationTrend'])
if result is None:
self._recommendations = pd.DataFrame()
else:
@@ -541,7 +544,7 @@ class Quote:
@property
def upgrades_downgrades(self) -> pd.DataFrame:
if self._upgrades_downgrades is None:
result = self._fetch(self.proxy, modules=['upgradeDowngradeHistory'])
result = self._fetch(modules=['upgradeDowngradeHistory'])
if result is None:
self._upgrades_downgrades = pd.DataFrame()
else:
@@ -575,7 +578,7 @@ class Quote:
def valid_modules():
return quote_summary_valid_modules
def _fetch(self, proxy, modules: list):
def _fetch(self, modules: list):
if not isinstance(modules, list):
raise YFException("Should provide a list of modules, see available modules using `valid_modules`")
@@ -584,33 +587,34 @@ class Quote:
raise YFException("No valid modules provided, see available modules using `valid_modules`")
params_dict = {"modules": modules, "corsDomain": "finance.yahoo.com", "formatted": "false", "symbol": self._symbol}
try:
result = self._data.get_raw_json(_QUOTE_SUMMARY_URL_ + f"/{self._symbol}", user_agent_headers=self._data.user_agent_headers, params=params_dict, proxy=proxy)
result = self._data.get_raw_json(_QUOTE_SUMMARY_URL_ + f"/{self._symbol}", user_agent_headers=self._data.user_agent_headers, params=params_dict)
except requests.exceptions.HTTPError as e:
utils.get_yf_logger().error(str(e))
return None
return result
def _fetch_additional_info(self, proxy):
def _fetch_additional_info(self):
params_dict = {"symbols": self._symbol, "formatted": "false"}
try:
result = self._data.get_raw_json(f"{_QUERY1_URL_}/v7/finance/quote?",
user_agent_headers=self._data.user_agent_headers,
params=params_dict, proxy=proxy)
params=params_dict)
except requests.exceptions.HTTPError as e:
utils.get_yf_logger().error(str(e))
return None
return result
def _fetch_info(self, proxy):
def _fetch_info(self):
if self._already_fetched:
return
self._already_fetched = True
modules = ['financialData', 'quoteType', 'defaultKeyStatistics', 'assetProfile', 'summaryDetail']
result = self._fetch(proxy, modules=modules)
result.update(self._fetch_additional_info(proxy))
if result is None:
self._info = {}
return
result = self._fetch(modules=modules)
additional_info = self._fetch_additional_info()
if additional_info is not None and result is not None:
result.update(additional_info)
else:
result = additional_info
query1_info = {}
for quote in ["quoteSummary", "quoteResponse"]:
@@ -657,13 +661,12 @@ class Quote:
self._info = {k: _format(k, v) for k, v in query1_info.items()}
def _fetch_complementary(self, proxy):
def _fetch_complementary(self):
if self._already_fetched_complementary:
return
self._already_fetched_complementary = True
# self._scrape(proxy) # decrypt broken
self._fetch_info(proxy)
self._fetch_info()
if self._info is None:
return
@@ -702,7 +705,7 @@ class Quote:
end = int(end.timestamp())
url += f"&period1={start}&period2={end}"
json_str = self._data.cache_get(url=url, proxy=proxy).text
json_str = self._data.cache_get(url=url).text
json_data = json.loads(json_str)
json_result = json_data.get("timeseries") or json_data.get("finance")
if json_result["error"] is not None:
@@ -716,7 +719,7 @@ class Quote:
def _fetch_calendar(self):
# secFilings returns data that is too old, so not requesting it for now
result = self._fetch(self.proxy, modules=['calendarEvents'])
result = self._fetch(modules=['calendarEvents'])
if result is None:
self._calendar = {}
return
@@ -743,7 +746,7 @@ class Quote:
def _fetch_sec_filings(self):
result = self._fetch(self.proxy, modules=['secFilings'])
result = self._fetch(modules=['secFilings'])
if result is None:
return None


@@ -2,10 +2,10 @@ from .query import EquityQuery as EqyQy
from .query import FundQuery as FndQy
from .query import QueryBase, EquityQuery, FundQuery
from yfinance.const import _BASE_URL_
from yfinance.const import _BASE_URL_, _SENTINEL_
from yfinance.data import YfData
from ..utils import dynamic_docstring, generate_list_table_from_dict_universal
from ..utils import dynamic_docstring, generate_list_table_from_dict_universal, print_once
from typing import Union
import requests
@@ -58,7 +58,7 @@ def screen(query: Union[str, EquityQuery, FundQuery],
sortAsc: bool = None,
userId: str = None,
userIdType: str = None,
session = None, proxy = None):
session = None, proxy = _SENTINEL_):
"""
Run a screen: predefined query, or custom query.
@@ -106,6 +106,12 @@ def screen(query: Union[str, EquityQuery, FundQuery],
{predefined_screeners}
"""
if proxy is not _SENTINEL_:
print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
_data = YfData(session=session, proxy=proxy)
else:
_data = YfData(session=session)
# Only use defaults when user did NOT give a predefined query, because
# Yahoo's predefined endpoint auto-applies defaults. Also,
# that endpoint might be ignoring these fields.
@@ -121,10 +127,7 @@ def screen(query: Union[str, EquityQuery, FundQuery],
if size is not None and size > 250:
raise ValueError("Yahoo limits query size to 250, reduce size.")
fields = dict(locals())
for k in ['query', 'session', 'proxy']:
if k in fields:
del fields[k]
fields = {'offset': offset, 'size': size, 'sortField': sortField, 'sortAsc': sortAsc, 'userId': userId, 'userIdType': userIdType}
params_dict = {"corsDomain": "finance.yahoo.com", "formatted": "false", "lang": "en-US", "region": "US"}
@@ -132,12 +135,11 @@ def screen(query: Union[str, EquityQuery, FundQuery],
if isinstance(query, str):
# post_query = PREDEFINED_SCREENER_QUERIES[query]
# Switch to Yahoo's predefined endpoint
_data = YfData(session=session)
params_dict['scrIds'] = query
for k,v in fields.items():
if v is not None:
params_dict[k] = v
resp = _data.get(url=_PREDEFINED_URL_, params=params_dict, proxy=proxy)
resp = _data.get(url=_PREDEFINED_URL_, params=params_dict)
try:
resp.raise_for_status()
except requests.exceptions.HTTPError:
@@ -170,11 +172,9 @@ def screen(query: Union[str, EquityQuery, FundQuery],
post_query['query'] = post_query['query'].to_dict()
# Fetch
_data = YfData(session=session)
response = _data.post(_SCREENER_URL_,
body=post_query,
user_agent_headers=_data.user_agent_headers,
params=params_dict,
proxy=proxy)
params=params_dict)
response.raise_for_status()
return response.json()['finance']['result'][0]


@@ -22,14 +22,14 @@
import json as _json
from . import utils
from .const import _BASE_URL_
from .const import _BASE_URL_, _SENTINEL_
from .data import YfData
class Search:
def __init__(self, query, max_results=8, news_count=8, lists_count=8, include_cb=True, include_nav_links=False,
include_research=False, include_cultural_assets=False, enable_fuzzy_query=False, recommended=8,
session=None, proxy=None, timeout=30, raise_errors=True):
session=None, proxy=_SENTINEL_, timeout=30, raise_errors=True):
"""
Fetches and organizes search results from Yahoo Finance, including stock quotes and news articles.
@@ -45,16 +45,20 @@ class Search:
enable_fuzzy_query: Enable fuzzy search for typos (default False).
recommended: Recommended number of results to return (default 8).
session: Custom HTTP session for requests (default None).
proxy: Proxy settings for requests (default None).
timeout: Request timeout in seconds (default 30).
raise_errors: Raise exceptions on error (default True).
"""
self.session = session
self._data = YfData(session=self.session)
if proxy is not _SENTINEL_:
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
self.query = query
self.max_results = max_results
self.enable_fuzzy_query = enable_fuzzy_query
self.news_count = news_count
self.session = session
self.proxy = proxy
self.timeout = timeout
self.raise_errors = raise_errors
@@ -65,7 +69,6 @@ class Search:
self.enable_cultural_assets = include_cultural_assets
self.recommended = recommended
self._data = YfData(session=self.session)
self._logger = utils.get_yf_logger()
self._response = {}
@@ -98,7 +101,7 @@ class Search:
self._logger.debug(f'{self.query}: Yahoo GET parameters: {str(dict(params))}')
data = self._data.cache_get(url=url, params=params, proxy=self.proxy, timeout=self.timeout)
data = self._data.cache_get(url=url, params=params, timeout=self.timeout)
if data is None or "Will be right back" in data.text:
raise RuntimeError("*** YAHOO! FINANCE IS CURRENTLY DOWN! ***\n"
"Our engineers are working quickly to resolve "


@@ -27,11 +27,11 @@ from .scrapers.funds import FundsData
import pandas as _pd
from .base import TickerBase
from .const import _BASE_URL_
from .const import _BASE_URL_, _SENTINEL_
class Ticker(TickerBase):
def __init__(self, ticker, session=None, proxy=None):
def __init__(self, ticker, session=None, proxy=_SENTINEL_):
super(Ticker, self).__init__(ticker, session=session, proxy=proxy)
self._expirations = {}
self._underlying = {}
@@ -45,7 +45,7 @@ class Ticker(TickerBase):
else:
url = f"{_BASE_URL_}/v7/finance/options/{self.ticker}?date={date}"
r = self._data.get(url=url, proxy=self.proxy).json()
r = self._data.get(url=url).json()
if len(r.get('optionChain', {}).get('result', [])) > 0:
for exp in r['optionChain']['result'][0]['expirationDates']:
self._expirations[_pd.Timestamp(exp, unit='s').strftime('%Y-%m-%d')] = exp
@@ -321,4 +321,4 @@ class Ticker(TickerBase):
@property
def funds_data(self) -> FundsData:
return self.get_funds_data()
return self.get_funds_data()


@@ -22,9 +22,9 @@
from __future__ import print_function
from . import Ticker, multi
# from collections import namedtuple as _namedtuple
from .utils import print_once
from .data import YfData
from .const import _SENTINEL_
class Tickers:
@@ -38,6 +38,8 @@ class Tickers:
self.symbols = [ticker.upper() for ticker in tickers]
self.tickers = {ticker: Ticker(ticker, session=session) for ticker in self.symbols}
self._data = YfData(session=session)
# self.tickers = _namedtuple(
# "Tickers", ticker_objects.keys(), rename=True
# )(*ticker_objects.values())
@@ -45,25 +47,32 @@ class Tickers:
def history(self, period="1mo", interval="1d",
start=None, end=None, prepost=False,
actions=True, auto_adjust=True, repair=False,
proxy=None,
proxy=_SENTINEL_,
threads=True, group_by='column', progress=True,
timeout=10, **kwargs):
if proxy is not _SENTINEL_:
print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
return self.download(
period, interval,
start, end, prepost,
actions, auto_adjust, repair,
proxy,
threads, group_by, progress,
timeout, **kwargs)
def download(self, period="1mo", interval="1d",
start=None, end=None, prepost=False,
actions=True, auto_adjust=True, repair=False,
proxy=None,
proxy=_SENTINEL_,
threads=True, group_by='column', progress=True,
timeout=10, **kwargs):
if proxy is not _SENTINEL_:
print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
self._data._set_proxy(proxy)
data = multi.download(self.symbols,
start=start, end=end,
actions=actions,
@@ -72,7 +81,6 @@ class Tickers:
period=period,
interval=interval,
prepost=prepost,
proxy=proxy,
group_by='ticker',
threads=threads,
progress=progress,


@@ -44,7 +44,6 @@ from yfinance import const
user_agent_headers = {
'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/39.0.2171.95 Safari/537.36'}
# From https://stackoverflow.com/a/59128615
def attributes(obj):
disallowed_names = {
@@ -186,14 +185,18 @@ def is_isin(string):
return bool(_re.match("^([A-Z]{2})([A-Z0-9]{9})([0-9])$", string))
def get_all_by_isin(isin, proxy=None, session=None):
def get_all_by_isin(isin, proxy=const._SENTINEL_, session=None):
if not (is_isin(isin)):
raise ValueError("Invalid ISIN number")
if proxy is not const._SENTINEL_:
print_once("YF deprecation warning: set proxy via new config function: yf.set_proxy(proxy)")
proxy = None
# Deferred this to prevent circular imports
from .search import Search
session = session or _requests
session = session or _requests.Session()
search = Search(query=isin, max_results=1, session=session, proxy=proxy)
# Extract the first quote and news
@@ -212,17 +215,17 @@ def get_all_by_isin(isin, proxy=None, session=None):
}
def get_ticker_by_isin(isin, proxy=None, session=None):
def get_ticker_by_isin(isin, proxy=const._SENTINEL_, session=None):
data = get_all_by_isin(isin, proxy, session)
return data.get('ticker', {}).get('symbol', '')
def get_info_by_isin(isin, proxy=None, session=None):
def get_info_by_isin(isin, proxy=const._SENTINEL_, session=None):
data = get_all_by_isin(isin, proxy, session)
return data.get('ticker', {})
def get_news_by_isin(isin, proxy=None, session=None):
def get_news_by_isin(isin, proxy=const._SENTINEL_, session=None):
data = get_all_by_isin(isin, proxy, session)
return data.get('news', {})
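The session fallback fix above (issues 2301/2383) boils down to: fall back to a fresh `requests.Session()`, never to the `requests` module itself, which is not a session and later failed with `module 'requests.cookies' has no attribute 'update'`. A minimal sketch of the corrected fallback (function name is illustrative):

```python
import requests

def resolve_session(session=None):
    # Old buggy fallback was `session or requests` (the module object).
    # Fall back to a real Session, which carries a working cookie jar.
    return session or requests.Session()
```

A caller-supplied session is passed through untouched; only the `None` case changes behaviour.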
@@ -526,7 +529,7 @@ def parse_actions(data):
splits = None
if "events" in data:
if "dividends" in data["events"]:
if "dividends" in data["events"] and len(data["events"]['dividends']) > 0:
dividends = _pd.DataFrame(
data=list(data["events"]["dividends"].values()))
dividends.set_index("date", inplace=True)
@@ -534,7 +537,7 @@ def parse_actions(data):
dividends.sort_index(inplace=True)
dividends.columns = ["Dividends"]
if "capitalGains" in data["events"]:
if "capitalGains" in data["events"] and len(data["events"]['capitalGains']) > 0:
capital_gains = _pd.DataFrame(
data=list(data["events"]["capitalGains"].values()))
capital_gains.set_index("date", inplace=True)
@@ -542,7 +545,7 @@ def parse_actions(data):
capital_gains.sort_index(inplace=True)
capital_gains.columns = ["Capital Gains"]
if "splits" in data["events"]:
if "splits" in data["events"] and len(data["events"]['splits']) > 0:
splits = _pd.DataFrame(
data=list(data["events"]["splits"].values()))
splits.set_index("date", inplace=True)
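The added `len(...) > 0` guards matter because Yahoo can return an `events` key whose sub-dicts are empty; building a DataFrame from an empty dict and then calling `set_index("date")` raises a KeyError. A toy version of the guarded dividends branch (data shape assumed from Yahoo's chart API):

```python
import pandas as pd

def parse_dividends(data):
    # Build the dividends frame only if the events exist AND are non-empty,
    # mirroring the guard added in the diff.
    dividends = None
    events = data.get("events", {})
    if "dividends" in events and len(events["dividends"]) > 0:
        dividends = pd.DataFrame(data=list(events["dividends"].values()))
        dividends.set_index("date", inplace=True)
        dividends.index = pd.to_datetime(dividends.index, unit="s")
        dividends.sort_index(inplace=True)
        dividends.columns = ["Dividends"]
    return dividends
```

The same guard is applied identically to the `capitalGains` and `splits` branches.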
@@ -599,15 +602,40 @@ def fix_Yahoo_returning_prepost_unrequested(quotes, interval, tradingPeriods):
return quotes
def fix_Yahoo_returning_live_separate(quotes, interval, tz_exchange, repair=False, currency=None):
def _dts_in_same_interval(dt1, dt2, interval):
# Check if second date dt2 in interval starting at dt1
if interval == '1d':
last_rows_same_interval = dt1.date() == dt2.date()
elif interval == "1wk":
last_rows_same_interval = (dt2 - dt1).days < 7
elif interval == "1mo":
last_rows_same_interval = dt1.month == dt2.month
elif interval == "3mo":
shift = (dt1.month % 3) - 1
q1 = (dt1.month - shift - 1) // 3 + 1
q2 = (dt2.month - shift - 1) // 3 + 1
year_diff = dt2.year - dt1.year
quarter_diff = q2 - q1 + 4*year_diff
last_rows_same_interval = quarter_diff == 0
else:
last_rows_same_interval = (dt2 - dt1) < _pd.Timedelta(interval)
return last_rows_same_interval
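The rewritten check fixes the New Year case named in the commit message: the old `1wk` test compared `dt1.year == dt2.year and dt1.week == dt2.week`, which fails for a week straddling the year boundary. A standalone copy to illustrate (logic transcribed from the diff):

```python
import pandas as pd

def dts_in_same_interval(dt1, dt2, interval):
    # Is dt2 inside the interval that starts at dt1?
    if interval == '1d':
        return dt1.date() == dt2.date()
    if interval == '1wk':
        # Day-count comparison survives the year boundary.
        return (dt2 - dt1).days < 7
    if interval == '1mo':
        return dt1.month == dt2.month
    if interval == '3mo':
        # Quarter indices anchored so that dt1's month starts a quarter.
        shift = (dt1.month % 3) - 1
        q1 = (dt1.month - shift - 1) // 3 + 1
        q2 = (dt2.month - shift - 1) // 3 + 1
        return (q2 - q1 + 4 * (dt2.year - dt1.year)) == 0
    # Intraday intervals: rely on pandas parsing strings like '1h', '30m'.
    return (dt2 - dt1) < pd.Timedelta(interval)
```

So a live row on 2025-01-02 is now correctly grouped into the weekly interval starting 2024-12-30, even though `week` and `year` both differ.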
def fix_Yahoo_returning_live_separate(quotes, interval, tz_exchange, prepost, repair=False, currency=None):
# Yahoo bug fix. If market is open today then Yahoo normally returns
# today's data as a separate row from the rest-of-week/month interval in the row above.
# Seems to depend on the exchange, e.g. crypto is OK.
# Fix = merge them together
n = quotes.shape[0]
if n > 1:
dt1 = quotes.index[n - 1]
dt2 = quotes.index[n - 2]
if interval[-1] not in ['m', 'h']:
prepost = False
dropped_row = None
if len(quotes) > 1:
dt1 = quotes.index[-1]
dt2 = quotes.index[-2]
if quotes.index.tz is None:
dt1 = dt1.tz_localize("UTC")
dt2 = dt2.tz_localize("UTC")
@@ -618,25 +646,23 @@ def fix_Yahoo_returning_live_separate(quotes, interval, tz_exchange, repair=Fals
# - exception is volume, *slightly* greater on final row (and matches website)
if dt1.date() == dt2.date():
# Last two rows are on same day. Drop second-to-last row
dropped_row = quotes.iloc[-2]
quotes = _pd.concat([quotes.iloc[:-2], quotes.iloc[-1:]])
else:
if interval == "1wk":
last_rows_same_interval = dt1.year == dt2.year and dt1.week == dt2.week
elif interval == "1mo":
last_rows_same_interval = dt1.month == dt2.month
elif interval == "3mo":
last_rows_same_interval = dt1.year == dt2.year and dt1.quarter == dt2.quarter
else:
last_rows_same_interval = (dt1 - dt2) < _pd.Timedelta(interval)
if last_rows_same_interval:
if _dts_in_same_interval(dt2, dt1, interval):
# Last two rows are within same interval
idx1 = quotes.index[n - 1]
idx2 = quotes.index[n - 2]
idx1 = quotes.index[-1]
idx2 = quotes.index[-2]
if idx1 == idx2:
# Yahoo returning last interval duplicated, which means
# Yahoo is not returning live data (phew!)
return quotes
return quotes, None
if prepost:
# Possibly dt1 is just start of post-market
if dt1.second == 0:
# assume post-market interval
return quotes, None
ss = quotes['Stock Splits'].iloc[-2:].replace(0,1).prod()
if repair:
@@ -659,31 +685,30 @@ def fix_Yahoo_returning_live_separate(quotes, interval, tz_exchange, repair=Fals
for c in const._PRICE_COLNAMES_:
quotes.loc[idx2, c] *= 0.01
# quotes.loc[idx2, 'Stock Splits'] = 2 # wtf? why doing this?
if _np.isnan(quotes.loc[idx2, "Open"]):
quotes.loc[idx2, "Open"] = quotes["Open"].iloc[n - 1]
quotes.loc[idx2, "Open"] = quotes["Open"].iloc[-1]
# Note: nanmax() & nanmin() ignores NaNs, but still need to check not all are NaN to avoid warnings
if not _np.isnan(quotes["High"].iloc[n - 1]):
quotes.loc[idx2, "High"] = _np.nanmax([quotes["High"].iloc[n - 1], quotes["High"].iloc[n - 2]])
if not _np.isnan(quotes["High"].iloc[-1]):
quotes.loc[idx2, "High"] = _np.nanmax([quotes["High"].iloc[-1], quotes["High"].iloc[-2]])
if "Adj High" in quotes.columns:
quotes.loc[idx2, "Adj High"] = _np.nanmax([quotes["Adj High"].iloc[n - 1], quotes["Adj High"].iloc[n - 2]])
quotes.loc[idx2, "Adj High"] = _np.nanmax([quotes["Adj High"].iloc[-1], quotes["Adj High"].iloc[-2]])
if not _np.isnan(quotes["Low"].iloc[n - 1]):
quotes.loc[idx2, "Low"] = _np.nanmin([quotes["Low"].iloc[n - 1], quotes["Low"].iloc[n - 2]])
if not _np.isnan(quotes["Low"].iloc[-1]):
quotes.loc[idx2, "Low"] = _np.nanmin([quotes["Low"].iloc[-1], quotes["Low"].iloc[-2]])
if "Adj Low" in quotes.columns:
quotes.loc[idx2, "Adj Low"] = _np.nanmin([quotes["Adj Low"].iloc[n - 1], quotes["Adj Low"].iloc[n - 2]])
quotes.loc[idx2, "Adj Low"] = _np.nanmin([quotes["Adj Low"].iloc[-1], quotes["Adj Low"].iloc[-2]])
quotes.loc[idx2, "Close"] = quotes["Close"].iloc[n - 1]
quotes.loc[idx2, "Close"] = quotes["Close"].iloc[-1]
if "Adj Close" in quotes.columns:
quotes.loc[idx2, "Adj Close"] = quotes["Adj Close"].iloc[n - 1]
quotes.loc[idx2, "Volume"] += quotes["Volume"].iloc[n - 1]
quotes.loc[idx2, "Dividends"] += quotes["Dividends"].iloc[n - 1]
quotes.loc[idx2, "Adj Close"] = quotes["Adj Close"].iloc[-1]
quotes.loc[idx2, "Volume"] += quotes["Volume"].iloc[-1]
quotes.loc[idx2, "Dividends"] += quotes["Dividends"].iloc[-1]
if ss != 1.0:
quotes.loc[idx2, "Stock Splits"] = ss
quotes = quotes.drop(quotes.index[n - 1])
dropped_row = quotes.iloc[-1]
quotes = quotes.drop(quotes.index[-1])
return quotes
return quotes, dropped_row
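The function now returns a `(quotes, dropped_row)` tuple so callers can see which row was merged away. The aggregation itself — max of highs, min of lows, last close, summed volume — can be shown on toy data (a sketch of the merge step only, not the library function):

```python
import numpy as np
import pandas as pd

def merge_last_two_rows(quotes):
    # Merge the final ("live") row into the previous interval row, then
    # drop it, returning the dropped row alongside the merged frame.
    idx2 = quotes.index[-2]
    quotes.loc[idx2, "High"] = np.nanmax([quotes["High"].iloc[-1], quotes["High"].iloc[-2]])
    quotes.loc[idx2, "Low"] = np.nanmin([quotes["Low"].iloc[-1], quotes["Low"].iloc[-2]])
    quotes.loc[idx2, "Close"] = quotes["Close"].iloc[-1]
    quotes.loc[idx2, "Volume"] += quotes["Volume"].iloc[-1]
    dropped_row = quotes.iloc[-1]
    return quotes.drop(quotes.index[-1]), dropped_row
```

Returning the dropped row is what lets the caller re-append the live row later when `prepost` handling needs it kept separate.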
def safe_merge_dfs(df_main, df_sub, interval):


@@ -1 +1 @@
version = "0.2.55"
version = "0.2.56"