Compare commits: fix/user-a...0.2.63 (49 commits)
Commit SHAs:
5812aa6a69, ea810b4b35, 53bce46929, 0eb11c56b0, 2981893f30, 3250386136, aa606642c0, cc0b03efd1, 39dd87080d, 72a2fd8955, dd62ce510e, 61ceb2b1a8, 5d7a298239, 0192d2e194, 8c6eb1afeb, ef60663bc2, d15cf378a1, a9282e5739, a506838c3c, f716eec5fe, e769570f33, 3bc6bacf56, 7db82f3496, 4ac5cd87b3, 4a91008c09, de0760eec8, 55bc1bdced, b509dc9551, bb6ebb4b84, d08afa21fc, 5bd805b3f6, 54c6ac3ed7, 68b7c16162, 78ad990371, 81d8737a25, 946a84bf20, 890026c862, 22e4219ec7, ac9184bf18, 745d554aae, ec5548bd85, 63b56a799a, d1dde1814d, 4c23d339bc, 3b36df048b, 9158d3c119, 75510557ea, c5209cad3b, b8ae8f317f
.github/workflows/deploy_doc.yml (3 lines changed)
@@ -3,7 +3,8 @@ name: Build and Deploy Sphinx Docs
 on:
   push:
     branches:
-      - dev-documented
+      - main
+      # - dev-documented
   workflow_dispatch:

 jobs:
.github/workflows/ruff.yml (6 lines changed)
@@ -9,5 +9,7 @@ jobs:
   ruff:
     runs-on: ubuntu-latest
    steps:
-      - uses: actions/checkout@v3
-      - uses: chartboost/ruff-action@v1
+      - uses: actions/checkout@v4
+      - uses: astral-sh/ruff-action@v3
+        with:
+          args: check . --exclude yfinance/pricing_pb2.py
@@ -1,6 +1,40 @@
Change Log
===========

0.2.63
------
Fix download(ISIN) #2531

0.2.62
------
Fix prices 'period=max' sometimes failing #2509
ISIN cache #2516
Proxy:
- fix false 'proxy deprecated' messages
- fix ISIN + proxy #2514
- replace print_once with warnings #2523
Error handling:
- detect rate-limit during crumb fetch #2491
- replace requests.HTTPError with curl_cffi

0.2.61
------
Fix ALL type hints in websocket #2493

0.2.60
------
Fix cookie reuse, and handle DNS blocking fc.yahoo.com #2483
Fixes for websocket:
- relax protobuf version #2485
- increase websockets version #2485
- fix type hints #2488
Fix predefined screen offset #2440

0.2.59
------
Fix the fix for rate-limit #2452
Feature: live price data websocket #2201

0.2.58
------
Fix false rate-limit problem #2430
@@ -1,37 +1,23 @@
# Contributing

> [!NOTE]
> This is a brief guide to contributing to yfinance.
> See the [Developer Guide](https://ranaroussi.github.io/yfinance/development) for more information.

yfinance relies on the community to investigate bugs and contribute code.

## Changes

The list of changes can be found in the [Changelog](https://github.com/ranaroussi/yfinance/blob/main/CHANGELOG.rst)

## Running a branch

```bash
pip install git+ranaroussi/yfinance.git@dev  # dev branch
```

For more information, see the [Developer Guide](https://ranaroussi.github.io/yfinance/development/running.html).
This is a quick short guide, full guide at https://ranaroussi.github.io/yfinance/development/index.html

## Branches

YFinance uses a two-layer branch model:

* **dev**: new features & most bug-fixes merged here, tested together, conflicts fixed, etc.
* **main**: stable branch where PIP releases are created.

> [!NOTE]
> By default, branches target **main**, but most contributions should target **dev**.
> Direct merges to **main** are allowed if:
> * `yfinance` is massively broken
> * Part of `yfinance` is broken, and the fix is simple and isolated
> * Not updating the code (e.g. docs)

## Running a branch

> [!NOTE]
> For more information, see the [Developer Guide](https://ranaroussi.github.io/yfinance/development/branches.html).

```bash
pip install git+ranaroussi/yfinance.git@dev  # <- dev branch
```

https://ranaroussi.github.io/yfinance/development/running.html

### I'm a GitHub newbie, how do I contribute code?
@@ -39,39 +25,36 @@ YFinance uses a two-layer branch model:

2. Implement your change in your fork, ideally in a specific branch

3. Create a [Pull Request](https://github.com/ranaroussi/yfinance/pulls), from your fork to this project. If addressing an Issue, link to it

> [!NOTE]
> See the [Developer Guide](https://ranaroussi.github.io/yfinance/development/contributing.html) for more information.

### [How to download & run a GitHub version of yfinance](#Running-a-branch)
https://ranaroussi.github.io/yfinance/development/code.html

## Documentation website

The new docs website [ranaroussi.github.io/yfinance/index.html](https://ranaroussi.github.io/yfinance/index.html) is generated automatically from code.

> [!NOTE]
> See the [Developer Guide](https://ranaroussi.github.io/yfinance/development/documentation.html) for more information,
> including how to build and run the docs locally.

Remember to update docs when you change code, and check docs locally.

## Git tricks

Help keep the Git commit history and [network graph](https://github.com/ranaroussi/yfinance/network) compact:

* got a long descriptive commit message? `git commit -m "short sentence summary" -m "full commit message"`

* combine multiple commits into 1 with `git squash`

* `git rebase` is your friend: change base branch, or "merge in" updates

https://ranaroussi.github.io/yfinance/development/code.html#git-stuff
## Unit tests

Tests have been written using the built-in Python module `unittest`. Examples:

* Run all tests: `python -m unittest discover -s tests`

> [!NOTE]
> See the [Developer Guide](https://ranaroussi.github.io/yfinance/development/testing.html) for more information.
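The `unittest` discovery command above can also be driven programmatically, which is handy when experimenting locally; a minimal self-contained sketch (the test name here is made up, not from `tests/`):

```python
import unittest

class TestTickerShape(unittest.TestCase):
    """Hypothetical example of the unittest style used in tests/."""

    def test_symbol_upper(self):
        # yfinance tickers are conventionally upper-case
        self.assertEqual("msft".upper(), "MSFT")

def run_suite():
    # Programmatic equivalent of `python -m unittest discover`
    suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestTickerShape)
    result = unittest.TextTestRunner(verbosity=0).run(suite)
    return result.wasSuccessful()

if __name__ == "__main__":
    print(run_suite())  # -> True
```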
## Git stuff
### commits

To keep the Git commit history and [network graph](https://github.com/ranaroussi/yfinance/network) compact please follow these two rules:

* For long commit messages use this: `git commit -m "short sentence summary" -m "full commit message"`

* `squash` tiny/negligible commits back with meaningful commits, or to combine successive related commits

> [!NOTE]
> See the [Developer Guide](https://ranaroussi.github.io/yfinance/development/contributing.html#GIT-STUFF) for more information.
@@ -53,6 +53,8 @@ Install `yfinance` from PYPI using `pip`:

    $ pip install yfinance
```

### [yfinance relies on the community to investigate bugs and contribute code. Here's how you can help.](CONTRIBUTING.md)

---

![Star History Chart](https://api.star-history.com/svg?repos=ranaroussi/yfinance&type=Date)
@@ -1,47 +1,6 @@
Caching
=======

Smarter Scraping
----------------

Install the `nospam` package to cache API calls and reduce spam to Yahoo:

.. code-block:: bash

    pip install yfinance[nospam]

To use a custom `requests` session, pass a `session=` argument to
the Ticker constructor. This allows for caching calls to the API as well as a custom way to modify requests via the `User-agent` header.

.. code-block:: python

    import requests_cache
    session = requests_cache.CachedSession('yfinance.cache')
    session.headers['User-agent'] = 'my-program/1.0'
    ticker = yf.Ticker('MSFT', session=session)

    # The scraped response will be stored in the cache
    ticker.actions

Combine `requests_cache` with rate-limiting to avoid triggering Yahoo's rate-limiter/blocker that can corrupt data.

.. code-block:: python

    from requests import Session
    from requests_cache import CacheMixin, SQLiteCache
    from requests_ratelimiter import LimiterMixin, MemoryQueueBucket
    from pyrate_limiter import Duration, RequestRate, Limiter

    class CachedLimiterSession(CacheMixin, LimiterMixin, Session):
        pass

    session = CachedLimiterSession(
        limiter=Limiter(RequestRate(2, Duration.SECOND*5)),  # max 2 requests per 5 seconds
        bucket_class=MemoryQueueBucket,
        backend=SQLiteCache("yfinance.cache"),
    )

Persistent Cache
----------------
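The `CachedLimiterSession` pattern above works by cooperative multiple inheritance: each mixin wraps the session's request path and delegates down the MRO via `super()`, so a cache hit never reaches the limiter. A minimal stdlib sketch of that composition (toy class names, not the real requests_cache/requests_ratelimiter internals):

```python
class Base:
    """Stand-in for requests.Session: actually 'sends' the request."""
    def send(self, url):
        return f"response:{url}"

class CacheMixin:
    """Stand-in for requests_cache.CacheMixin."""
    def __init__(self, *a, **kw):
        super().__init__(*a, **kw)
        self._cache = {}
    def send(self, url):
        # Serve from cache when possible, else delegate down the MRO
        if url not in self._cache:
            self._cache[url] = super().send(url)
        return self._cache[url]

class LimiterMixin:
    """Stand-in for requests_ratelimiter.LimiterMixin."""
    def __init__(self, *a, **kw):
        super().__init__(*a, **kw)
        self.calls = 0
    def send(self, url):
        # Count (or throttle) every request that misses the cache
        self.calls += 1
        return super().send(url)

class CachedLimiterSession(CacheMixin, LimiterMixin, Base):
    pass  # MRO: CacheMixin -> LimiterMixin -> Base

s = CachedLimiterSession()
s.send("https://example.com/a")
s.send("https://example.com/a")  # cache hit: limiter not invoked again
```

Because `CacheMixin` sits first in the MRO, repeated requests never count against the rate limit, which is exactly why the docs combine the two mixins in that order.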
@@ -1,40 +0,0 @@
Branches
---------

To support rapid development without breaking stable versions, this project uses a two-layer branch model:

.. image:: assets/branches.png
   :alt: Branching Model

`Inspiration <https://miro.medium.com/max/700/1*2YagIpX6LuauC3ASpwHekg.png>`_

- **dev**: New features and some bug fixes are merged here. This branch allows collective testing, conflict resolution, and further stabilization before merging into the stable branch.
- **main**: Stable branch where PIP releases are created.

By default, branches target **main**, but most contributions should target **dev**.

**Exceptions**:
Direct merges to **main** are allowed if:

- `yfinance` is massively broken
- Part of `yfinance` is broken, and the fix is simple and isolated
- Not updating the code (e.g. docs)

Rebasing
--------

If asked to move your branch from **main** to **dev**:

1. Ensure all relevant branches are pulled.
2. Run:

.. code-block:: bash

    git checkout {branch}
    git rebase --onto dev main {branch}
    git push --force-with-lease origin {branch}

Running a branch
----------------

Please see `this page </development/running>`_.
doc/source/development/code.rst (new file, 90 lines)
@@ -0,0 +1,90 @@
****
Code
****

To support rapid development without breaking stable versions, this project uses a two-layer branch model:

.. image:: assets/branches.png
   :alt: Branching Model

`Inspiration <https://miro.medium.com/max/700/1*2YagIpX6LuauC3ASpwHekg.png>`_

- **dev**: New features and some bug fixes are merged here. This branch allows collective testing, conflict resolution, and further stabilization before merging into the stable branch.
- **main**: Stable branch where PIP releases are created.

By default, branches target **main**, but most contributions should target **dev**.

**Exceptions**:
Direct merges to **main** are allowed if:

- `yfinance` is massively broken
- Part of `yfinance` is broken, and the fix is simple and isolated
- Not updating the code (e.g. docs)

Creating your branch
--------------------

1. Fork the repository on GitHub. If already forked, remember to ``Sync fork``

2. Clone your forked repository:

   .. code-block:: bash

       git clone https://github.com/{user}/{repo}.git

3. Create a new branch for your feature or bug fix, from the appropriate base branch:

   .. code-block:: bash

       git checkout {base e.g. dev}
       git pull
       git checkout -b {your branch}

4. Make your changes, commit them, and push your branch to GitHub. To keep the commit history and `network graph <https://github.com/ranaroussi/yfinance/network>`_ compact, give your commits a very short summary then description:

   .. code-block:: bash

       git commit -m "short sentence summary" -m "full commit message"
       # Long message can be multiple lines (tip: copy-paste)

5. `Open a pull request on Github <https://github.com/ranaroussi/yfinance/pulls>`_.

Running a branch
----------------

Please see `this page </development/running>`_.

Git stuff
---------

- You might be asked to move your branch from ``main`` to ``dev``. This is a ``git rebase``. Remember to update **all** branches involved.

  .. code-block:: bash

      # update all branches:
      git checkout main
      git pull
      git checkout dev
      git pull
      # rebase from main to dev:
      git checkout {your branch}
      git pull
      git rebase --onto dev main {your branch}
      git push --force-with-lease origin {your branch}

- ``git rebase`` can also be used to update your branch with new commits from base, but without adding a commit to your branch history like git merge does. This keeps history clean and avoids future merge problems.

  .. code-block:: bash

      git checkout {base branch e.g. dev}
      git pull
      git checkout {your branch}
      git rebase {base}
      git push --force-with-lease origin {your branch}

- ``git squash`` tiny or negligible commits with meaningful ones, or to combine successive related commits. `git squash guide <https://docs.gitlab.com/ee/topics/git/git_rebase.html#interactive-rebase>`_

  .. code-block:: bash

      git rebase -i HEAD~2
      git push --force-with-lease origin {your branch}
@@ -1,61 +0,0 @@
********************************
Contributing to yfinance
********************************

`yfinance` relies on the community to investigate bugs and contribute code. Here's how you can help:

Contributing
------------

1. Fork the repository on GitHub. If already forked, remember to `Sync fork`
2. Clone your forked repository:

   .. code-block:: bash

       git clone https://github.com/{user}/{repo}.git

3. Create a new branch for your feature or bug fix:

   .. code-block:: bash

       git checkout -b {branch}

4. Make your changes, commit them, and push your branch to GitHub. To keep the commit history and `network graph <https://github.com/ranaroussi/yfinance/network>`_ compact:

   Use short summaries for commits

   .. code-block:: bash

       git commit -m "short summary" -m "full commit message"

   **Squash** tiny or negligible commits with meaningful ones.

   .. code-block:: bash

       git rebase -i HEAD~2
       git push --force-with-lease origin {branch}

5. Open a pull request on the `yfinance` `Github <https://github.com/ranaroussi/yfinance/pulls>`_ page.

Git stuff
---------

To keep the Git commit history and `network graph <https://github.com/ranaroussi/yfinance/network>`_ compact please follow these two rules:

- For long commit messages use this: `git commit -m "short sentence summary" -m "full commit message"`

- `squash` tiny/negligible commits back with meaningful commits, or to combine successive related commits. `Guide <https://docs.gitlab.com/ee/topics/git/git_rebase.html#interactive-rebase>`_ but basically it's:

  .. code-block:: bash

      git rebase -i HEAD~2
      git push --force-with-lease origin {branch}

rebase
~~~~~~

You might be asked to move your branch from `main` to `dev`. Make sure you have pulled **all** relevant branches then run:

.. code-block:: bash

    git checkout {branch}
    git rebase --onto dev main {branch}
    git push --force-with-lease origin {branch}
@@ -1,6 +1,6 @@
-*************************************
-Contribution to the documentation
-*************************************
+*************
+Documentation
+*************

 .. contents:: Documentation:
    :local:
@@ -36,15 +36,18 @@ To build the documentation locally, follow these steps:

 3. **View Documentation Locally**:

-   ..code-block:: bash
+   .. code-block:: bash

       python -m http.server -d ./doc/_build/html

   Then open "localhost:8000" in browser

-Building documentation on main
-------------------------------
-The documentation updates are built on merge to ``main`` branch. This is done via GitHub Actions workflow based on ``/yfinance/.github/workflows/deploy_doc.yml``.
+Publishing documentation
+------------------------
+
+Merge into ``main`` branch triggers auto-generating documentation by action ``.github/workflows/deploy_doc.yml``.
+This publishes the generated HTML into branch ``documentation``.
+
+1. Review the changes locally and push to ``dev``.
@@ -2,12 +2,12 @@
Development
===========

yfinance relies on the community to investigate bugs and contribute code. Here's how you can help:

.. toctree::
   :maxdepth: 1

   contributing
   code
   running
   documentation
   reporting_bug
   branches
   testing
@@ -1,5 +0,0 @@
********************************
Reporting a Bug
********************************

Open a new issue on our `GitHub <https://github.com/ranaroussi/yfinance/issues>`_.
@@ -5,12 +5,14 @@ With PIP
--------

.. code-block:: bash

    pip install git+https://github.com/{user}/{repo}.git@{branch}

E.g.:

.. code-block:: bash

    pip install git+https://github.com/ranaroussi/yfinance.git@feature/name

With Git
--------
@@ -1,5 +1,5 @@
 {% set name = "yfinance" %}
-{% set version = "0.2.58" %}
+{% set version = "0.2.63" %}

 package:
   name: "{{ name|lower }}"
@@ -11,5 +11,5 @@ requests_cache>=1.0
 requests_ratelimiter>=0.3.1
 scipy>=1.6.3
 curl_cffi>=0.7
-protobuf>=5.29.0,<6
-websockets>=11.0
+protobuf>=3.19.0
+websockets>=13.0
setup.py (2 lines changed)
@@ -64,7 +64,7 @@ setup(
       'platformdirs>=2.0.0', 'pytz>=2022.5',
       'frozendict>=2.3.4', 'peewee>=3.16.2',
       'beautifulsoup4>=4.11.1', 'curl_cffi>=0.7',
-      'protobuf>=5.29.0,<6', 'websockets>=11.0'],
+      'protobuf>=3.19.0', 'websockets>=13.0'],
      extras_require={
          'nospam': ['requests_cache>=1.0', 'requests_ratelimiter>=0.3.1'],
          'repair': ['scipy>=1.6.3'],
@@ -5,8 +5,8 @@ import datetime as _dt
 import sys
 import os
 import yfinance
-from requests_ratelimiter import LimiterSession
-from pyrate_limiter import Duration, RequestRate, Limiter
+# from requests_ratelimiter import LimiterSession
+# from pyrate_limiter import Duration, RequestRate, Limiter

 _parent_dp = os.path.abspath(os.path.join(os.path.dirname(__file__), '..'))
 _src_dp = _parent_dp
@@ -25,12 +25,15 @@ if os.path.isdir(testing_cache_dirpath):
     import shutil
     shutil.rmtree(testing_cache_dirpath)

-# Setup a session to only rate-limit
-history_rate = RequestRate(1, Duration.SECOND)
-limiter = Limiter(history_rate)
-session_gbl = LimiterSession(limiter=limiter)
+# Since switching to curl_cffi, the requests_ratelimiter|cache won't work.
+session_gbl = None

-# Use this instead if you also want caching:
+# # Setup a session to only rate-limit
+# history_rate = RequestRate(1, Duration.SECOND)
+# limiter = Limiter(history_rate)
+# session_gbl = LimiterSession(limiter=limiter)

+# # Use this instead if you also want caching:
 # from requests_cache import CacheMixin, SQLiteCache
 # from requests_ratelimiter import LimiterMixin
 # from requests import Session
@@ -18,9 +18,9 @@ from yfinance.exceptions import YFPricesMissingError, YFInvalidPeriodError, YFNo

 import unittest
-import requests_cache
+# import requests_cache
 from typing import Union, Any, get_args, _GenericAlias
-from urllib.parse import urlparse, parse_qs, urlencode, urlunparse
+# from urllib.parse import urlparse, parse_qs, urlencode, urlunparse

 ticker_attributes = (
     ("major_holders", pd.DataFrame),
@@ -284,36 +284,37 @@ class TestTickerHistory(unittest.TestCase):
         else:
             self.assertIsInstance(data.columns, pd.MultiIndex)

-    def test_no_expensive_calls_introduced(self):
-        """
-        Make sure calling history to get price data has not introduced more calls to yahoo than absolutely necessary.
-        As doing other type of scraping calls than "query2.finance.yahoo.com/v8/finance/chart" to yahoo website
-        will quickly trigger spam-block when doing bulk download of history data.
-        """
-        symbol = "GOOGL"
-        period = "1y"
-        with requests_cache.CachedSession(backend="memory") as session:
-            ticker = yf.Ticker(symbol, session=session)
-            ticker.history(period=period)
-            actual_urls_called = [r.url for r in session.cache.filter()]
-
-        # Remove 'crumb' argument
-        for i in range(len(actual_urls_called)):
-            u = actual_urls_called[i]
-            parsed_url = urlparse(u)
-            query_params = parse_qs(parsed_url.query)
-            query_params.pop('crumb', None)
-            query_params.pop('cookie', None)
-            u = urlunparse(parsed_url._replace(query=urlencode(query_params, doseq=True)))
-            actual_urls_called[i] = u
-        actual_urls_called = tuple(actual_urls_called)
-
-        expected_urls = [
-            f"https://query2.finance.yahoo.com/v8/finance/chart/{symbol}?interval=1d&range=1d",  # ticker's tz
-            f"https://query2.finance.yahoo.com/v8/finance/chart/{symbol}?events=div%2Csplits%2CcapitalGains&includePrePost=False&interval=1d&range={period}"
-        ]
-        for url in actual_urls_called:
-            self.assertTrue(url in expected_urls, f"Unexpected URL called: {url}")
+    # Hopefully one day we find an equivalent "requests_cache" that works with "curl_cffi"
+    # def test_no_expensive_calls_introduced(self):
+    #     """
+    #     Make sure calling history to get price data has not introduced more calls to yahoo than absolutely necessary.
+    #     As doing other type of scraping calls than "query2.finance.yahoo.com/v8/finance/chart" to yahoo website
+    #     will quickly trigger spam-block when doing bulk download of history data.
+    #     """
+    #     symbol = "GOOGL"
+    #     period = "1y"
+    #     with requests_cache.CachedSession(backend="memory") as session:
+    #         ticker = yf.Ticker(symbol, session=session)
+    #         ticker.history(period=period)
+    #         actual_urls_called = [r.url for r in session.cache.filter()]

+    #     # Remove 'crumb' argument
+    #     for i in range(len(actual_urls_called)):
+    #         u = actual_urls_called[i]
+    #         parsed_url = urlparse(u)
+    #         query_params = parse_qs(parsed_url.query)
+    #         query_params.pop('crumb', None)
+    #         query_params.pop('cookie', None)
+    #         u = urlunparse(parsed_url._replace(query=urlencode(query_params, doseq=True)))
+    #         actual_urls_called[i] = u
+    #     actual_urls_called = tuple(actual_urls_called)

+    #     expected_urls = [
+    #         f"https://query2.finance.yahoo.com/v8/finance/chart/{symbol}?interval=1d&range=1d",  # ticker's tz
+    #         f"https://query2.finance.yahoo.com/v8/finance/chart/{symbol}?events=div%2Csplits%2CcapitalGains&includePrePost=False&interval=1d&range={period}"
+    #     ]
+    #     for url in actual_urls_called:
+    #         self.assertTrue(url in expected_urls, f"Unexpected URL called: {url}")

     def test_dividends(self):
         data = self.ticker.dividends
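The URL normalisation step in the commented-out test above (stripping volatile `crumb`/`cookie` query parameters before comparing URLs) is plain `urllib`, so it can still be exercised standalone; a small sketch with an illustrative helper name:

```python
from urllib.parse import urlparse, parse_qs, urlencode, urlunparse

def strip_params(url, names=("crumb", "cookie")):
    """Remove the given query parameters, keeping the rest of the URL intact."""
    parsed = urlparse(url)
    params = parse_qs(parsed.query)
    for name in names:
        params.pop(name, None)
    # doseq=True re-expands list values (parse_qs returns lists)
    return urlunparse(parsed._replace(query=urlencode(params, doseq=True)))

url = "https://query2.finance.yahoo.com/v8/finance/chart/GOOGL?interval=1d&range=1y&crumb=abc123"
print(strip_params(url))
# -> https://query2.finance.yahoo.com/v8/finance/chart/GOOGL?interval=1d&range=1y
```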
@@ -889,8 +890,6 @@ class TestTickerAnalysts(unittest.TestCase):
         data = self.ticker.upgrades_downgrades
         self.assertIsInstance(data, pd.DataFrame, "data has wrong type")
         self.assertFalse(data.empty, "data is empty")
-        self.assertTrue(len(data.columns) == 4, "data has wrong number of columns")
-        self.assertCountEqual(data.columns.values.tolist(), ['Firm', 'ToGrade', 'FromGrade', 'Action'], "data has wrong column names")
         self.assertIsInstance(data.index, pd.DatetimeIndex, "data has wrong index type")

         data_cached = self.ticker.upgrades_downgrades
@@ -1000,7 +999,6 @@ class TestTickerInfo(unittest.TestCase):
         self.assertIsInstance(data, dict, "data has wrong type")
         expected_keys = ['industry', 'currentPrice', 'exchange', 'floatShares', 'companyOfficers', 'bid']
         for k in expected_keys:
-            print(k)
             self.assertIn("symbol", data.keys(), f"Did not find expected key '{k}' in info dict")
         self.assertEqual(self.symbols[0], data["symbol"], "Wrong symbol value in info dict")
yfinance/base.py (102 lines changed)
@@ -66,17 +66,22 @@ class TickerBase:
         if self.ticker == "":
             raise ValueError("Empty ticker name")

+        self._data: YfData = YfData(session=session)
+        if proxy is not _SENTINEL_:
+            warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
+            self._data._set_proxy(proxy)
+
         # accept isin as ticker
         if utils.is_isin(self.ticker):
             isin = self.ticker
-            self.ticker = utils.get_ticker_by_isin(self.ticker, None, session)
+            c = cache.get_isin_cache()
+            self.ticker = c.lookup(isin)
+            if not self.ticker:
+                self.ticker = utils.get_ticker_by_isin(isin)
             if self.ticker == "":
                 raise ValueError(f"Invalid ISIN number: {isin}")
-
-        self._data: YfData = YfData(session=session)
-        if proxy is not _SENTINEL_:
-            utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
-            self._data._set_proxy(proxy)
+            if self.ticker:
+                c.store(isin, self.ticker)

         # self._price_history = PriceHistory(self._data, self.ticker)
         self._price_history = None  # lazy-load
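The new constructor path above first checks an ISIN cache and only queries Yahoo on a miss. For context, ISIN detection itself is a simple format-plus-checksum test (two country letters, nine alphanumerics, one check digit, Luhn over the digitized string); a rough sketch of that idea, illustrative only and not yfinance's actual `utils.is_isin`:

```python
import re
from string import ascii_uppercase

def is_isin_like(s: str) -> bool:
    """Rough ISIN shape + checksum check. Illustration only,
    not yfinance's utils.is_isin."""
    if not re.fullmatch(r"[A-Z]{2}[A-Z0-9]{9}[0-9]", s):
        return False
    # Letters become two digits: A=10 ... Z=35
    digits = "".join(str(ascii_uppercase.index(c) + 10) if c.isalpha() else c
                     for c in s)
    # Luhn: double every second digit from the right, sum digit-wise
    total = 0
    for i, ch in enumerate(reversed(digits)):
        d = int(ch)
        if i % 2 == 1:
            d *= 2
        total += d // 10 + d % 10
    return total % 10 == 0

print(is_isin_like("US0378331005"))  # Apple's ISIN -> True
print(is_isin_like("MSFT"))         # plain ticker -> False
```

This is why `Ticker("US0378331005")` can be routed through the ISIN branch while `Ticker("MSFT")` is not: the two inputs are distinguishable by shape alone before any network call.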
@@ -176,7 +181,7 @@ class TickerBase:
         Columns: period strongBuy buy hold sell strongSell
         """
         if proxy is not _SENTINEL_:
-            utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
+            warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
             self._data._set_proxy(proxy)

         data = self._quote.recommendations

@@ -186,7 +191,7 @@ class TickerBase:

     def get_recommendations_summary(self, proxy=_SENTINEL_, as_dict=False):
         if proxy is not _SENTINEL_:
-            utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
+            warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
             self._data._set_proxy(proxy)

         return self.get_recommendations(as_dict=as_dict)

@@ -198,7 +203,7 @@ class TickerBase:
         Columns: firm toGrade fromGrade action
         """
         if proxy is not _SENTINEL_:
-            utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
+            warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
             self._data._set_proxy(proxy)

         data = self._quote.upgrades_downgrades

@@ -208,21 +213,21 @@ class TickerBase:

     def get_calendar(self, proxy=_SENTINEL_) -> dict:
         if proxy is not _SENTINEL_:
-            utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
+            warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
             self._data._set_proxy(proxy)

         return self._quote.calendar

     def get_sec_filings(self, proxy=_SENTINEL_) -> dict:
         if proxy is not _SENTINEL_:
-            utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
+            warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
             self._data._set_proxy(proxy)

         return self._quote.sec_filings

     def get_major_holders(self, proxy=_SENTINEL_, as_dict=False):
         if proxy is not _SENTINEL_:
-            utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
+            warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
             self._data._set_proxy(proxy)

         data = self._holders.major

@@ -232,7 +237,7 @@ class TickerBase:

     def get_institutional_holders(self, proxy=_SENTINEL_, as_dict=False):
         if proxy is not _SENTINEL_:
-            utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
+            warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
             self._data._set_proxy(proxy)

         data = self._holders.institutional

@@ -243,7 +248,7 @@ class TickerBase:

     def get_mutualfund_holders(self, proxy=_SENTINEL_, as_dict=False):
         if proxy is not _SENTINEL_:
-            utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
+            warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
             self._data._set_proxy(proxy)

         data = self._holders.mutualfund

@@ -254,7 +259,7 @@ class TickerBase:

     def get_insider_purchases(self, proxy=_SENTINEL_, as_dict=False):
         if proxy is not _SENTINEL_:
-            utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
+            warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
             self._data._set_proxy(proxy)

         data = self._holders.insider_purchases

@@ -265,7 +270,7 @@ class TickerBase:

     def get_insider_transactions(self, proxy=_SENTINEL_, as_dict=False):
         if proxy is not _SENTINEL_:
-            utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
+            warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
             self._data._set_proxy(proxy)

         data = self._holders.insider_transactions

@@ -276,7 +281,7 @@ class TickerBase:

     def get_insider_roster_holders(self, proxy=_SENTINEL_, as_dict=False):
         if proxy is not _SENTINEL_:
-            utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
+            warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
             self._data._set_proxy(proxy)

         data = self._holders.insider_roster

@@ -287,7 +292,7 @@ class TickerBase:

     def get_info(self, proxy=_SENTINEL_) -> dict:
         if proxy is not _SENTINEL_:
-            utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
+            warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
             self._data._set_proxy(proxy)

         data = self._quote.info

@@ -295,21 +300,16 @@ class TickerBase:

     def get_fast_info(self, proxy=_SENTINEL_):
         if proxy is not _SENTINEL_:
-            utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
+            warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
             self._data._set_proxy(proxy)

         if self._fast_info is None:
             self._fast_info = FastInfo(self)
         return self._fast_info

     @property
     def basic_info(self):
         warnings.warn("'Ticker.basic_info' is deprecated and will be removed in future, Switch to 'Ticker.fast_info'", DeprecationWarning)
|
||||
return self.fast_info
|
||||
|
||||
def get_sustainability(self, proxy=_SENTINEL_, as_dict=False):
|
||||
if proxy is not _SENTINEL_:
|
||||
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
|
||||
warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
|
||||
self._data._set_proxy(proxy)
|
||||
|
||||
data = self._quote.sustainability
|
||||
@@ -319,7 +319,7 @@ class TickerBase:
|
||||
|
||||
def get_analyst_price_targets(self, proxy=_SENTINEL_) -> dict:
|
||||
if proxy is not _SENTINEL_:
|
||||
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
|
||||
warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
|
||||
self._data._set_proxy(proxy)
|
||||
|
||||
"""
|
||||
@@ -330,7 +330,7 @@ class TickerBase:
|
||||
|
||||
def get_earnings_estimate(self, proxy=_SENTINEL_, as_dict=False):
|
||||
if proxy is not _SENTINEL_:
|
||||
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
|
||||
warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
|
||||
self._data._set_proxy(proxy)
|
||||
|
||||
"""
|
||||
@@ -342,7 +342,7 @@ class TickerBase:
|
||||
|
||||
def get_revenue_estimate(self, proxy=_SENTINEL_, as_dict=False):
|
||||
if proxy is not _SENTINEL_:
|
||||
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
|
||||
warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
|
||||
self._data._set_proxy(proxy)
|
||||
|
||||
"""
|
||||
@@ -354,7 +354,7 @@ class TickerBase:
|
||||
|
||||
def get_earnings_history(self, proxy=_SENTINEL_, as_dict=False):
|
||||
if proxy is not _SENTINEL_:
|
||||
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
|
||||
warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
|
||||
self._data._set_proxy(proxy)
|
||||
|
||||
"""
|
||||
@@ -370,7 +370,7 @@ class TickerBase:
|
||||
Columns: current 7daysAgo 30daysAgo 60daysAgo 90daysAgo
|
||||
"""
|
||||
if proxy is not _SENTINEL_:
|
||||
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
|
||||
warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
|
||||
self._data._set_proxy(proxy)
|
||||
|
||||
data = self._analysis.eps_trend
|
||||
@@ -382,7 +382,7 @@ class TickerBase:
|
||||
Columns: upLast7days upLast30days downLast7days downLast30days
|
||||
"""
|
||||
if proxy is not _SENTINEL_:
|
||||
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
|
||||
warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
|
||||
self._data._set_proxy(proxy)
|
||||
|
||||
data = self._analysis.eps_revisions
|
||||
@@ -394,7 +394,7 @@ class TickerBase:
|
||||
Columns: stock industry sector index
|
||||
"""
|
||||
if proxy is not _SENTINEL_:
|
||||
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
|
||||
warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
|
||||
self._data._set_proxy(proxy)
|
||||
|
||||
data = self._analysis.growth_estimates
|
||||
@@ -411,7 +411,7 @@ class TickerBase:
|
||||
Default is "yearly"
|
||||
"""
|
||||
if proxy is not _SENTINEL_:
|
||||
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
|
||||
warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
|
||||
self._data._set_proxy(proxy)
|
||||
|
||||
if self._fundamentals.earnings is None:
|
||||
@@ -438,7 +438,7 @@ class TickerBase:
|
||||
Default is "yearly"
|
||||
"""
|
||||
if proxy is not _SENTINEL_:
|
||||
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
|
||||
warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
|
||||
self._data._set_proxy(proxy)
|
||||
|
||||
data = self._fundamentals.financials.get_income_time_series(freq=freq)
|
||||
@@ -452,14 +452,14 @@ class TickerBase:
|
||||
|
||||
def get_incomestmt(self, proxy=_SENTINEL_, as_dict=False, pretty=False, freq="yearly"):
|
||||
if proxy is not _SENTINEL_:
|
||||
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
|
||||
warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
|
||||
self._data._set_proxy(proxy)
|
||||
|
||||
return self.get_income_stmt(proxy, as_dict, pretty, freq)
|
||||
|
||||
def get_financials(self, proxy=_SENTINEL_, as_dict=False, pretty=False, freq="yearly"):
|
||||
if proxy is not _SENTINEL_:
|
||||
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
|
||||
warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
|
||||
self._data._set_proxy(proxy)
|
||||
|
||||
return self.get_income_stmt(proxy, as_dict, pretty, freq)
|
||||
@@ -478,7 +478,7 @@ class TickerBase:
|
||||
Default is "yearly"
|
||||
"""
|
||||
if proxy is not _SENTINEL_:
|
||||
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
|
||||
warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
|
||||
self._data._set_proxy(proxy)
|
||||
|
||||
|
||||
@@ -493,7 +493,7 @@ class TickerBase:
|
||||
|
||||
def get_balancesheet(self, proxy=_SENTINEL_, as_dict=False, pretty=False, freq="yearly"):
|
||||
if proxy is not _SENTINEL_:
|
||||
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
|
||||
warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
|
||||
self._data._set_proxy(proxy)
|
||||
|
||||
return self.get_balance_sheet(proxy, as_dict, pretty, freq)
|
||||
@@ -512,7 +512,7 @@ class TickerBase:
|
||||
Default is "yearly"
|
||||
"""
|
||||
if proxy is not _SENTINEL_:
|
||||
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
|
||||
warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
|
||||
self._data._set_proxy(proxy)
|
||||
|
||||
|
||||
@@ -527,37 +527,37 @@ class TickerBase:
|
||||
|
||||
def get_cashflow(self, proxy=_SENTINEL_, as_dict=False, pretty=False, freq="yearly"):
|
||||
if proxy is not _SENTINEL_:
|
||||
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
|
||||
warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
|
||||
self._data._set_proxy(proxy)
|
||||
return self.get_cash_flow(proxy, as_dict, pretty, freq)
|
||||
|
||||
def get_dividends(self, proxy=_SENTINEL_, period="max") -> pd.Series:
|
||||
if proxy is not _SENTINEL_:
|
||||
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
|
||||
warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
|
||||
self._data._set_proxy(proxy)
|
||||
return self._lazy_load_price_history().get_dividends(period=period)
|
||||
|
||||
def get_capital_gains(self, proxy=_SENTINEL_, period="max") -> pd.Series:
|
||||
if proxy is not _SENTINEL_:
|
||||
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
|
||||
warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
|
||||
self._data._set_proxy(proxy)
|
||||
return self._lazy_load_price_history().get_capital_gains(period=period)
|
||||
|
||||
def get_splits(self, proxy=_SENTINEL_, period="max") -> pd.Series:
|
||||
if proxy is not _SENTINEL_:
|
||||
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
|
||||
warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
|
||||
self._data._set_proxy(proxy)
|
||||
return self._lazy_load_price_history().get_splits(period=period)
|
||||
|
||||
def get_actions(self, proxy=_SENTINEL_, period="max") -> pd.Series:
|
||||
if proxy is not _SENTINEL_:
|
||||
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
|
||||
warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
|
||||
self._data._set_proxy(proxy)
|
||||
return self._lazy_load_price_history().get_actions(period=period)
|
||||
|
||||
def get_shares(self, proxy=_SENTINEL_, as_dict=False) -> Union[pd.DataFrame, dict]:
|
||||
if proxy is not _SENTINEL_:
|
||||
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
|
||||
warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
|
||||
self._data._set_proxy(proxy)
|
||||
|
||||
data = self._fundamentals.shares
|
||||
@@ -570,7 +570,7 @@ class TickerBase:
|
||||
logger = utils.get_yf_logger()
|
||||
|
||||
if proxy is not _SENTINEL_:
|
||||
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
|
||||
warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
|
||||
self._data._set_proxy(proxy)
|
||||
|
||||
# Process dates
|
||||
@@ -624,7 +624,7 @@ class TickerBase:
|
||||
|
||||
def get_isin(self, proxy=_SENTINEL_) -> Optional[str]:
|
||||
if proxy is not _SENTINEL_:
|
||||
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
|
||||
warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
|
||||
self._data._set_proxy(proxy)
|
||||
|
||||
# *** experimental ***
|
||||
@@ -670,7 +670,7 @@ class TickerBase:
|
||||
logger = utils.get_yf_logger()
|
||||
|
||||
if proxy is not _SENTINEL_:
|
||||
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
|
||||
warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
|
||||
self._data._set_proxy(proxy)
|
||||
|
||||
tab_queryrefs = {
|
||||
@@ -722,7 +722,7 @@ class TickerBase:
|
||||
logger = utils.get_yf_logger()
|
||||
|
||||
if proxy is not _SENTINEL_:
|
||||
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
|
||||
warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
|
||||
self._data._set_proxy(proxy)
|
||||
|
||||
clamped_limit = min(limit, 100) # YF caps at 100, don't go higher
|
||||
@@ -783,14 +783,14 @@ class TickerBase:
|
||||
|
||||
def get_history_metadata(self, proxy=_SENTINEL_) -> dict:
|
||||
if proxy is not _SENTINEL_:
|
||||
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
|
||||
warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
|
||||
self._data._set_proxy(proxy)
|
||||
|
||||
return self._lazy_load_price_history().get_history_metadata(proxy)
|
||||
|
||||
def get_funds_data(self, proxy=_SENTINEL_) -> Optional[FundsData]:
|
||||
if proxy is not _SENTINEL_:
|
||||
utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
|
||||
warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
|
||||
self._data._set_proxy(proxy)
|
||||
|
||||
if not self._funds_data:
|
||||
|
||||
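The repeated edit above swaps `utils.print_once` for `warnings.warn`, gated on a sentinel default so the warning fires only when the caller actually passed `proxy` — even an explicit `proxy=None` triggers it, which a `None` default could not distinguish. A minimal standalone sketch of that pattern (names like `_SENTINEL_`, `set_config`, and `get_info` mirror the diff, but this is an illustration, not yfinance's code):

```python
import warnings

# Unique sentinel object: distinct from every value a caller could pass,
# including None, so "argument omitted" is detectable.
_SENTINEL_ = object()

_config = {"proxy": None}

def set_config(proxy=None):
    _config["proxy"] = proxy

def get_info(proxy=_SENTINEL_):
    if proxy is not _SENTINEL_:
        # Caller passed proxy explicitly (even None) -> deprecation path.
        warnings.warn("Set proxy via new config function: set_config(proxy=proxy)",
                      DeprecationWarning, stacklevel=2)
        set_config(proxy=proxy)
    return {"proxy": _config["proxy"]}
```

With `stacklevel=2` the warning is attributed to the caller's line rather than the library internals, which is the usual convention for deprecation warnings.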
@@ -3,13 +3,15 @@ from threading import Lock
 import os as _os
 import platformdirs as _ad
 import atexit as _atexit
-import datetime as _datetime
+import datetime as _dt
 import pickle as _pkl

 from .utils import get_yf_logger

+_cache_init_lock = Lock()


 # --------------
 # TimeZone cache
 # --------------

@@ -105,7 +107,7 @@ _atexit.register(_TzDBManager.close_db)

 tz_db_proxy = _peewee.Proxy()
-class _KV(_peewee.Model):
+class _TZ_KV(_peewee.Model):
     key = _peewee.CharField(primary_key=True)
     value = _peewee.CharField(null=True)

@@ -146,11 +148,11 @@ class _TzCache:
         db.connect()
         tz_db_proxy.initialize(db)
         try:
-            db.create_tables([_KV])
+            db.create_tables([_TZ_KV])
         except _peewee.OperationalError as e:
             if 'WITHOUT' in str(e):
-                _KV._meta.without_rowid = False
-                db.create_tables([_KV])
+                _TZ_KV._meta.without_rowid = False
+                db.create_tables([_TZ_KV])
             else:
                 raise
         self.initialised = 1  # success

@@ -166,8 +168,8 @@ class _TzCache:
             return None

         try:
-            return _KV.get(_KV.key == key).value
-        except _KV.DoesNotExist:
+            return _TZ_KV.get(_TZ_KV.key == key).value
+        except _TZ_KV.DoesNotExist:
             return None

     def store(self, key, value):

@@ -185,18 +187,18 @@ class _TzCache:
             return
         try:
             if value is None:
-                q = _KV.delete().where(_KV.key == key)
+                q = _TZ_KV.delete().where(_TZ_KV.key == key)
                 q.execute()
                 return
             with db.atomic():
-                _KV.insert(key=key, value=value).execute()
+                _TZ_KV.insert(key=key, value=value).execute()
         except _peewee.IntegrityError:
             # Integrity error means the key already exists. Try updating the key.
             old_value = self.lookup(key)
             if old_value != value:
                 get_yf_logger().debug(f"Value for key {key} changed from {old_value} to {value}.")
                 with db.atomic():
-                    q = _KV.update(value=value).where(_KV.key == key)
+                    q = _TZ_KV.update(value=value).where(_TZ_KV.key == key)
                     q.execute()

@@ -301,16 +303,16 @@ class ISODateTimeField(_peewee.DateTimeField):
     # because user discovered peewee allowed an invalid datetime
     # to get written.
     def db_value(self, value):
-        if value and isinstance(value, _datetime.datetime):
+        if value and isinstance(value, _dt.datetime):
             return value.isoformat()
         return super().db_value(value)
     def python_value(self, value):
         if value and isinstance(value, str) and 'T' in value:
-            return _datetime.datetime.fromisoformat(value)
+            return _dt.datetime.fromisoformat(value)
         return super().python_value(value)

 class _CookieSchema(_peewee.Model):
     strategy = _peewee.CharField(primary_key=True)
-    fetch_date = ISODateTimeField(default=_datetime.datetime.now)
+    fetch_date = ISODateTimeField(default=_dt.datetime.now)

     # Which cookie type depends on strategy
     cookie_bytes = _peewee.BlobField()

@@ -374,7 +376,7 @@ class _CookieCache:
         try:
             data = _CookieSchema.get(_CookieSchema.strategy == strategy)
             cookie = _pkl.loads(data.cookie_bytes)
-            return {'cookie':cookie, 'age':_datetime.datetime.now()-data.fetch_date}
+            return {'cookie':cookie, 'age':_dt.datetime.now()-data.fetch_date}
         except _CookieSchema.DoesNotExist:
             return None

@@ -415,6 +417,211 @@ def get_cookie_cache():

+# --------------
+# ISIN cache
+# --------------
+
+class _ISINCacheException(Exception):
+    pass
+
+
+class _ISINCacheDummy:
+    """Dummy cache to use if isin cache is disabled"""
+
+    def lookup(self, isin):
+        return None
+
+    def store(self, isin, tkr):
+        pass
+
+    @property
+    def tz_db(self):
+        return None
+
+
+class _ISINCacheManager:
+    _isin_cache = None
+
+    @classmethod
+    def get_isin_cache(cls):
+        if cls._isin_cache is None:
+            with _cache_init_lock:
+                cls._initialise()
+        return cls._isin_cache
+
+    @classmethod
+    def _initialise(cls, cache_dir=None):
+        cls._isin_cache = _ISINCache()
+
+
+class _ISINDBManager:
+    _db = None
+    _cache_dir = _os.path.join(_ad.user_cache_dir(), "py-yfinance")
+
+    @classmethod
+    def get_database(cls):
+        if cls._db is None:
+            cls._initialise()
+        return cls._db
+
+    @classmethod
+    def close_db(cls):
+        if cls._db is not None:
+            try:
+                cls._db.close()
+            except Exception:
+                # Must discard exceptions because Python trying to quit.
+                pass
+
+    @classmethod
+    def _initialise(cls, cache_dir=None):
+        if cache_dir is not None:
+            cls._cache_dir = cache_dir
+
+        if not _os.path.isdir(cls._cache_dir):
+            try:
+                _os.makedirs(cls._cache_dir)
+            except OSError as err:
+                raise _ISINCacheException(f"Error creating ISINCache folder: '{cls._cache_dir}' reason: {err}")
+        elif not (_os.access(cls._cache_dir, _os.R_OK) and _os.access(cls._cache_dir, _os.W_OK)):
+            raise _ISINCacheException(f"Cannot read and write in ISINCache folder: '{cls._cache_dir}'")
+
+        cls._db = _peewee.SqliteDatabase(
+            _os.path.join(cls._cache_dir, 'isin-tkr.db'),
+            pragmas={'journal_mode': 'wal', 'cache_size': -64}
+        )
+
+    @classmethod
+    def set_location(cls, new_cache_dir):
+        if cls._db is not None:
+            cls._db.close()
+            cls._db = None
+        cls._cache_dir = new_cache_dir
+
+    @classmethod
+    def get_location(cls):
+        return cls._cache_dir
+
+# close DB when Python exists
+_atexit.register(_ISINDBManager.close_db)
+
+
+isin_db_proxy = _peewee.Proxy()
+class _ISIN_KV(_peewee.Model):
+    key = _peewee.CharField(primary_key=True)
+    value = _peewee.CharField(null=True)
+    created_at = _peewee.DateTimeField(default=_dt.datetime.now)
+
+    class Meta:
+        database = isin_db_proxy
+        without_rowid = True
+
+
+class _ISINCache:
+    def __init__(self):
+        self.initialised = -1
+        self.db = None
+        self.dummy = False
+
+    def get_db(self):
+        if self.db is not None:
+            return self.db
+
+        try:
+            self.db = _ISINDBManager.get_database()
+        except _ISINCacheException as err:
+            get_yf_logger().info(f"Failed to create ISINCache, reason: {err}. "
+                                 "ISINCache will not be used. "
+                                 "Tip: You can direct cache to use a different location with 'set_isin_cache_location(mylocation)'")
+            self.dummy = True
+            return None
+        return self.db
+
+    def initialise(self):
+        if self.initialised != -1:
+            return
+
+        db = self.get_db()
+        if db is None:
+            self.initialised = 0  # failure
+            return
+
+        db.connect()
+        isin_db_proxy.initialize(db)
+        try:
+            db.create_tables([_ISIN_KV])
+        except _peewee.OperationalError as e:
+            if 'WITHOUT' in str(e):
+                _ISIN_KV._meta.without_rowid = False
+                db.create_tables([_ISIN_KV])
+            else:
+                raise
+        self.initialised = 1  # success
+
+    def lookup(self, key):
+        if self.dummy:
+            return None
+
+        if self.initialised == -1:
+            self.initialise()
+
+        if self.initialised == 0:  # failure
+            return None
+
+        try:
+            return _ISIN_KV.get(_ISIN_KV.key == key).value
+        except _ISIN_KV.DoesNotExist:
+            return None
+
+    def store(self, key, value):
+        if self.dummy:
+            return
+
+        if self.initialised == -1:
+            self.initialise()
+
+        if self.initialised == 0:  # failure
+            return
+
+        db = self.get_db()
+        if db is None:
+            return
+        try:
+            if value is None:
+                q = _ISIN_KV.delete().where(_ISIN_KV.key == key)
+                q.execute()
+                return
+
+            # Remove existing rows with same value that are older than 1 week
+            one_week_ago = _dt.datetime.now() - _dt.timedelta(weeks=1)
+            old_rows_query = _ISIN_KV.delete().where(
+                (_ISIN_KV.value == value) &
+                (_ISIN_KV.created_at < one_week_ago)
+            )
+            old_rows_query.execute()
+
+            with db.atomic():
+                _ISIN_KV.insert(key=key, value=value).execute()
+
+        except _peewee.IntegrityError:
+            # Integrity error means the key already exists. Try updating the key.
+            old_value = self.lookup(key)
+            if old_value != value:
+                get_yf_logger().debug(f"Value for key {key} changed from {old_value} to {value}.")
+                with db.atomic():
+                    q = _ISIN_KV.update(value=value, created_at=_dt.datetime.now()).where(_ISIN_KV.key == key)
+                    q.execute()
+
+
+def get_isin_cache():
+    return _ISINCacheManager.get_isin_cache()
+
+
 # --------------
 # Utils
 # --------------

 def set_cache_location(cache_dir: str):
     """
     Sets the path to create the "py-yfinance" cache folder in.

@@ -425,6 +632,7 @@ def set_cache_location(cache_dir: str):
     """
     _TzDBManager.set_location(cache_dir)
     _CookieDBManager.set_location(cache_dir)
+    _ISINDBManager.set_location(cache_dir)

 def set_tz_cache_location(cache_dir: str):
     set_cache_location(cache_dir)
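The new `_ISINCache` above stores ISIN-to-ticker pairs in a peewee-backed key-value table with a `created_at` column, and on each store it first purges rows pointing at the same ticker that are older than one week. A stdlib-only sketch of that behaviour, using `sqlite3` in place of peewee (class and column names here are illustrative, not yfinance's):

```python
import sqlite3
import datetime

class IsinCache:
    """Minimal ISIN -> ticker cache; sqlite3 stand-in for the peewee table in the diff."""

    def __init__(self, path=":memory:"):
        self.db = sqlite3.connect(path)
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS kv "
            "(key TEXT PRIMARY KEY, value TEXT, created_at TEXT)")

    def lookup(self, key):
        row = self.db.execute("SELECT value FROM kv WHERE key=?", (key,)).fetchone()
        return row[0] if row else None

    def store(self, key, value):
        now = datetime.datetime.now()
        # Purge stale rows pointing at the same ticker (older than 1 week),
        # mirroring the one_week_ago delete in the diff.
        cutoff = (now - datetime.timedelta(weeks=1)).isoformat()
        self.db.execute("DELETE FROM kv WHERE value=? AND created_at<?", (value, cutoff))
        # Upsert: insert, or refresh value + timestamp if the key already exists.
        self.db.execute(
            "INSERT INTO kv (key, value, created_at) VALUES (?,?,?) "
            "ON CONFLICT(key) DO UPDATE SET value=excluded.value, "
            "created_at=excluded.created_at",
            (key, value, now.isoformat()))
        self.db.commit()
```

The `ON CONFLICT ... DO UPDATE` upsert collapses the insert-then-update-on-IntegrityError dance the peewee version performs into a single statement; SQLite has supported it since 3.24.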
134 yfinance/data.py
@@ -10,7 +10,7 @@ from frozendict import frozendict
|
||||
from . import utils, cache
|
||||
import threading
|
||||
|
||||
from .exceptions import YFRateLimitError
|
||||
from .exceptions import YFRateLimitError, YFDataException
|
||||
|
||||
cache_maxsize = 64
|
||||
|
||||
@@ -83,13 +83,9 @@ class YfData(metaclass=SingletonMeta):
|
||||
def _set_session(self, session):
|
||||
if session is None:
|
||||
return
|
||||
with self._cookie_lock:
|
||||
self._session = session
|
||||
if self._proxy is not None:
|
||||
self._session.proxies = self._proxy
|
||||
|
||||
try:
|
||||
self._session.cache
|
||||
session.cache
|
||||
except AttributeError:
|
||||
# Not caching
|
||||
self._session_is_caching = False
|
||||
@@ -98,8 +94,16 @@ class YfData(metaclass=SingletonMeta):
|
||||
# Can't simply use a non-caching session to fetch cookie & crumb,
|
||||
# because then the caching-session won't have cookie.
|
||||
self._session_is_caching = True
|
||||
from requests_cache import DO_NOT_CACHE
|
||||
self._expire_after = DO_NOT_CACHE
|
||||
# But since switch to curl_cffi, can't use requests_cache with it.
|
||||
raise YFDataException("request_cache sessions don't work with curl_cffi, which is necessary now for Yahoo API. Solution: stop setting session, let YF handle.")
|
||||
|
||||
if not isinstance(session, requests.session.Session):
|
||||
raise YFDataException(f"Yahoo API requires curl_cffi session not {type(session)}. Solution: stop setting session, let YF handle.")
|
||||
|
||||
with self._cookie_lock:
|
||||
self._session = session
|
||||
if self._proxy is not None:
|
||||
self._session.proxies = self._proxy
|
||||
|
||||
def _set_proxy(self, proxy=None):
|
||||
with self._cookie_lock:
|
||||
@@ -133,53 +137,80 @@ class YfData(metaclass=SingletonMeta):
|
||||
if not have_lock:
|
||||
self._cookie_lock.release()
|
||||
|
||||
def _save_session_cookies(self):
|
||||
try:
|
||||
cache.get_cookie_cache().store('csrf', self._session.cookies)
|
||||
except Exception:
|
||||
@utils.log_indent_decorator
|
||||
def _save_cookie_curlCffi(self):
|
||||
if self._session is None:
|
||||
return False
|
||||
cookies = self._session.cookies.jar._cookies
|
||||
if len(cookies) == 0:
|
||||
return False
|
||||
yh_domains = [k for k in cookies.keys() if 'yahoo' in k]
|
||||
if len(yh_domains) > 1:
|
||||
# Possible when cookie fetched with CSRF method. Discard consent cookie.
|
||||
yh_domains = [k for k in yh_domains if 'consent' not in k]
|
||||
if len(yh_domains) > 1:
|
||||
utils.get_yf_logger().debug(f'Multiple Yahoo cookies, not sure which to cache: {yh_domains}')
|
||||
return False
|
||||
if len(yh_domains) == 0:
|
||||
return False
|
||||
yh_domain = yh_domains[0]
|
||||
yh_cookie = {yh_domain: cookies[yh_domain]}
|
||||
cache.get_cookie_cache().store('curlCffi', yh_cookie)
|
||||
return True
|
||||
|
||||
def _load_session_cookies(self):
|
||||
cookie_dict = cache.get_cookie_cache().lookup('csrf')
|
||||
if cookie_dict is None:
|
||||
@utils.log_indent_decorator
|
||||
def _load_cookie_curlCffi(self):
|
||||
if self._session is None:
|
||||
return False
|
||||
# Periodically refresh, 24 hours seems fair.
|
||||
if cookie_dict['age'] > datetime.timedelta(days=1):
|
||||
cookie_dict = cache.get_cookie_cache().lookup('curlCffi')
|
||||
if cookie_dict is None or len(cookie_dict) == 0:
|
||||
return False
|
||||
self._session.cookies.update(cookie_dict['cookie'])
|
||||
utils.get_yf_logger().debug('loaded persistent cookie')
|
||||
|
||||
def _save_cookie_basic(self, cookie):
|
||||
try:
|
||||
cache.get_cookie_cache().store('basic', cookie)
|
||||
except Exception:
|
||||
cookies = cookie_dict['cookie']
|
||||
domain = list(cookies.keys())[0]
|
||||
cookie = cookies[domain]['/']['A3']
|
||||
expiry_ts = cookie.expires
|
||||
if expiry_ts > 2e9:
|
||||
# convert ms to s
|
||||
expiry_ts //= 1e3
|
||||
expiry_dt = datetime.datetime.fromtimestamp(expiry_ts, tz=datetime.timezone.utc)
|
||||
expired = expiry_dt < datetime.datetime.now(datetime.timezone.utc)
|
||||
if expired:
|
||||
utils.get_yf_logger().debug('cached cookie expired')
|
||||
return False
|
||||
self._session.cookies.jar._cookies.update(cookies)
|
||||
self._cookie = cookie
|
||||
return True
|
||||
def _load_cookie_basic(self):
|
||||
cookie_dict = cache.get_cookie_cache().lookup('basic')
|
||||
if cookie_dict is None:
|
||||
return None
|
||||
# Periodically refresh, 24 hours seems fair.
|
||||
if cookie_dict['age'] > datetime.timedelta(days=1):
|
||||
return None
|
||||
utils.get_yf_logger().debug('loaded persistent cookie')
|
||||
return cookie_dict['cookie']
|
||||
|
||||
@utils.log_indent_decorator
|
||||
def _get_cookie_basic(self, timeout=30):
|
||||
if self._cookie is not None:
|
||||
utils.get_yf_logger().debug('reusing cookie')
|
||||
return True
|
||||
elif self._load_cookie_curlCffi():
|
||||
utils.get_yf_logger().debug('reusing persistent cookie')
|
||||
return True
|
||||
|
||||
# To avoid infinite recursion, do NOT use self.get()
|
||||
# - 'allow_redirects' copied from @psychoz971 solution - does it help USA?
|
||||
self._session.get(
|
||||
url='https://fc.yahoo.com',
|
||||
timeout=timeout,
|
||||
allow_redirects=True)
|
||||
try:
|
||||
self._session.get(
|
||||
url='https://fc.yahoo.com',
|
||||
timeout=timeout,
|
||||
allow_redirects=True)
|
||||
except requests.exceptions.DNSError:
|
||||
# Possible because url on some privacy/ad blocklists
|
||||
return False
|
||||
self._save_cookie_curlCffi()
|
||||
return True
|
||||
|
||||
@utils.log_indent_decorator
|
||||
def _get_crumb_basic(self, timeout=30):
|
||||
if self._crumb is not None:
|
||||
utils.get_yf_logger().debug('reusing crumb')
|
||||
return self._crumb
|
||||
|
||||
self._get_cookie_basic()
|
||||
if not self._get_cookie_basic():
|
||||
return None
|
||||
# - 'allow_redirects' copied from @psychoz971 solution - does it help USA?
|
||||
get_args = {
|
||||
'url': "https://query1.finance.yahoo.com/v1/test/getcrumb",
|
||||
@@ -192,6 +223,10 @@ class YfData(metaclass=SingletonMeta):
|
||||
else:
|
||||
crumb_response = self._session.get(**get_args)
|
||||
self._crumb = crumb_response.text
|
||||
if crumb_response.status_code == 429 or "Too Many Requests" in self._crumb:
|
||||
utils.get_yf_logger().debug(f"Didn't receive crumb {self._crumb}")
|
||||
raise YFRateLimitError()
|
||||
|
||||
if self._crumb is None or '<html>' in self._crumb:
|
||||
utils.get_yf_logger().debug("Didn't receive crumb")
|
||||
return None
|
||||
@@ -201,16 +236,17 @@ class YfData(metaclass=SingletonMeta):
|
||||
|
||||
@utils.log_indent_decorator
|
||||
def _get_cookie_and_crumb_basic(self, timeout):
|
||||
self._get_cookie_basic(timeout)
|
||||
crumb = self._get_crumb_basic(timeout)
|
||||
return crumb
|
||||
if not self._get_cookie_basic(timeout):
|
||||
return None
|
||||
return self._get_crumb_basic(timeout)
|
||||
|
||||
@utils.log_indent_decorator
|
||||
def _get_cookie_csrf(self, timeout):
|
||||
if self._cookie is not None:
|
||||
utils.get_yf_logger().debug('reusing cookie')
|
||||
return True
|
||||
|
||||
elif self._load_session_cookies():
|
||||
elif self._load_cookie_curlCffi():
|
||||
utils.get_yf_logger().debug('reusing persistent cookie')
|
||||
self._cookie = True
|
||||
return True
|
||||
@@ -270,7 +306,7 @@ class YfData(metaclass=SingletonMeta):
|
||||
# No idea why happens, but handle nicely so can switch to other cookie method.
|
||||
utils.get_yf_logger().debug('_get_cookie_csrf() encountering requests.exceptions.ChunkedEncodingError, aborting')
|
||||
self._cookie = True
|
||||
self._save_session_cookies()
|
||||
self._save_cookie_curlCffi()
|
||||
return True
|
||||
|
||||
@utils.log_indent_decorator
|
||||
@@ -295,6 +331,10 @@ class YfData(metaclass=SingletonMeta):
             r = self._session.get(**get_args)
         self._crumb = r.text
+
+        if r.status_code == 429 or "Too Many Requests" in self._crumb:
+            utils.get_yf_logger().debug(f"Didn't receive crumb {self._crumb}")
+            raise YFRateLimitError()

         if self._crumb is None or '<html>' in self._crumb or self._crumb == '':
             utils.get_yf_logger().debug("Didn't receive crumb")
             return None
@@ -328,11 +368,11 @@ class YfData(metaclass=SingletonMeta):
     @utils.log_indent_decorator
     def get(self, url, params=None, timeout=30):
         return self._make_request(url, request_method=self._session.get, params=params, timeout=timeout)

     @utils.log_indent_decorator
     def post(self, url, body, params=None, timeout=30):
         return self._make_request(url, request_method=self._session.post, body=body, params=params, timeout=timeout)

     @utils.log_indent_decorator
     def _make_request(self, url, request_method, body=None, params=None, timeout=30):
         # Important: treat input arguments as immutable.
@@ -362,7 +402,7 @@ class YfData(metaclass=SingletonMeta):

         if body:
             request_args['json'] = body

         response = request_method(**request_args)
         utils.get_yf_logger().debug(f'response code={response.status_code}')
         if response.status_code >= 400:
@@ -391,4 +431,4 @@ class YfData(metaclass=SingletonMeta):
         utils.get_yf_logger().debug(f'get_raw_json(): {url}')
         response = self.get(url, params=params, timeout=timeout)
         response.raise_for_status()
-        return response.json()
+        return response.json()
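The rate-limit check added in both crumb fetchers follows one pattern: Yahoo can signal throttling either via HTTP 429 or via a "Too Many Requests" body, so both must be inspected. A minimal standalone sketch of that pattern (the class and function names here are illustrative, not yfinance's):

```python
class RateLimitError(Exception):
    """Raised when the server signals throttling (HTTP 429 or a 'Too Many Requests' body)."""


def check_crumb_response(status_code: int, text: str) -> str:
    # The server sometimes returns 200 with an error body, so inspect
    # both the status code and the payload, mirroring the diff's check.
    if status_code == 429 or "Too Many Requests" in text:
        raise RateLimitError(f"Didn't receive crumb: {text!r}")
    if not text or '<html>' in text:
        raise ValueError("Didn't receive crumb")
    return text


# A valid crumb passes through unchanged:
assert check_crumb_response(200, "Abc123XYZ") == "Abc123XYZ"
```

Raising a dedicated exception (here `YFRateLimitError` in the diff) lets callers distinguish throttling, which is retryable, from a genuinely missing crumb.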
@@ -1,10 +1,11 @@
 from abc import ABC, abstractmethod
-from ..ticker import Ticker
-import pandas as _pd
-from typing import Dict, List, Optional
+from typing import Dict, List, Optional
+import warnings
+
+import pandas as _pd

 from ..const import _QUERY1_URL_, _SENTINEL_
 from ..data import YfData
-from ..utils import print_once
+from ..ticker import Ticker

 _QUERY_URL_ = f'{_QUERY1_URL_}/v1/finance'

@@ -26,7 +27,7 @@ class Domain(ABC):
         self.session = session
         self._data: YfData = YfData(session=session)
         if proxy is not _SENTINEL_:
-            print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
+            warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
             self._data._set_proxy(proxy)

         self._name: Optional[str] = None
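The `if proxy is not _SENTINEL_` guard repeated through these files is the standard sentinel-default pattern: it distinguishes "argument omitted" from "argument explicitly passed", including an explicit `None`. A minimal sketch of the idea (names are illustrative, not yfinance's):

```python
import warnings

_SENTINEL_ = object()  # unique marker: no caller-supplied value can be identical to it


def connect(proxy=_SENTINEL_):
    if proxy is not _SENTINEL_:
        # Fires even for proxy=None, because passing None is still a
        # deliberate caller choice -- unlike leaving the argument out.
        warnings.warn("set proxy via the config function instead",
                      DeprecationWarning, stacklevel=2)
    return "connected"
```

This is why the diff changes defaults like `proxy=None` to `proxy=_SENTINEL_`: with `None` as the default, a deprecation warning could never tell legacy callers apart from callers who simply never used the argument.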
@@ -1,12 +1,14 @@
 from __future__ import print_function
-from typing import Dict, Optional

 import pandas as _pd
+from typing import Dict, Optional
+import warnings

+from .. import utils
+from ..const import _SENTINEL_
+from ..data import YfData
+
 from .domain import Domain, _QUERY_URL_
-from .. import utils
-from ..data import YfData
-from ..const import _SENTINEL_

 class Industry(Domain):
     """
@@ -20,8 +22,9 @@ class Industry(Domain):
             session (optional): The session to use for requests.
         """
         if proxy is not _SENTINEL_:
-            utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
-            YfData(session=session, proxy=proxy)
+            warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
+            YfData(proxy=proxy)
+        YfData(session=session)
         super(Industry, self).__init__(key, session)
         self._query_url = f'{_QUERY_URL_}/industries/{self._key}'
@@ -1,9 +1,9 @@
 import datetime as dt
-
-from ..data import YfData
-from ..data import utils
-from ..const import _QUERY1_URL_, _SENTINEL_
 import json as _json
+import warnings
+
+from ..const import _QUERY1_URL_, _SENTINEL_
+from ..data import utils, YfData

 class Market:
     def __init__(self, market:'str', session=None, proxy=_SENTINEL_, timeout=30):
@@ -13,7 +13,7 @@ class Market:

         self._data = YfData(session=self.session)
         if proxy is not _SENTINEL_:
-            utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
+            warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
             self._data._set_proxy(proxy)

         self._logger = utils.get_yf_logger()
@@ -1,13 +1,14 @@
 from __future__ import print_function
-from typing import Dict, Optional
-from ..utils import dynamic_docstring, generate_list_table_from_dict
-from ..const import SECTOR_INDUSTY_MAPPING, _SENTINEL_

 import pandas as _pd
+from typing import Dict, Optional
+import warnings

+from ..const import SECTOR_INDUSTY_MAPPING, _SENTINEL_
+from ..data import YfData
+from ..utils import dynamic_docstring, generate_list_table_from_dict, get_yf_logger
+
 from .domain import Domain, _QUERY_URL_
-from .. import utils
-from ..data import YfData

 class Sector(Domain):
     """
@@ -28,7 +29,7 @@ class Sector(Domain):
             Map of sector and industry
         """
         if proxy is not _SENTINEL_:
-            utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
+            warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
             YfData(session=session, proxy=proxy)

         super(Sector, self).__init__(key, session)
@@ -147,7 +148,7 @@ class Sector(Domain):
             self._industries = self._parse_industries(data.get('industries', {}))

         except Exception as e:
-            logger = utils.get_yf_logger()
+            logger = get_yf_logger()
             logger.error(f"Failed to get sector data for '{self._key}' reason: {e}")
             logger.debug("Got response: ")
             logger.debug("-------------")
@@ -1,7 +1,7 @@
 import asyncio
 import base64
 import json
-from typing import List, Optional, Callable
+from typing import List, Optional, Callable, Union

 from websockets.sync.client import connect as sync_connect
 from websockets.asyncio.client import connect as async_connect
@@ -84,12 +84,12 @@ class AsyncWebSocket(BaseWebSocket):
                 print(f"Error in heartbeat subscription: {e}")
                 break

-    async def subscribe(self, symbols: str | List[str]):
+    async def subscribe(self, symbols: Union[str, List[str]]):
         """
         Subscribe to a stock symbol or a list of stock symbols.

         Args:
-            symbols (str | List[str]): Stock symbol(s) to subscribe to.
+            symbols (Union[str, List[str]]): Stock symbol(s) to subscribe to.
         """
         await self._connect()

@@ -109,12 +109,12 @@ class AsyncWebSocket(BaseWebSocket):
         if self.verbose:
             print(f"Subscribed to symbols: {symbols}")

-    async def unsubscribe(self, symbols: str | List[str]):
+    async def unsubscribe(self, symbols: Union[str, List[str]]):
         """
         Unsubscribe from a stock symbol or a list of stock symbols.

         Args:
-            symbols (str | List[str]): Stock symbol(s) to unsubscribe from.
+            symbols (Union[str, List[str]]): Stock symbol(s) to unsubscribe from.
         """
         await self._connect()

@@ -235,12 +235,12 @@ class WebSocket(BaseWebSocket):
         self._ws = None
         raise

-    def subscribe(self, symbols: str | List[str]):
+    def subscribe(self, symbols: Union[str, List[str]]):
         """
         Subscribe to a stock symbol or a list of stock symbols.

         Args:
-            symbols (str | List[str]): Stock symbol(s) to subscribe to.
+            symbols (Union[str, List[str]]): Stock symbol(s) to subscribe to.
         """
         self._connect()

@@ -256,12 +256,12 @@ class WebSocket(BaseWebSocket):
         if self.verbose:
             print(f"Subscribed to symbols: {symbols}")

-    def unsubscribe(self, symbols: str | List[str]):
+    def unsubscribe(self, symbols: Union[str, List[str]]):
         """
         Unsubscribe from a stock symbol or a list of stock symbols.

         Args:
-            symbols (str | List[str]): Stock symbol(s) to unsubscribe from.
+            symbols (Union[str, List[str]]): Stock symbol(s) to unsubscribe from.
         """
         self._connect()
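The `str | List[str]` to `Union[str, List[str]]` substitutions above are a compatibility fix: PEP 604 union syntax, evaluated in annotations at definition time, raises `TypeError` on Python 3.9 and older. A sketch of the portable form, with the list-normalisation these subscribe methods typically need (the function body here is illustrative, not yfinance's):

```python
from typing import List, Union


def subscribe(symbols: Union[str, List[str]]) -> List[str]:
    # Normalise the argument so downstream code always sees a list,
    # whether the caller passed "AAPL" or ["AAPL", "MSFT"].
    if isinstance(symbols, str):
        symbols = [symbols]
    return symbols


assert subscribe("AAPL") == ["AAPL"]
assert subscribe(["AAPL", "MSFT"]) == ["AAPL", "MSFT"]
```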
@@ -20,8 +20,8 @@
 #

 import json as _json
-
 import pandas as pd
+import warnings

 from . import utils
 from .const import _QUERY1_URL_, _SENTINEL_
@@ -43,12 +43,12 @@ class Lookup:
     :param raise_errors: Raise exceptions on error (default True).
     """

-    def __init__(self, query: str, session=None, proxy=None, timeout=30, raise_errors=True):
+    def __init__(self, query: str, session=None, proxy=_SENTINEL_, timeout=30, raise_errors=True):
         self.session = session
         self._data = YfData(session=self.session)

         if proxy is not _SENTINEL_:
-            utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
+            warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
             self._data._set_proxy(proxy)

         self.query = query
@@ -25,6 +25,7 @@ import logging
 import time as _time
 import traceback
 from typing import Union
+import warnings

 import multitasking as _multitasking
 import pandas as _pd
@@ -93,9 +94,15 @@ def download(tickers, start=None, end=None, actions=False, threads=True,
     logger = utils.get_yf_logger()
+    session = session or requests.Session(impersonate="chrome")
+
+    # Ensure data initialised with session.
+    if proxy is not _SENTINEL_:
+        warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=3)
+        YfData(proxy=proxy)
+    YfData(session=session)

     if auto_adjust is None:
         # Warn users that default has changed to True
-        utils.print_once("YF.download() has changed argument auto_adjust default to True")
+        warnings.warn("YF.download() has changed argument auto_adjust default to True", FutureWarning, stacklevel=3)
         auto_adjust = True

     if logger.isEnabledFor(logging.DEBUG):
@@ -127,7 +134,7 @@ def download(tickers, start=None, end=None, actions=False, threads=True,
     for ticker in tickers:
         if utils.is_isin(ticker):
             isin = ticker
-            ticker = utils.get_ticker_by_isin(ticker, session=session)
+            ticker = utils.get_ticker_by_isin(ticker)
             shared._ISINS[ticker] = isin
         _tickers_.append(ticker)
@@ -143,13 +150,6 @@ def download(tickers, start=None, end=None, actions=False, threads=True,
     shared._ERRORS = {}
     shared._TRACEBACKS = {}

-    # Ensure data initialised with session.
-    if proxy is not _SENTINEL_:
-        utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
-        YfData(session=session, proxy=proxy)
-    else:
-        YfData(session=session)
-
     # download using threads
     if threads:
         if threads is True:
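The ISIN branch in `download()` depends on first recognising ISINs among the requested tickers. An ISIN is two country letters, nine alphanumeric characters, and one check digit; a hedged sketch of a format check along those lines (illustrative only, not yfinance's exact `utils.is_isin` implementation):

```python
import re

# Two uppercase country letters, nine alphanumerics, one check digit.
_ISIN_RE = re.compile(r"^[A-Z]{2}[A-Z0-9]{9}[0-9]$")


def looks_like_isin(ticker: str) -> bool:
    # Format check only; a full validation would additionally verify
    # the Luhn-style check digit over the digitised body.
    return bool(_ISIN_RE.match(ticker))


assert looks_like_isin("US0378331005")   # a 12-character ISIN
assert not looks_like_isin("AAPL")       # a plain ticker symbol
```

Once an ISIN is detected, the diff resolves it to a ticker and records the mapping in `shared._ISINS` so results can be reported under the original identifier.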
@@ -1,22 +1,12 @@
 # -*- coding: utf-8 -*-
 # Generated by the protocol buffer compiler. DO NOT EDIT!
-# NO CHECKED-IN PROTOBUF GENCODE
-# source: yfinance/pricing.proto
-# Protobuf Python Version: 5.29.0
+# source: pricing.proto
 """Generated protocol buffer code."""
 from google.protobuf import descriptor as _descriptor
 from google.protobuf import descriptor_pool as _descriptor_pool
-from google.protobuf import runtime_version as _runtime_version
+from google.protobuf import message as _message
+from google.protobuf import reflection as _reflection
 from google.protobuf import symbol_database as _symbol_database
-from google.protobuf.internal import builder as _builder
-_runtime_version.ValidateProtobufRuntimeVersion(
-    _runtime_version.Domain.PUBLIC,
-    5,
-    29,
-    0,
-    '',
-    'yfinance/pricing.proto'
-)
 # @@protoc_insertion_point(imports)

 _sym_db = _symbol_database.Default()
@@ -24,13 +14,20 @@ _sym_db = _symbol_database.Default()


-DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(b'\n\x16yfinance/pricing.proto\"\x9a\x05\n\x0bPricingData\x12\n\n\x02id\x18\x01 \x01(\t\x12\r\n\x05price\x18\x02 \x01(\x02\x12\x0c\n\x04time\x18\x03 \x01(\x12\x12\x10\n\x08\x63urrency\x18\x04 \x01(\t\x12\x10\n\x08\x65xchange\x18\x05 \x01(\t\x12\x12\n\nquote_type\x18\x06 \x01(\x05\x12\x14\n\x0cmarket_hours\x18\x07 \x01(\x05\x12\x16\n\x0e\x63hange_percent\x18\x08 \x01(\x02\x12\x12\n\nday_volume\x18\t \x01(\x12\x12\x10\n\x08\x64\x61y_high\x18\n \x01(\x02\x12\x0f\n\x07\x64\x61y_low\x18\x0b \x01(\x02\x12\x0e\n\x06\x63hange\x18\x0c \x01(\x02\x12\x12\n\nshort_name\x18\r \x01(\t\x12\x13\n\x0b\x65xpire_date\x18\x0e \x01(\x12\x12\x12\n\nopen_price\x18\x0f \x01(\x02\x12\x16\n\x0eprevious_close\x18\x10 \x01(\x02\x12\x14\n\x0cstrike_price\x18\x11 \x01(\x02\x12\x19\n\x11underlying_symbol\x18\x12 \x01(\t\x12\x15\n\ropen_interest\x18\x13 \x01(\x12\x12\x14\n\x0coptions_type\x18\x14 \x01(\x12\x12\x13\n\x0bmini_option\x18\x15 \x01(\x12\x12\x11\n\tlast_size\x18\x16 \x01(\x12\x12\x0b\n\x03\x62id\x18\x17 \x01(\x02\x12\x10\n\x08\x62id_size\x18\x18 \x01(\x12\x12\x0b\n\x03\x61sk\x18\x19 \x01(\x02\x12\x10\n\x08\x61sk_size\x18\x1a \x01(\x12\x12\x12\n\nprice_hint\x18\x1b \x01(\x12\x12\x10\n\x08vol_24hr\x18\x1c \x01(\x12\x12\x1a\n\x12vol_all_currencies\x18\x1d \x01(\x12\x12\x15\n\rfrom_currency\x18\x1e \x01(\t\x12\x13\n\x0blast_market\x18\x1f \x01(\t\x12\x1a\n\x12\x63irculating_supply\x18  \x01(\x01\x12\x12\n\nmarket_cap\x18! \x01(\x01\x62\x06proto3')
+DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(b'\n\rpricing.proto\"\x9a\x05\n\x0bPricingData\x12\n\n\x02id\x18\x01 \x01(\t\x12\r\n\x05price\x18\x02 \x01(\x02\x12\x0c\n\x04time\x18\x03 \x01(\x12\x12\x10\n\x08\x63urrency\x18\x04 \x01(\t\x12\x10\n\x08\x65xchange\x18\x05 \x01(\t\x12\x12\n\nquote_type\x18\x06 \x01(\x05\x12\x14\n\x0cmarket_hours\x18\x07 \x01(\x05\x12\x16\n\x0e\x63hange_percent\x18\x08 \x01(\x02\x12\x12\n\nday_volume\x18\t \x01(\x12\x12\x10\n\x08\x64\x61y_high\x18\n \x01(\x02\x12\x0f\n\x07\x64\x61y_low\x18\x0b \x01(\x02\x12\x0e\n\x06\x63hange\x18\x0c \x01(\x02\x12\x12\n\nshort_name\x18\r \x01(\t\x12\x13\n\x0b\x65xpire_date\x18\x0e \x01(\x12\x12\x12\n\nopen_price\x18\x0f \x01(\x02\x12\x16\n\x0eprevious_close\x18\x10 \x01(\x02\x12\x14\n\x0cstrike_price\x18\x11 \x01(\x02\x12\x19\n\x11underlying_symbol\x18\x12 \x01(\t\x12\x15\n\ropen_interest\x18\x13 \x01(\x12\x12\x14\n\x0coptions_type\x18\x14 \x01(\x12\x12\x13\n\x0bmini_option\x18\x15 \x01(\x12\x12\x11\n\tlast_size\x18\x16 \x01(\x12\x12\x0b\n\x03\x62id\x18\x17 \x01(\x02\x12\x10\n\x08\x62id_size\x18\x18 \x01(\x12\x12\x0b\n\x03\x61sk\x18\x19 \x01(\x02\x12\x10\n\x08\x61sk_size\x18\x1a \x01(\x12\x12\x12\n\nprice_hint\x18\x1b \x01(\x12\x12\x10\n\x08vol_24hr\x18\x1c \x01(\x12\x12\x1a\n\x12vol_all_currencies\x18\x1d \x01(\x12\x12\x15\n\rfrom_currency\x18\x1e \x01(\t\x12\x13\n\x0blast_market\x18\x1f \x01(\t\x12\x1a\n\x12\x63irculating_supply\x18  \x01(\x01\x12\x12\n\nmarket_cap\x18! \x01(\x01\x62\x06proto3')


+_PRICINGDATA = DESCRIPTOR.message_types_by_name['PricingData']
+PricingData = _reflection.GeneratedProtocolMessageType('PricingData', (_message.Message,), {
+    'DESCRIPTOR' : _PRICINGDATA,
+    '__module__' : 'pricing_pb2'
+    # @@protoc_insertion_point(class_scope:PricingData)
+})
+_sym_db.RegisterMessage(PricingData)

-_globals = globals()
-_builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, _globals)
-_builder.BuildTopDescriptorsAndMessages(DESCRIPTOR, 'yfinance.pricing_pb2', _globals)
 if not _descriptor._USE_C_DESCRIPTORS:
-    DESCRIPTOR._loaded_options = None
-    _globals['_PRICINGDATA']._serialized_start=27
-    _globals['_PRICINGDATA']._serialized_end=693
+    DESCRIPTOR._options = None
+    _PRICINGDATA._serialized_start=18
+    _PRICINGDATA._serialized_end=684
 # @@protoc_insertion_point(module_scope)
@@ -1,17 +1,18 @@
+import curl_cffi
 import pandas as pd
-import requests
+import warnings

 from yfinance import utils
-from yfinance.data import YfData
 from yfinance.const import quote_summary_valid_modules, _SENTINEL_
-from yfinance.scrapers.quote import _QUOTE_SUMMARY_URL_
+from yfinance.data import YfData
 from yfinance.exceptions import YFException
+from yfinance.scrapers.quote import _QUOTE_SUMMARY_URL_

 class Analysis:

     def __init__(self, data: YfData, symbol: str, proxy=_SENTINEL_):
         if proxy is not _SENTINEL_:
-            utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
+            warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
             data._set_proxy(proxy)

         self._data = data
@@ -178,7 +179,7 @@ class Analysis:
         params_dict = {"modules": modules, "corsDomain": "finance.yahoo.com", "formatted": "false", "symbol": self._symbol}
         try:
             result = self._data.get_raw_json(_QUOTE_SUMMARY_URL_ + f"/{self._symbol}", params=params_dict)
-        except requests.exceptions.HTTPError as e:
+        except curl_cffi.requests.exceptions.HTTPError as e:
             utils.get_yf_logger().error(str(e))
             return None
         return result
@@ -12,7 +12,7 @@ class Fundamentals:

     def __init__(self, data: YfData, symbol: str, proxy=const._SENTINEL_):
         if proxy is not const._SENTINEL_:
-            utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
+            warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
             data._set_proxy(proxy)

         self._data = data
@@ -53,7 +53,7 @@ class Financials:

     def get_income_time_series(self, freq="yearly", proxy=const._SENTINEL_) -> pd.DataFrame:
         if proxy is not const._SENTINEL_:
-            utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
+            warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
             self._data._set_proxy(proxy)

         res = self._income_time_series
@@ -63,7 +63,7 @@ class Financials:

     def get_balance_sheet_time_series(self, freq="yearly", proxy=const._SENTINEL_) -> pd.DataFrame:
         if proxy is not const._SENTINEL_:
-            utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
+            warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
             self._data._set_proxy(proxy)

         res = self._balance_sheet_time_series
@@ -73,7 +73,7 @@ class Financials:

     def get_cash_flow_time_series(self, freq="yearly", proxy=const._SENTINEL_) -> pd.DataFrame:
         if proxy is not const._SENTINEL_:
-            utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
+            warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
             self._data._set_proxy(proxy)

         res = self._cash_flow_time_series
@@ -1,11 +1,11 @@
 import pandas as pd
-
-from yfinance.data import YfData
-from yfinance.const import _BASE_URL_, _SENTINEL_
-from yfinance.exceptions import YFDataException
-from yfinance import utils
-
 from typing import Dict, Optional
+import warnings

+from yfinance import utils
+from yfinance.const import _BASE_URL_, _SENTINEL_
+from yfinance.data import YfData
+from yfinance.exceptions import YFDataException

 _QUOTE_SUMMARY_URL_ = f"{_BASE_URL_}/v10/finance/quoteSummary/"

@@ -26,7 +26,7 @@ class FundsData:
         self._data = data
         self._symbol = symbol
         if proxy is not _SENTINEL_:
-            utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
+            warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
             self._data._set_proxy(proxy)

         # quoteType
@@ -1,12 +1,13 @@
+from curl_cffi import requests
+from math import isclose
+import bisect
 import datetime as _datetime
 import dateutil as _dateutil
 import logging
 import numpy as np
 import pandas as pd
-from math import isclose
 import time as _time
-import bisect
-from curl_cffi import requests
+import warnings

 from yfinance import shared, utils
 from yfinance.const import _BASE_URL_, _PRICE_COLNAMES_, _SENTINEL_
@@ -18,7 +19,7 @@ class PriceHistory:
         self.ticker = ticker.upper()
         self.tz = tz
         if proxy is not _SENTINEL_:
-            utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
+            warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=5)
             self._data._set_proxy(proxy)
         self.session = session or requests.Session(impersonate="chrome")
@@ -77,7 +78,7 @@ class PriceHistory:
         logger = utils.get_yf_logger()

         if proxy is not _SENTINEL_:
-            utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
+            warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=5)
             self._data._set_proxy(proxy)

         interval_user = interval
@@ -133,13 +134,14 @@ class PriceHistory:
             end = utils._parse_user_dt(end, tz)
         if start is None:
             if interval == "1m":
                 start = end - 604800  # 7 days
-            elif interval in ("5m", "15m", "30m", "90m"):
-                start = end - 691200  # 8 days
+            elif interval in ("2m", "5m", "15m", "30m", "90m"):
+                start = end - 5184000  # 60 days
-            elif interval in ("1h", '60m'):
+            elif interval in ("1h", "60m"):
                 start = end - 63072000  # 730 days
             else:
                 start = end - 3122064000  # 99 years
+            start += 5  # allow for processing time
         else:
             start = utils._parse_user_dt(start, tz)
         params = {"period1": start, "period2": end}
@@ -481,7 +483,7 @@ class PriceHistory:

     def get_history_metadata(self, proxy=_SENTINEL_) -> dict:
         if proxy is not _SENTINEL_:
-            utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
+            warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=3)
             self._data._set_proxy(proxy)

         if self._history_metadata is None or 'tradingPeriods' not in self._history_metadata:
@@ -496,7 +498,7 @@ class PriceHistory:

     def get_dividends(self, period="max", proxy=_SENTINEL_) -> pd.Series:
         if proxy is not _SENTINEL_:
-            utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
+            warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=3)
             self._data._set_proxy(proxy)

         df = self._get_history_cache(period=period)
@@ -507,7 +509,7 @@ class PriceHistory:

     def get_capital_gains(self, period="max", proxy=_SENTINEL_) -> pd.Series:
         if proxy is not _SENTINEL_:
-            utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
+            warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=3)
             self._data._set_proxy(proxy)

         df = self._get_history_cache(period=period)
@@ -518,7 +520,7 @@ class PriceHistory:

     def get_splits(self, period="max", proxy=_SENTINEL_) -> pd.Series:
         if proxy is not _SENTINEL_:
-            utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
+            warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=3)
             self._data._set_proxy(proxy)

         df = self._get_history_cache(period=period)
@@ -529,7 +531,7 @@ class PriceHistory:

     def get_actions(self, period="max", proxy=_SENTINEL_) -> pd.Series:
         if proxy is not _SENTINEL_:
-            utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
+            warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=3)
             self._data._set_proxy(proxy)

         df = self._get_history_cache(period=period)
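The default-`start` logic above maps each intraday interval to Yahoo's maximum lookback window (the "8 days" branch is widened to 60 days and extended to "2m"). The same mapping can be sketched as a standalone function over epoch seconds (the function name is illustrative; the constants are the ones in the diff):

```python
def default_start(end: int, interval: str) -> int:
    """Compute the default period1 (epoch seconds) for a given interval,
    per the lookback limits in the diff."""
    if interval == "1m":
        start = end - 604800          # 7 days
    elif interval in ("2m", "5m", "15m", "30m", "90m"):
        start = end - 5184000         # 60 days
    elif interval in ("1h", "60m"):
        start = end - 63072000        # 730 days
    else:
        start = end - 3122064000      # 99 years
    return start + 5                  # small allowance for processing time
```

Note that the `+ 5` is applied in every branch, matching the diff's placement of `start += 5` after the whole if/else chain.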
@@ -1,9 +1,10 @@
+import curl_cffi
 import pandas as pd
-import requests
+import warnings

 from yfinance import utils
-from yfinance.data import YfData
 from yfinance.const import _BASE_URL_, _SENTINEL_
+from yfinance.data import YfData
 from yfinance.exceptions import YFDataException

 _QUOTE_SUMMARY_URL_ = f"{_BASE_URL_}/v10/finance/quoteSummary"
@@ -15,7 +16,7 @@ class Holders:
         self._data = data
         self._symbol = symbol
         if proxy is not _SENTINEL_:
-            utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
+            warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
             data._set_proxy(proxy)

         self._major = None
@@ -73,7 +74,7 @@ class Holders:
     def _fetch_and_parse(self):
         try:
             result = self._fetch()
-        except requests.exceptions.HTTPError as e:
+        except curl_cffi.requests.exceptions.HTTPError as e:
             utils.get_yf_logger().error(str(e))

             self._major = pd.DataFrame()
@@ -1,13 +1,13 @@
+import curl_cffi
 import datetime
 import json

 import numpy as _np
 import pandas as pd
-import requests
+import warnings

 from yfinance import utils
-from yfinance.data import YfData
 from yfinance.const import quote_summary_valid_modules, _BASE_URL_, _QUERY1_URL_, _SENTINEL_
+from yfinance.data import YfData
 from yfinance.exceptions import YFDataException, YFException

 info_retired_keys_price = {"currentPrice", "dayHigh", "dayLow", "open", "previousClose", "volume", "volume24Hr"}
@@ -29,7 +29,7 @@ class FastInfo:
     def __init__(self, tickerBaseObject, proxy=_SENTINEL_):
         self._tkr = tickerBaseObject
         if proxy is not _SENTINEL_:
-            utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
+            warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
             self._tkr._data._set_proxy(proxy)

         self._prices_1y = None
@@ -490,7 +490,7 @@ class Quote:
         self._data = data
         self._symbol = symbol
         if proxy is not _SENTINEL_:
-            utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
+            warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
             self._data._set_proxy(proxy)

         self._info = None
@@ -588,7 +588,7 @@ class Quote:
         params_dict = {"modules": modules, "corsDomain": "finance.yahoo.com", "formatted": "false", "symbol": self._symbol}
         try:
             result = self._data.get_raw_json(_QUOTE_SUMMARY_URL_ + f"/{self._symbol}", params=params_dict)
-        except requests.exceptions.HTTPError as e:
+        except curl_cffi.requests.exceptions.HTTPError as e:
             utils.get_yf_logger().error(str(e))
             return None
         return result
@@ -597,7 +597,7 @@ class Quote:
         params_dict = {"symbols": self._symbol, "formatted": "false"}
         try:
             result = self._data.get_raw_json(f"{_QUERY1_URL_}/v7/finance/quote?", params=params_dict)
-        except requests.exceptions.HTTPError as e:
+        except curl_cffi.requests.exceptions.HTTPError as e:
             utils.get_yf_logger().error(str(e))
             return None
         return result
@@ -1,14 +1,14 @@
|
||||
from .query import EquityQuery as EqyQy
|
||||
from .query import FundQuery as FndQy
|
||||
from .query import QueryBase, EquityQuery, FundQuery
|
||||
import curl_cffi
|
||||
from typing import Union
|
||||
import warnings
|
||||
|
||||
from yfinance.const import _QUERY1_URL_, _SENTINEL_
|
||||
from yfinance.data import YfData
|
||||
from ..utils import dynamic_docstring, generate_list_table_from_dict_universal
|
||||
|
||||
from ..utils import dynamic_docstring, generate_list_table_from_dict_universal, print_once
|
||||
|
||||
from typing import Union
|
||||
import requests
|
||||
from .query import EquityQuery as EqyQy
|
||||
from .query import FundQuery as FndQy
|
||||
from .query import QueryBase, EquityQuery, FundQuery
|
||||
|
||||
_SCREENER_URL_ = f"{_QUERY1_URL_}/v1/finance/screener"
|
||||
_PREDEFINED_URL_ = f"{_SCREENER_URL_}/predefined/saved"
|
||||
@@ -112,7 +112,7 @@ def screen(query: Union[str, EquityQuery, FundQuery],
|
||||
"""
|
||||
|
||||
if proxy is not _SENTINEL_:
|
||||
print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
|
||||
warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
|
||||
_data = YfData(session=session, proxy=proxy)
|
||||
else:
|
||||
_data = YfData(session=session)
|
||||
@@ -135,6 +135,18 @@ def screen(query: Union[str, EquityQuery, FundQuery],
|
||||
if size is not None and size > 250:
|
||||
raise ValueError("Yahoo limits query size to 250, reduce size.")
|
||||
|
||||
if offset is not None and isinstance(query, str):
|
||||
# offset ignored by predefined API so switch to other API
|
||||
post_query = PREDEFINED_SCREENER_QUERIES[query]
|
||||
query = post_query['query']
|
||||
# use predefined's attributes if user not specified
|
||||
if sortField is None:
|
||||
sortField = post_query['sortField']
|
||||
if sortAsc is None:
|
||||
sortAsc = post_query['sortType'].lower() == 'asc'
|
||||
# and don't use defaults
|
||||
defaults = {}
|
||||
|
||||
fields = {'offset': offset, 'count': count, "size": size, 'sortField': sortField, 'sortAsc': sortAsc, 'userId': userId, 'userIdType': userIdType}
|
||||
|
||||
params_dict = {"corsDomain": "finance.yahoo.com", "formatted": "false", "lang": "en-US", "region": "US"}
|
||||
@@ -145,7 +157,7 @@ def screen(query: Union[str, EquityQuery, FundQuery],
         # Switch to Yahoo's predefined endpoint

         if size is not None:
-            print_once("YF deprecation warning: 'size' argument is deprecated for predefined screens, set 'count' instead.")
+            warnings.warn("Screen 'size' argument is deprecated for predefined screens, set 'count' instead.", DeprecationWarning, stacklevel=2)
             count = size
             size = None
         fields['count'] = fields['size']
@@ -158,7 +170,7 @@ def screen(query: Union[str, EquityQuery, FundQuery],
         resp = _data.get(url=_PREDEFINED_URL_, params=params_dict)
         try:
             resp.raise_for_status()
-        except requests.exceptions.HTTPError:
+        except curl_cffi.requests.exceptions.HTTPError:
             if query not in PREDEFINED_SCREENER_QUERIES:
                 print(f"yfinance.screen: '{query}' is probably not a predefined query.")
             raise
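The exception type changes here because `YfData` now issues requests through `curl_cffi` rather than `requests`; the handler itself keeps the pattern of printing a hint before re-raising when the failure probably means an unknown screen name. A sketch of that pattern with a stub response (no HTTP library assumed; `StubResponse`, `KNOWN_SCREENS` and `fetch_predefined` are hypothetical):

```python
# Stand-in exception and response, mimicking resp.raise_for_status().
class HTTPError(Exception):
    pass

class StubResponse:
    def __init__(self, status):
        self.status = status
    def raise_for_status(self):
        if self.status >= 400:
            raise HTTPError(f"HTTP {self.status}")

KNOWN_SCREENS = {"day_gainers", "day_losers", "most_actives"}

def fetch_predefined(name, resp):
    try:
        resp.raise_for_status()
    except HTTPError:
        # Print a hint for the likely cause, then re-raise unchanged so
        # callers still see the original HTTP failure.
        if name not in KNOWN_SCREENS:
            print(f"yfinance.screen: '{name}' is probably not a predefined query.")
        raise
    return "ok"
```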
@@ -20,6 +20,7 @@
 #

 import json as _json
+import warnings

 from . import utils
 from .const import _BASE_URL_, _SENTINEL_
@@ -52,7 +53,7 @@ class Search:
         self._data = YfData(session=self.session)

         if proxy is not _SENTINEL_:
-            utils.print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
+            warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
             self._data._set_proxy(proxy)

         self.query = query
@@ -22,17 +22,21 @@
 from __future__ import print_function

 from collections import namedtuple as _namedtuple
-from .scrapers.funds import FundsData
+import warnings

 import pandas as _pd

 from .base import TickerBase
 from .const import _BASE_URL_, _SENTINEL_
+from .scrapers.funds import FundsData


 class Ticker(TickerBase):
     def __init__(self, ticker, session=None, proxy=_SENTINEL_):
-        super(Ticker, self).__init__(ticker, session=session, proxy=proxy)
+        if proxy is not _SENTINEL_:
+            warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
+            self._data._set_proxy(proxy)
+        super(Ticker, self).__init__(ticker, session=session)
         self._expirations = {}
         self._underlying = {}
@@ -21,9 +21,10 @@
 from __future__ import print_function

+import warnings

 from . import Ticker, multi
 from .live import WebSocket
-from .utils import print_once
+from .data import YfData
 from .const import _SENTINEL_
@@ -56,8 +57,9 @@ class Tickers:
                 timeout=10, **kwargs):

         if proxy is not _SENTINEL_:
-            print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
+            warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
+            self._data._set_proxy(proxy)
             proxy = _SENTINEL_

         return self.download(
             period, interval,
@@ -75,8 +77,9 @@ class Tickers:
                 timeout=10, **kwargs):

         if proxy is not _SENTINEL_:
-            print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
+            warnings.warn("Set proxy via new config function: yf.set_config(proxy=proxy)", DeprecationWarning, stacklevel=2)
+            self._data._set_proxy(proxy)
             proxy = _SENTINEL_

         data = multi.download(self.symbols,
                               start=start, end=end,
@@ -27,7 +27,7 @@ import re
 import re as _re
 import sys as _sys
 import threading
-from functools import lru_cache, wraps
+from functools import wraps
 from inspect import getmembers
 from types import FunctionType
 from typing import List, Optional
@@ -50,13 +50,6 @@ def attributes(obj):
             if name[0] != '_' and name not in disallowed_names and hasattr(obj, name)}


-@lru_cache(maxsize=20)
-def print_once(msg):
-    # 'warnings' module suppression of repeat messages does not work.
-    # This function replicates correct behaviour
-    print(msg)
-
-
 # Logging
 # Note: most of this logic is adding indentation with function depth,
 # so that DEBUG log is readable.
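The deleted `print_once` helper is a neat trick worth recording: memoising a print function with `lru_cache` means a repeated message is a cache hit, so the body runs at most once per distinct message (its docstring notes that the stdlib `warnings` de-duplication did not work reliably here). A self-contained reproduction of the removed helper, with stdout captured to show the effect:

```python
from functools import lru_cache
import io
from contextlib import redirect_stdout

@lru_cache(maxsize=20)
def print_once(msg):
    # lru_cache memoises on msg: a repeated message is a cache hit,
    # so print() only runs the first time each message is seen.
    print(msg)

buf = io.StringIO()
with redirect_stdout(buf):
    print_once("hello")
    print_once("hello")   # cache hit: nothing printed
    print_once("world")
```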
@@ -181,18 +174,14 @@ def is_isin(string):
     return bool(_re.match("^([A-Z]{2})([A-Z0-9]{9})([0-9])$", string))


-def get_all_by_isin(isin, proxy=const._SENTINEL_, session=None):
+def get_all_by_isin(isin):
     if not (is_isin(isin)):
         raise ValueError("Invalid ISIN number")

-    if proxy is not const._SENTINEL_:
-        print_once("YF deprecation warning: set proxy via new config function: yf.set_config(proxy=proxy)")
-        proxy = None
-
     # Deferred this to prevent circular imports
     from .search import Search

-    search = Search(query=isin, max_results=1, session=session, proxy=proxy)
+    search = Search(query=isin, max_results=1)

     # Extract the first quote and news
     ticker = search.quotes[0] if search.quotes else {}
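The `is_isin` check shown in context above is a pure shape test: two uppercase country letters, nine alphanumeric NSIN characters, and a final digit. It does not verify the Luhn check digit, so a well-formed but invalid ISIN still passes. A standalone copy of the regex for illustration:

```python
import re

def is_isin(string):
    # Shape check only: 2-letter country code, 9-char alphanumeric NSIN,
    # and a trailing check digit. The Luhn checksum is NOT verified.
    return bool(re.match("^([A-Z]{2})([A-Z0-9]{9})([0-9])$", string))
```

Because the match is case-sensitive and anchored at both ends, lowercase input or a wrong length fails immediately.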
@@ -210,18 +199,18 @@ def get_all_by_isin(isin, proxy=const._SENTINEL_, session=None):
     }


-def get_ticker_by_isin(isin, proxy=const._SENTINEL_, session=None):
-    data = get_all_by_isin(isin, proxy, session)
+def get_ticker_by_isin(isin):
+    data = get_all_by_isin(isin)
     return data.get('ticker', {}).get('symbol', '')


-def get_info_by_isin(isin, proxy=const._SENTINEL_, session=None):
-    data = get_all_by_isin(isin, proxy, session)
+def get_info_by_isin(isin):
+    data = get_all_by_isin(isin)
     return data.get('ticker', {})


-def get_news_by_isin(isin, proxy=const._SENTINEL_, session=None):
-    data = get_all_by_isin(isin, proxy, session)
+def get_news_by_isin(isin):
+    data = get_all_by_isin(isin)
     return data.get('news', {})
@@ -1 +1 @@
-version = "0.2.58"
+version = "0.2.63"