Volatility/Volume Impact
We often hear statements such as "follow the big volume to project possible price movements" or "low volatility is good for the trend." How much of this is statistically true for different markets? I wrote this small script to study the impact of volatility and volume on price movements.
Concept is as below:
Compare volume with a reference median value. You can also use moving average or other types for this comparison.
If volume is higher than the median, increment the positive-value impact by the change in close price. If volume is lower than the median, increment the negative-value impact by the change in close price.
From this we derive pvd and nvd: pvd measures the price change on bars where volume is above the median, while nvd measures the price change on bars where volume is below the median.
Calculate the correlation of pvd and nvd with the close price to see which has the greater impact on price.
Colors are applied to the plots based on which has the higher correlation to price movement. For example, if pvd has the higher correlation, pvd is coloured green and nvd silver. Similarly, if nvd has the higher correlation, nvd is coloured red and pvd silver.
The same calculation is also applied to volatility.
With this, you can observe how price changes correlate with high/low volume and volatility.
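A minimal Pine v5 sketch of the idea, assuming a simple rolling median as the volume reference and a fixed correlation length (names, lengths and colours here are illustrative, not the published script's actual inputs):
```
//@version=5
indicator("Volume Impact sketch")

lookback = input.int(200, "Median lookback")
corrLen  = input.int(100, "Correlation length")

volMedian = ta.median(volume, lookback)
chg       = close - nz(close[1], close)   // bar-to-bar change, 0 on the first bar

// Accumulate price change separately for high-volume and low-volume bars
var float pvd = 0.0
var float nvd = 0.0
pvd += volume > volMedian ? chg : 0.0
nvd += volume <= volMedian ? chg : 0.0

// Colour the series with the stronger correlation to price
pvdCorr = ta.correlation(pvd, close, corrLen)
nvdCorr = ta.correlation(nvd, close, corrLen)

plot(pvd, "pvd", pvdCorr >= nvdCorr ? color.green : color.silver)
plot(nvd, "nvd", nvdCorr > pvdCorr ? color.red : color.silver)
```
The same structure can be repeated for volatility by swapping volume for a volatility measure such as true range.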
Let us see some examples on different markets.
Example 1: AMEX:SPY
From the chart snapshot below, it looks evident that SPY always thrives when there is low volatility and LOW VOLUME!!
Example 2: NASDAQ:TSLA
The picture will be different if you look at individual stocks. For Tesla, the price movement is more correlated to high volume (unlike SPY, where low-volume days define the trend).
Example 3: KUCOIN:BTCUSDT
Unlike stocks and indices, high volatility has defined the trend for BTC for a long time; it thrived when volatility was higher. We can also see that high volume is still a major influence on BTC price movements.
Settings are very simple and self-explanatory.
Hint: You can also move the indicator to chart overlay for better visualisation of comparison with close price.
Statistics
SIMPLE CANDLESTICK PATTERN ALGO BACKTESTING - TESLA 4H
Many traders spend a lot of time creating algorithms full of unrealistic indicators and market conditions that are far from reality. With this script I want to help traders understand the advantage of the Pine language. Using indicators with no statistical foundation, and creating algorithms with technical indicators and thousands of conditions, is not always the right way to build an efficient tool.
With this script, which we have called "SimpleBarPattern_LongOnly", we analyse the market through a simple condition based on bars or candles.
How it works
The condition is constructed as follows. You go long with 100% of the established capital and a 0.03% commission. The first condition is that the low of the period under analysis falls below the opening level. The second condition is that the low of the period is below the low of the previous period. The third condition is that the close of the period is above the opening level. The final condition requires the current close to be higher than the previous open and higher than the previous close. We used a statistical approach in creating this script; some candlestick patterns that reflect these conditions are the Bullish Engulfing, Bullish Hammer and Morning Star.
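A hedged Pine v5 sketch of the entry conditions described above (variable names are illustrative; the stop loss, take profit and date-range inputs of the published script are omitted for brevity):
```
//@version=5
strategy("SimpleBarPattern_LongOnly sketch", overlay=true, default_qty_type=strategy.percent_of_equity, default_qty_value=100, commission_type=strategy.commission.percent, commission_value=0.03)

cond1 = low < open                              // low of the bar falls below its open
cond2 = low < low[1]                            // low is below the previous bar's low
cond3 = close > open                            // bar closes above its open
cond4 = close > open[1] and close > close[1]    // close exceeds the previous open and close

if cond1 and cond2 and cond3 and cond4
    strategy.entry("Long", strategy.long)
```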
This strategy aims to help traders make more accurate decisions while using candlesticks for their trading and scientifically demonstrates that candlesticks are valid statistical tools for financial analysis.
"SimpleBarPattern_LongOnly" is a very lightweight script created with Pine v5. We developed a user interface that can adjust the analysis period from a few days to several years.
The initial capital set is €1,000 (You can change this from the "Properties" section of the user interface).
Each individual trade uses 100% of the set capital, in this case €1,000.
The default commission per trade is 0.03% (You can change this in the "Properties" section of the user interface).
User Interface
1) General backtest time settings: Set the history period to be analysed
StartDate: backtest start day
StartMonth: backtest start month
StartYear: backtest start year
EndDate: backtest end day
EndMonth: backtest end month
EndYear: backtest end year
3) Stop Loss
4) Take Profit
Please do not hesitate to contact us for any questions or information.
Disclaimer
Be careful: the past is not a guarantee of future performance, so remember to use the script as a pure analysis tool. The developer takes no responsibility for any use other than research and analysis and can in no way be held liable for damages resulting from misuse of this code.
Co-relation and St-deviation Strategy - BNB/USDT 15min
This indicator is based on statistical analysis. It uses standard deviation and its correlation to price action to generate signals. The following indicators are used to calculate the standard deviation and correlation values (a minimal sketch of the idea follows the list below). Finally, it is capable of identifying market bottoms to pick the most suitable entry points.
1. Parabolic SAR (parabolic stop and reverse)
2. Supertrend
3. Relative strength index (RSI)
4. Money flow index (MFI)
5. Balance of Power
6. Chande Momentum Oscillator
7. Center of Gravity (COG)
8. Directional Movement Index (DMI)
9. Stochastic
10. Symmetrically weighted moving average with fixed length
11. True strength index (TSI)
12. Williams %R
13. Accumulation/distribution index
14. Intraday Intensity Index
15. Negative Volume Index
16. Positive Volume Index
17. On Balance Volume
18. Price-Volume Trend
19. True range
20. Volume-weighted average price
21. Williams Accumulation/Distribution
22. Williams Variable Accumulation/Distribution
23. Simple Moving Average
24. Exponential Moving Average
25. CCI (commodity channel index)
26. Chop Zone
27. Ease of Movement
28. Detrended Price Oscillator
29. Advance Decline Line
30. Bull Bear Power
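The sketch below illustrates only the general idea named above - correlating one of the listed indicators with price and gating signals by a standard-deviation band. The published strategy's actual component weighting, lengths and thresholds are not shown here, so treat every name and threshold in this snippet as an assumption:
```
//@version=5
indicator("Correlation / StDev sketch", overlay=true)

len = input.int(50, "Length")

// Example component: RSI correlated with price over the lookback
rsiVal  = ta.rsi(close, 14)
rsiCorr = ta.correlation(rsiVal, close, len)

// Standard-deviation band around the mean of price
mean = ta.sma(close, len)
dev  = ta.stdev(close, len)

buySignal  = rsiCorr > 0 and close < mean - dev   // price stretched below the band while the oscillator tracks price
sellSignal = rsiCorr > 0 and close > mean + dev

plotshape(buySignal,  style=shape.triangleup,   location=location.belowbar, color=color.green)
plotshape(sellSignal, style=shape.triangledown, location=location.abovebar, color=color.red)
```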
Performance Table From Open
This indicator plots the percentage performance from the open of up to 20 different customizable tickers.
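A minimal sketch of the per-ticker calculation, assuming "from open" means from the current bar's open on the requested symbol; the 20-ticker table layout of the published script is not reproduced here:
```
//@version=5
indicator("Performance From Open sketch")

sym = input.symbol("NASDAQ:AAPL", "Ticker")

// Percentage move from the ticker's own open to its latest close
[o, c] = request.security(sym, timeframe.period, [open, close])
perfPct = (c - o) / o * 100

plot(perfPct, "Perf % from open")
```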
Enjoy!
Financial Growth
This indicator acquires the financial data provided by TradingView.
The data is compared between Quarter, Annual and TTM in terms of percent growth.
YoY, QoQ and CAGR are also available from this script (the minimum is 4).
In addition, plotting of data, labels and a table are available (you can check the mark to toggle them on/off).
Data : Revenue, Net Income, EBITDA, EPS, DVPS, Free Cash Flow and Forward PE .
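A hedged sketch of one growth calculation using Pine's request.financial(); the field id and period strings below are assumptions about how such a script could pull quarterly revenue, not this script's exact code:
```
//@version=5
indicator("Financial Growth sketch")

// Quarterly total revenue for the chart's symbol (field id assumed)
rev = request.financial(syminfo.tickerid, "TOTAL_REVENUE", "FQ")

// The series only steps when a new report arrives, so take the prior distinct value
changed = not na(rev) and not na(rev[1]) and rev != rev[1]
prevRev = ta.valuewhen(changed, rev[1], 0)

// Quarter-over-quarter growth in percent
qoq = (rev - prevRev) / math.abs(prevRev) * 100

plot(qoq, "Revenue QoQ %")
```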
How to use it.
Just select the financial data, period and size of data to compare.
You can check the box to toggle the plotted line, label and table.
Enjoy.
Position & Lot Size Calculator
Built with love: "Position & Lot Size Calculator".
This indicator will help you calculate your position size for managing risk.
Features :
1. Click-able Price Entry & SL (Easier Interface)
How to use it :
1. After adding the indicator, set the Entry & SL price by clicking the price line on the chart
2. Set the risk and the other parameters
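A minimal sketch of the position-size arithmetic behind such a calculator (inputs and labels here are illustrative, not the published script's interface):
```
//@version=5
indicator("Position Size sketch", overlay=true)

equity  = input.float(10000.0, "Account size")
riskPct = input.float(1.0,     "Risk per trade (%)")
entry   = input.price(0.0,     "Entry price")
stop    = input.price(0.0,     "Stop-loss price")

riskAmount  = equity * riskPct / 100
perUnitRisk = math.abs(entry - stop)

// Quantity such that hitting the stop loses exactly riskAmount
qty = perUnitRisk > 0 ? riskAmount / perUnitRisk : na

plot(qty, "Position size (units)", display=display.data_window)
```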
Regards,
Hanabil
Days to expected ROI
Counts days to the expected Return On Investment from any given time.
Works only with "1 Day" resolution
Compound Indicator Strategy - BTC/USDT 3h
This strategy finds and utilises the end points of short-term market trends. It is a combination of many indicators, such as:
1. Volume change oscillator
2. Money flow index ( MFI )
3. Momentum Oscillator (MOM)
4. Stochastic Indicator
6. Relative Strength Indicator ( RSI )
7. Relative volatility index (RVI)
8. Balance of power (BOP)
9. Simple moving average ( SMA )
10. Exponential moving average ( EMA )
11. Parabolic SAR
12. Super trend indicator
This script forms a compound indicator after analysing the movements of those indicators through different time frames and measuring their correlation and variance with the price action. By doing that, the indicator is in a position to identify short-term market reversals.
After generating the compound indicator, it evaluates standard deviation and variance against the current market price action and generates buy and sell signals. You can determine your own trading method based on the available options.
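One plausible reading of that approach, as a sketch only: normalise a few of the listed oscillators, average them into a compound series, and flag reversals when the compound crosses back through a one-standard-deviation band. The components, lengths and thresholds below are assumptions, not the published strategy's code:
```
//@version=5
indicator("Compound Indicator sketch")

len = input.int(50, "Stat length")

// A few oscillators that already share a roughly 0-100 scale
rsiVal   = ta.rsi(close, 14)
mfiVal   = ta.mfi(hlc3, 14)
stochVal = ta.stoch(close, high, low, 14)

compound = (rsiVal + mfiVal + stochVal) / 3

mean = ta.sma(compound, len)
dev  = ta.stdev(compound, len)

buySignal  = ta.crossover(compound, mean - dev)    // recovering from a stretched low
sellSignal = ta.crossunder(compound, mean + dev)   // rolling over from a stretched high

plot(compound, "Compound")
plotshape(buySignal,  style=shape.triangleup,   location=location.bottom, color=color.green)
plotshape(sellSignal, style=shape.triangledown, location=location.top,    color=color.red)
```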
Relative Standard Deviation
Standard Deviation is a common measure of volatility (the dispersion of data relative to its mean). However, when using it as an indicator, it can be more useful at times to know the deviation relative to the price as a percentage rather than the raw value. This normalizes the data so that it is easier to compare the deviation of different assets. By definition, standard deviation is the square root of the variance; under a normal distribution, price is within one standard deviation of the mean about 68.2% of the time.
What it does : This indicator will tell you the standard deviation of the asset relative to its price (as a %), but also has the option to plot the normal (population) standard deviation.
Example use case : The regular standard deviation of Asset A is $12 and Asset B is $10. Which one is more volatile? Well, it depends on the asset price. If asset A just closed at $900 and asset B just closed at $30, that makes a big difference. In this instance Asset A $12/$900=1.33% (standard deviation relative to the asset price). Asset B $10/$30=33.33% (standard deviation relative to the asset price). Using a normal standard deviation indicator, you would just see that the standard deviation of Asset A is higher as a hard dollar value, when the reality is that Asset A is much less volatile.
How to use it : This indicator plots a blue line by default that is the Relative Standard Deviation of the asset compared to the asset price (a %). There is also an option to turn on / plot regular (population) Standard Deviation, which will plot as a purple line. The mean length used for the average, and the lookback period that the indicator uses to calculate, are both adjustable with inputs.
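A minimal sketch of the core calculation described above (input names and colours are illustrative):
```
//@version=5
indicator("Relative Standard Deviation sketch")

len = input.int(20, "Length")

dev    = ta.stdev(close, len)
relDev = dev / close * 100          // deviation as a percent of price

plot(relDev, "Relative StDev %",  color=color.blue)
plot(dev,    "StDev (absolute)",  color=color.purple, display=display.none)
```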
Leverage Calculator
This script is intended to be used as a risk management calculator.
It will calculate the best leverage to use based on the maximum percentage of loss you are willing to incur on your trading portfolio.
It also calculates the order value and order quantity based on your inputs.
Please note this calculator does not take into account any trading fees imposed by the exchange you are using.
*** Only risking 1% to 5% of your portfolio is considered good risk management ***
*** Not financial advice ***
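A minimal sketch of the arithmetic such a calculator performs, assuming leverage is sized so that hitting the stop loss equals the chosen maximum loss; exchange fees are ignored, as noted above, and variable names are illustrative:
```
//@version=5
indicator("Leverage Calculator sketch", overlay=true)

portfolio  = input.float(1000.0, "Portfolio Size")
maxLossPct = input.float(2.0,    "% Willing to lose on this trade")
entry      = input.price(0.0,    "Entry Price")
stop       = input.price(0.0,    "Stop Loss Price")

stopPct    = math.abs(entry - stop) / entry * 100        // distance to stop in percent
leverage   = stopPct > 0 ? maxLossPct / stopPct : na     // leverage so the stop equals the max loss
orderValue = portfolio * leverage
orderQty   = entry > 0 ? orderValue / entry : na

plot(leverage, "Leverage", display=display.data_window)
```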
------ Settings Inputs -----------------------------------------------------------------------------------------------------
"Portfolio Size" -- enter your portfolio balance
"% Willing to lose on this trade" -- enter the percent of your portfolio you are willing to lose if the stop loss is hit
"Entry Price" -- enter the price at which you will enter the trade
"Stop Loss Price" -- enter the price at which your stop loss will be set
----------------------------------------------------------------------------------------------------------------------------
------ Outputs -------------------------------------------------------------------------------------------------------------
"Portfolio" -- displays the portfolio balance entered in settings
"max loss on trade" -- displays the % loss entered in settings and the corresponding amount of your portfolio
"Entry Price" -- displays the entry price entered in settings
"Stop Loss Price" -- displays the stop loss price entered in settings
"Stop Loss %" -- displays the calculated percentage loss from the entry price
"Leverage calc" -- displays the calculated leverage based on your max loss and stop loss settings
"Order Value" -- displays the value of the order based on the calculated leverage
"Order Qty" -- displays the calculated order qty based on the calculated leverage
VOLD-MarketBreadth-Ratio
This script provides the NASDAQ and NYSE Up Volume (volume in rising stocks) to Down Volume (volume in falling stocks) ratio. If Up Volume is higher than Down Volume, you will see a green label with a ratio, e.g. 3.5:1, meaning Up Volume is 3.5 times higher than Down Volume - positive market breadth. If Down Volume is higher than Up Volume, you will see a red label with a ratio, e.g. -4.5:1, meaning Down Volume is 4.5 times higher than Up Volume.
If the ratio is 1:1, market breadth is considered neutral.
PS: Currently TradingView provides only NASDAQ Composite market volume data. I have requested that they provide primary NASDAQ volume data. If they respond with a new ticker for primary NQ data, I will update the script and publish the updated version. So if you have a similar table on ToS, you may see a minor difference in the NQ ratio.
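A sketch of how such a ratio can be computed in Pine; the USI:UVOL / USI:DVOL symbols below are an assumption about available NYSE up/down-volume feeds, not necessarily the ones the published script uses:
```
//@version=5
indicator("Market Breadth Ratio sketch")

// NYSE up-volume and down-volume (symbols assumed)
upVol   = request.security("USI:UVOL", timeframe.period, close)
downVol = request.security("USI:DVOL", timeframe.period, close)

// Positive ratio when up-volume dominates, negative when down-volume dominates
ratio = upVol >= downVol ? upVol / math.max(downVol, 1) : -(downVol / math.max(upVol, 1))

plot(ratio, "UVOL:DVOL", color=ratio >= 0 ? color.green : color.red)
```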
Max drawdown days
A friendly reminder to myself and the rest of the traders that the market can stay low for a prolonged time!!
Details are pretty simple.
Here is small comparison of the stats for major US indices.
Can also be applied to stocks. Cells are highlighted with a red background if:
Drawdown/Recovery is still in progress for historical stats
Current drawdown bars/recovery bars are higher than the median of the all-time stats
MathProbabilityDistribution
Library "MathProbabilityDistribution"
Probability Distribution Functions.
name(idx) Indexed names helper function.
Parameters:
idx : int, position in the range (0, 6).
Returns: string, distribution name.
usage:
.name(1)
Notes:
(0) => 'StdNormal'
(1) => 'Normal'
(2) => 'Skew Normal'
(3) => 'Student T'
(4) => 'Skew Student T'
(5) => 'GED'
(6) => 'Skew GED'
zscore(position, mean, deviation) Z-score helper function for x calculation.
Parameters:
position : float, position.
mean : float, mean.
deviation : float, standard deviation.
Returns: float, z-score.
usage:
.zscore(1.5, 2.0, 1.0)
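For reference, this is the standard z-score:
z = \frac{x - \mu}{\sigma}
where x is the position, \mu the mean and \sigma the standard deviation.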
std_normal(position) Standard Normal Distribution.
Parameters:
position : float, position.
Returns: float, probability density.
usage:
.std_normal(0.6)
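For reference, the standard normal density evaluated here is:
f(x) = \frac{1}{\sqrt{2\pi}}\, e^{-x^{2}/2}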
normal(position, mean, scale) Normal Distribution.
Parameters:
position : float, position in the distribution.
mean : float, mean of the distribution, default=0.0 for standard distribution.
scale : float, scale of the distribution, default=1.0 for standard distribution.
Returns: float, probability density.
usage:
.normal(0.6)
skew_normal(position, skew, mean, scale) Skew Normal Distribution.
Parameters:
position : float, position in the distribution.
skew : float, skewness of the distribution.
mean : float, mean of the distribution, default=0.0 for standard distribution.
scale : float, scale of the distribution, default=1.0 for standard distribution.
Returns: float, probability density.
usage:
.skew_normal(0.8, -2.0)
ged(position, shape, mean, scale) Generalized Error Distribution.
Parameters:
position : float, position.
shape : float, shape.
mean : float, mean, default=0.0 for standard distribution.
scale : float, scale, default=1.0 for standard distribution.
Returns: float, probability.
usage:
.ged(0.8, -2.0)
skew_ged(position, shape, skew, mean, scale) Skew Generalized Error Distribution.
Parameters:
position : float, position.
shape : float, shape.
skew : float, skew.
mean : float, mean, default=0.0 for standard distribution.
scale : float, scale, default=1.0 for standard distribution.
Returns: float, probability.
usage:
.skew_ged(0.8, 2.0, 1.0)
student_t(position, shape, mean, scale) Student-T Distribution.
Parameters:
position : float, position.
shape : float, shape.
mean : float, mean, default=0.0 for standard distribution.
scale : float, scale, default=1.0 for standard distribution.
Returns: float, probability.
usage:
.student_t(0.8, 2.0, 1.0)
skew_student_t(position, shape, skew, mean, scale) Skew Student-T Distribution.
Parameters:
position : float, position.
shape : float, shape.
skew : float, skew.
mean : float, mean, default=0.0 for standard distribution.
scale : float, scale, default=1.0 for standard distribution.
Returns: float, probability.
usage:
.skew_student_t(0.8, 2.0, 1.0)
select(distribution, position, mean, scale, shape, skew, log) Conditional Distribution.
Parameters:
distribution : string, distribution name.
position : float, position.
mean : float, mean, default=0.0 for standard distribution.
scale : float, scale, default=1.0 for standard distribution.
shape : float, shape.
skew : float, skew.
log : bool, if true apply log() to the result.
Returns: float, probability.
usage:
.select('StdNormal', __CYCLE4F__, log=true)
MovingAverages
Library "MovingAverages"
Contains utilities for generating moving average values including getting a moving average by name and a function for generating a Volume-Adjusted WMA.
sma(_D, _len) Simple Moving Average
Parameters:
_D : The series to measure from.
_len : The number of bars to measure with.
ema(_D, _len) Exponential Moving Average
Parameters:
_D : The series to measure from.
_len : The number of bars to measure with.
rma(_D, _len) RSI Moving Average
Parameters:
_D : The series to measure from.
_len : The number of bars to measure with.
wma(_D, _len) Weighted Moving Average
Parameters:
_D : The series to measure from.
_len : The number of bars to measure with.
vwma(_D, _len) Volume-Weighted Moving Average
Parameters:
_D : The series to measure from. Default is 'close'.
_len : The number of bars to measure with.
alma(_D, _len) Arnaud Legoux Moving Average
Parameters:
_D : The series to measure from. Default is 'close'.
_len : The number of bars to measure with.
cma(_D, _len, C, compound) Coefficient Moving Average (CMA) is a variation of a moving average that can simulate SMA or WMA with the advantage of previous data.
Parameters:
_D : The series to measure from. Default is 'close'.
_len : The number of bars to measure with.
C : The coefficient to use when averaging. 0 behaves like SMA, 1 behaves like WMA.
compound : When true (default is false) will use a compounding method for weighting the average.
dema(_D, _len) Double Exponential Moving Average
Parameters:
_D : The series to measure from. Default is 'close'.
_len : The number of bars to measure with.
zlsma(_D, _len) Zero-Lag Least Squares Moving Average
Parameters:
_D : The series to measure from. Default is 'close'.
_len : The number of bars to measure with.
zlema(_D, _len) Zero-Lag Exponential Moving Average
Parameters:
_D : The series to measure from. Default is 'close'.
_len : The number of bars to measure with.
get(type, len, src) Generates a moving average based upon a 'type'.
Parameters:
type : The type of moving average to generate. Values allowed are: SMA, EMA, WMA, VWMA and VAWMA.
len : The number of bars to measure with.
src : The series to measure from. Default is 'close'.
Returns: The moving average series requested.
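A hedged usage sketch of the get() function documented above; the import path is a placeholder for wherever the library is published, not its actual location:
```
//@version=5
indicator("MovingAverages usage sketch", overlay=true)

// Placeholder import path - replace with the library's actual publisher/version
import username/MovingAverages/1 as mavg

maValue = mavg.get("EMA", 21, close)
plot(maValue, "EMA(21) via library")
```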
CRYPTO DASHBOARD Gs₿
A simple Crypto Dashboard/Screener which indicates the price and percentage changes for a given period of time, i.e. 1 Hr, 4 Hrs, 1 Day, 3 Days, 3 Weeks and 3-12 Months. By default it displays #BTC and its dominance, and the current trading pair's price and % changes.
WhaleCrew Binance Open Interest
Shows Open Interest of ANY Binance pair (BTCUSD, ETHUSD, ADAUSD, ...).
Inverse and USDT pairs
Preset-Pairs (BTC, ETH, XRP, ADA, SOL, DOT, ...)
Custom Candle Colors (candles can be turned off)
eStrategy
Library "eStrategy"
Library contains methods which can help build a custom strategy for continuous investment plans and also compare it with systematic buy and hold.
sip(startYear, initialDeposit, depositFrequency, recurringDeposit, buyPrice) Depicts systematic buy and hold over period of time
Parameters:
startYear : Year on which SIP is started
initialDeposit : Initial one time investment at the start
depositFrequency : Frequency of recurring deposit - can be monthly or weekly
recurringDeposit : Recurring deposit amount
buyPrice : Indicative buy price. Use high to be conservative. low, close, open, hl2, hlc3, ohlc4, hlcc4 are other options.
Returns: totalInvestment - initial + recurring deposits
totalQty - Quantity of units held for given instrument
totalEquity - Present equity
customStrategy(startYear, initialDeposit, depositFrequency, recurringDeposit, buyPrice, sellPrice, initialInvestmentPercent, recurringInvestmentPercent, signal, tradePercent) Allows users to define custom strategy and enhance systematic buy and hold by adding take profit and reloads
Parameters:
startYear : Year on which SIP is started
initialDeposit : Initial one time investment at the start
depositFrequency : Frequency of recurring deposit - can be monthly or weekly
recurringDeposit : Recurring deposit amount
buyPrice : Indicative buy price. Use high to be conservative. low, close, open, hl2, hlc3, ohlc4, hlcc4 are other options.
sellPrice : Indicative sell price. Use low to be conservative. high, close, open, hl2, hlc3, ohlc4, hlcc4 are other options.
initialInvestmentPercent : percent of amount to invest from the initial depost. Keep rest of them as cash
recurringInvestmentPercent : percent of amount to invest from recurring deposit. Keep rest of them as cash
signal : can be 1, -1 or 0. 1 means buy/reload. -1 means take profit and 0 means neither.
tradePercent : percent of amount to trade when signal is not 0. If taking profit, it will sell the percent from existing position. If reloading, it will buy with percent from cash reserve
Returns: totalInvestment - initial + recurring deposits
totalQty - Quantity of units held for given instrument
totalCash - Amount of cash held
totalEquity - Overall equity = totalQty*close + totalCash
JohnEhlersFourierTransform
Library "JohnEhlersFourierTransform"
Fourier Transform for Traders By John Ehlers, slightly modified to allow to inspect other than the 8-50 frequency spectrum.
reference:
www.mesasoftware.com
high_pass_filter(source) Detrended version of the data by High Pass Filtering with a 40 Period cutoff
Parameters:
source : float, data source.
Returns: float.
transformed_dft(source, start_frequency, end_frequency) DFT by John Elhers.
Parameters:
source : float, data source.
start_frequency : int, lower bound of the frequency window, must be a positive number >= 0; the window must be less than or equal to 30.
end_frequency : int, upper bound of the frequency window, must be a positive number >= 0; the window must be less than or equal to 30.
Returns: tuple with float, float array.
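For context, the discrete Fourier transform that such a spectrum estimate is built on is the standard
X_k = \sum_{n=0}^{N-1} x_n\, e^{-i 2\pi k n / N}
Ehlers evaluates the spectrum over a window of candidate periods on the detrended series rather than over all N bins, and normalises the resulting power to decibels (as the db_to_rgb helper below suggests).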
db_to_rgb(db, transparency) converts the frequency decibels to rgb.
Parameters:
db : float, decibels value.
transparency : float, transparency value.
Returns: color.
FunctionCosineSimilarity
Library "FunctionCosineSimilarity"
Cosine Similarity method.
function(sample_a, sample_b) Measure the similarity of 2 vectors.
Parameters:
sample_a : float array, values.
sample_b : float array, values.
Returns: float.
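For reference, the measure returned is the standard cosine similarity of the two vectors:
\cos(\theta) = \frac{\sum_i a_i b_i}{\sqrt{\sum_i a_i^{2}}\;\sqrt{\sum_i b_i^{2}}}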
diss(cosim) Dissimilarity helper function.
Parameters:
cosim : float, cosine similarity value (0 to 1)
Returns: float
historicalrange
Library "historicalrange"
Library provides a method to calculate the historical percentile range of a series.
hpercentrank(source) calculates historical percentrank of the source
Parameters:
source : Source for which historical percentrank needs to be calculated. Source should range between 0-100. If using a source which can go beyond 0-100, use short-term percentrank to baseline it.
Returns: pArray - percentrank array which contains how many instances of source occurred at different levels.
upperPercentile - percentile based on higher value
lowerPercentile - percentile based on lower value
median - median value of the source
max - max value of the source
distancefromath(source) returns stats on historical distance from ath in terms of percentage
Parameters:
source : for which stats are calculated
Returns: percentile and related historical stats regarding distance from ath
distancefromma(maType, length, source) returns stats on historical distance from moving average in terms of percentage
Parameters:
maType : Moving Average Type : Can be sma, ema, hma, rma, wma, vwma, swma, highlow, linreg, median
length : Moving Average Length
source : for which stats are calculated
Returns: percentile and related historical stats regarding distance from the moving average
bpercentb(source, maType, length, multiplier, sticky) returns percentrank and stats on historical bpercentb levels
Parameters:
source : Moving Average Source
maType : Moving Average Type : Can be sma, ema, hma, rma, wma, vwma, swma, highlow, linreg, median
length : Moving Average Length
multiplier : Standard Deviation multiplier
sticky : - sticky boundaries which will only change when value is outside boundary.
Returns: percentile and related historical stats regarding Bollinger Percent B
kpercentk(source, maType, length, multiplier, useTrueRange, sticky) returns percentrank and stats on historical kpercentk levels
Parameters:
source : Moving Average Source
maType : Moving Average Type : Can be sma, ema, hma, rma, wma, vwma, swma, highlow, linreg, median
length : Moving Average Length
multiplier : Standard Deviation multiplier
useTrueRange : - if set to false, uses high-low.
sticky : - sticky boundaries which will only change when value is outside boundary.
Returns: percentile and related historical stats regarding Keltener Percent K
dpercentd(useAlternateSource, alternateSource, length, sticky) returns percentrank and stats on historical dpercentd levels
Parameters:
useAlternateSource : - Custom source is used only if useAlternateSource is set to true
alternateSource : - Custom source
length : - donchian channel length
sticky : - sticky boundaries which will only change when value is outside boundary.
Returns: percentile and related historical stats regarding Donchian Percent D
oscillator(type, length, shortLength, longLength, source, highSource, lowSource, method, highlowLength, sticky) oscillator - returns Choice of oscillator with custom overbought/oversold range
Parameters:
type : - oscillator type. Valid values : cci, cmo, cog, mfi, roc, rsi, stoch, tsi, wpr
length : - Oscillator length - not used for TSI
shortLength : - shortLength only used for TSI
longLength : - longLength only used for TSI
source : - custom source if required
highSource : - custom high source for stochastic oscillator
lowSource : - custom low source for stochastic oscillator
method : - Valid values for method are : sma, ema, hma, rma, wma, vwma, swma, highlow, linreg, median
highlowLength : - length on which highlow of the oscillator is calculated
sticky : - overbought, oversold levels won't change unless crossed
Returns: percentile and related historical stats regarding oscillator
WIPNNetwork
Library "WIPNNetwork"
This is a work in progress (WIP) and prone to have some errors, so use at your own risk...
Let me know if you find any issues.
Method for a generalized Neural Network.
network(x) Generalized Neural Network Method.
Parameters:
x : TODO: add parameter x description here
Returns: TODO: add what function returns
FunctionPatternDecomposition
Library "FunctionPatternDecomposition"
Methods for decomposing price into common grid/matrix patterns.
series_to_array(source, length) Helper for converting series to array.
Parameters:
source : float, data series.
length : int, size.
Returns: float array.
smooth_data_2d(data, rate) Smooth data sample into 2d points.
Parameters:
data : float array, source data.
rate : float, default=0.25, the rate of smoothness to apply.
Returns: tuple with 2 float arrays.
thin_points(data_x, data_y, rate) Thin the number of points.
Parameters:
data_x : float array, points x value.
data_y : float array, points y value.
rate : float, default=2.0, minimum threshold rate of sample stdev to accept points.
Returns: tuple with 2 float arrays.
extract_point_direction(data_x, data_y) Extract the direction each point faces.
Parameters:
data_x : float array, points x value.
data_y : float array, points y value.
Returns: float array.
find_corners(data_x, data_y, rate) ...
Parameters:
data_x : float array, points x value.
data_y : float array, points y value.
rate : float, minimum threshold rate of data y stdev.
Returns: tuple with 2 float arrays.
grid_coordinates(data_x, data_y, m_size) transforms points data to a constrained sized matrix format.
Parameters:
data_x : float array, points x value.
data_y : float array, points y value.
m_size : int, default=10, size of the matrix.
Returns: flat 2d pseudo matrix.
statistics
Library "statistics"
General statistics library.
erf(x) The "error function" encountered in integrating the normal
distribution (which is a normalized form of the Gaussian function).
Parameters:
x : The input series.
Returns: The Error Function evaluated for each element of x.
erfc(x)
Parameters:
x : The input series
Returns: The Complementary Error Function evaluated for each element of x.
sumOfReciprocals(src, len) Calculates the sum of the reciprocals of the series.
For each element 'elem' in the series:
sum += 1/elem
Should the element be 0, the reciprocal value of 0 is used instead
of NA.
Parameters:
src : The input series.
len : The length for the sum.
Returns: The sum of the reciprocals of 'src' for 'len' bars back.
mean(src, len) The mean of the series.
(wrapper around ta.sma).
Parameters:
src : The input series.
len : The length for the mean.
Returns: The mean of 'src' for 'len' bars back.
average(src, len) The mean of the series.
(wrapper around ta.sma).
Parameters:
src : The input series.
len : The length for the average.
Returns: The average of 'src' for 'len' bars back.
geometricMean(src, len) The Geometric Mean of the series.
The geometric mean is most important when using data representing
percentages, ratios, or rates of change. It cannot be used for
negative numbers
Since the pure mathematical implementation generates a very large
intermediate result, we performed the calculation in log space.
Parameters:
src : The input series.
len : The length for the geometricMean.
Returns: The geometric mean of 'src' for 'len' bars back.
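For reference, the log-space form it uses is:
\mathrm{GM} = \exp\!\left(\frac{1}{n}\sum_{i=1}^{n} \ln x_i\right)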
harmonicMean(src, len) The Harmonic Mean of the series.
The harmonic mean is most applicable to time changes and, along
with the geometric mean, has been used in economics for price
analysis. It is more difficult to calculate; therefore, it is less
popular than either of the other averages.
0 values are ignored in the calculation.
Parameters:
src : The input series.
len : The length for the harmonicMean.
Returns: The harmonic mean of 'src' for 'len' bars back.
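For reference, the harmonic mean (with zero values ignored, as noted) is:
\mathrm{HM} = \frac{n}{\sum_{i=1}^{n} \frac{1}{x_i}}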
median(src, len) The median of the series.
(a wrapper around ta.median)
Parameters:
src : The input series.
len : The length for the median.
Returns: The median of 'src' for 'len' bars back.
variance(src, len, biased) The variance of the series.
Parameters:
src : The input series.
len : The length for the variance.
biased : Whether to use the biased calculation (for a population), or the
unbiased calculation (for a sample set).
Returns: The variance of 'src' for 'len' bars back.
stdev(src, len, biased) The standard deviation of the series.
Parameters:
src : The input series.
len : The length for the stdev.
biased : Whether to use the biased calculation (for a population), or the
unbiased calculation (for a sample set).
Returns: The standard deviation of 'src' for 'len' bars back.
skewness(src, len) The skew of the series.
Skewness measures the amount of distortion from a symmetric
distribution, making the curve appear to be short on the left
(lower prices) and extended to the right (higher prices). The
extended side, either left or right is called the tail, and a
longer tail to the right is called positive skewness. Negative
skewness has the tail extending towards the left.
Parameters:
src : The input series.
len : The length for the skewness.
Returns: The skewness of 'src' for 'len' bars back.
kurtosis(src, len) The kurtosis of the series.
Kurtosis describes the peakedness or flatness of a distribution.
This can be used as an unbiased assessment of whether prices are
trending or moving sideways. Trending prices will cover a wider
range and thus a flatter distribution (kurtosis < 3; negative
kurtosis). If prices are range-bound, there will be a clustering
around the mean and we have positive kurtosis (kurtosis > 3)
Parameters:
src : The input series.
len : The length for the kurtosis.
Returns: The kurtosis of 'src' for 'len' bars back.
excessKurtosis(src, len) The normalized kurtosis of the series.
kurtosis > 0 --> positive kurtosis --> trending
kurtosis < 0 --> negative krutosis --> range-bound
Parameters:
src : The input series.
len : The length for the excessKurtosis.
Returns: The excessKurtosis of 'src' for 'len' bars back.
normDist(src, len, value) Calculates the probability mass for the value according to the
src and length. It calculates the probability for value to be
present in the normal distribution calculated for src and length.
Parameters:
src : The input series.
len : The length for the normDist.
value : The series of values to calculate the normal distance for
Returns: The normal distance of 'value' to 'src' for 'len' bars back.
normDistCumulative(src, len, value) Calculates the cumulative probability mass for the value according
to the src and length. It calculates the cumulative probability for
value to be present in the normal distribution calculated for src
and length.
Parameters:
src : The input series.
len : The length for the normDistCumulative.
value : The series of values to calculate the cumulative normal distance
for
Returns: The cumulative normal distance of 'value' to 'src' for 'len' bars
back.
zScore(src, len, value) Returns the z-score of objective to the series src.
It returns the number of stdev's the objective is away from the
mean(src, len)
Parameters:
src : The input series.
len : The length for the zScore.
value : The series of values to calculate the z-score for
Returns: The z-score of the objective with respect to src and len.
er(src, len) Calculates the efficiency ratio of the series.
It measures the noise of the series. The lower the number, the
higher the noise.
Parameters:
src : The input series.
len : The length for the efficiency ratio.
Returns: The efficiency ratio of 'src' for 'len' bars back.
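For reference, the (Kaufman) efficiency ratio is net movement divided by total movement over the lookback:
\mathrm{ER} = \frac{\lvert x_t - x_{t-n} \rvert}{\sum_{i=0}^{n-1} \lvert x_{t-i} - x_{t-i-1} \rvert}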
efficiencyRatio(src, len) Calculates the efficiency ratio of the series.
It measures the noise of the series. The lower the number, the
higher the noise.
Parameters:
src : The input series.
len : The length for the efficiency ratio.
Returns: The efficiency ratio of 'src' for 'len' bars back.
fractalEfficiency(src, len) Calculates the efficiency ratio of the series.
It measures the noise of the series. The lower the number, the
higher the noise.
Parameters:
src : The input series.
len : The length for the efficiency ratio.
Returns: The efficiency ratio of 'src' for 'len' bars back.
mse(src, len) Calculates the Mean Squared Error of the series.
Parameters:
src : The input series.
len : The length for the mean squared error.
Returns: The mean squared error of 'src' for 'len' bars back.
meanSquaredError(src, len) Calculates the Mean Squared Error of the series.
Parameters:
src : The input series.
len : The length for the mean squared error.
Returns: The mean squared error of 'src' for 'len' bars back.
rmse(src, len) Calculates the Root Mean Squared Error of the series.
Parameters:
src : The input series.
len : The length for the root mean squared error.
Returns: The root mean squared error of 'src' for 'len' bars back.
rootMeanSquaredError(src, len) Calculates the Root Mean Squared Error of the series.
Parameters:
src : The input series.
len : The length for the root mean squared error.
Returns: The root mean squared error of 'src' for 'len' bars back.
mae(src, len) Calculates the Mean Absolute Error of the series.
Parameters:
src : The input series.
len : The length for the mean absolute error.
Returns: The mean absolute error of 'src' for 'len' bars back.
meanAbsoluteError(src, len) Calculates the Mean Absolute Error of the series.
Parameters:
src : The input series.
len : The length for the mean absolute error.
Returns: The mean absolute error of 'src' for 'len' bars back.