SumSpaces.jl

A Julia package for sum spaces.

This package contains ongoing research on sum spaces of extended and weighted orthogonal polynomials.

Sum spaces

We can construct the primal and dual sum spaces. The former consists of extended Chebyshev functions of the first kind and weighted Chebyshev polynomials of the second kind. The latter consists of extended Chebyshev functions of the second kind and weighted Chebyshev polynomials of the first kind.

julia> using SumSpaces
julia> Sp = SumSpaceP() # Primal sum space
SumSpaceP{Float64, Vector{Float64}}

julia> Sp[0.1,1:5] # first 5 functions
5-element Vector{Float64}:
  1.0
  0.99498743710662
  0.1
  0.198997487421324
 -0.98
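
The dual sum space is constructed in the same way (the printed type below is a sketch, by analogy with the primal case):

julia> Sd = SumSpaceD() # Dual sum space
SumSpaceD{Float64, Vector{Float64}}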

Note that, apart from the first function, the sum space functions are grouped in pairs following the pattern |wU_k, eT_{k+1}|. The pairs are accessible as blocks, since the columns of Sp are blocked à la BlockArrays.jl:

julia> Sp[0.1, Block.(1:3)]
3-blocked 5-element BlockArrays.PseudoBlockVector{Float64, Vector{Float64}, Tuple{BlockArrays.BlockedUnitRange{StepRange{Int64, Int64}}}}:
  1.0
 ──────────────────
  0.99498743710662
  0.1
 ──────────────────
  0.198997487421324
 -0.98
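
A single pair can also be extracted with a scalar block index (the exact return type below is a sketch and may differ):

julia> Sp[0.1, Block(2)] # the pair (wU₀, eT₁) at x = 0.1
2-element Vector{Float64}:
 0.99498743710662
 0.1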

Operators

Quasimatrix operators can be constructed via the same API as ClassicalOrthogonalPolynomials.jl. For example, consider the ℵ₀×ℵ₀ matrix E that maps the primal sum space to the dual sum space, in the sense that Sp = Sd*E:

julia> Sp = SumSpaceP(); Sd = SumSpaceD();
julia> Sd \ Sp
ℵ₀×ℵ₀-blocked ℵ₀×ℵ₀ BlockBandedMatrices.BandedBlockBandedMatrix{Float64, …}:
 -1.0                   …
 ──────┼────────────────┼────
  0.0    0.5            …
  0.0    0.0   -0.5     …
 ──────┼────────────────┼────
  ⋮                       ⋱
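
As a rough sanity check one can compare finite truncations of both sides of Sp = Sd*E at a point (a sketch; the extra rows of E are taken to cover its lower bands):

julia> E = Sd \ Sp;
julia> Sp[0.1, 1:5] ≈ vec(Sd[0.1, 1:11]' * E[1:11, 1:5]) # should hold up to rounding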

Expansion

Finding the expansion of a function in the sum space is nontrivial. We utilize the framework of frames: we first construct a frame matrix at a set of collocation points and then use a custom SVD solver to find the expansion coefficients.

julia> Sp = SumSpaceP(); N = 2; # Truncation degree
julia> M = 5001;  # Number of collocation points in [-1,1]
julia> Me = M ÷ 10;  # Number of collocation points in [-5,-1) and (1,5].
julia> x = collocation_points(M, Me, endpoints=[-5.,5]); # Collocation points
julia> A = framematrix(x, Sp, N); # Blocked frame matrix
julia> solvesvd(A, riemann(x, x->ExtendedChebyshevT()[x,2]))
7-element Vector{Float64}:
  1.1192393901633387e-16
 -4.3388889260090994e-16
  0.9999999999999982
 -8.881784197001306e-16
 -9.170833280340526e-16
  4.374428354941292e-16
 -4.718447854656949e-16
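
The coefficients pick out the third basis function, eT₁, as expected, since the expanded function is the extended Chebyshev polynomial of degree one. A rough follow-up check (a sketch) evaluates the truncated expansion at a point inside [-1,1]:

julia> c = solvesvd(A, riemann(x, x -> ExtendedChebyshevT()[x, 2]));
julia> Sp[0.1, 1:7]' * c ≈ ExtendedChebyshevT()[0.1, 2] # ≈ 0.1, since eT₁ agrees with T₁ on (-1,1)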
