Commit b159bfa

fixed example in readme (#21)
* fixed example in readme
* fixed hermitian example
* using floats everywhere
* removed extra spaces
* removed float wrapper
1 parent 796b531 commit b159bfa

README.md

+63 -63 lines changed
@@ -13,49 +13,49 @@ For example, starting with a 6 by 6 matrix whose elements are numbered 1 to 36 i
```julia
julia> using LinearAlgebra, RectangularFullPacked

julia> A = reshape(1.:36., (6, 6))
6×6 reshape(::StepRangeLen{Float64, Base.TwicePrecision{Float64}, Base.TwicePrecision{Float64}, Int64}, 6, 6) with eltype Float64:
 1.0   7.0  13.0  19.0  25.0  31.0
 2.0   8.0  14.0  20.0  26.0  32.0
 3.0   9.0  15.0  21.0  27.0  33.0
 4.0  10.0  16.0  22.0  28.0  34.0
 5.0  11.0  17.0  23.0  29.0  35.0
 6.0  12.0  18.0  24.0  30.0  36.0
```
the lower triangular matrix `AL` is constructed by replacing the elements above the diagonal with zero.
```julia
julia> AL = tril!(collect(A))
6×6 Matrix{Float64}:
 1.0   0.0   0.0   0.0   0.0   0.0
 2.0   8.0   0.0   0.0   0.0   0.0
 3.0   9.0  15.0   0.0   0.0   0.0
 4.0  10.0  16.0  22.0   0.0   0.0
 5.0  11.0  17.0  23.0  29.0   0.0
 6.0  12.0  18.0  24.0  30.0  36.0
```
`AL` requires the same amount of storage as does `A` even though there are only 21 potential non-zeros in `AL`.
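
Those counts are easy to confirm at the REPL; this quick check is only illustrative and uses nothing beyond Base functions:

```julia
julia> count(!iszero, AL)   # stored non-zeros, here exactly the 6 * 7 / 2 = 21 lower-triangle entries
21

julia> length(AL)           # elements the dense matrix must hold
36
```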
The RFP version of the lower triangular matrix
```julia
julia> ArfpL = TriangularRFP(A, :L)
6×6 TriangularRFP{Float64}:
 1.0   0.0   0.0   0.0   0.0   0.0
 2.0   8.0   0.0   0.0   0.0   0.0
 3.0   9.0  15.0   0.0   0.0   0.0
 4.0  10.0  16.0  22.0   0.0   0.0
 5.0  11.0  17.0  23.0  29.0   0.0
 6.0  12.0  18.0  24.0  30.0  36.0
```
provides the same displayed form, but the underlying "parent" array is 7 by 3
```julia
julia> ALparent = ArfpL.data
7×3 Matrix{Float64}:
 22.0  23.0  24.0
  1.0  29.0  30.0
  2.0   8.0  36.0
  3.0   9.0  15.0
  4.0  10.0  16.0
  5.0  11.0  17.0
  6.0  12.0  18.0
```

The three blocks of `AL` are the lower triangle of `AL[1:3, 1:3]`, stored as the lower triangle of `ALparent[2:4, :]`; the square block `AL[4:6, 1:3]` stored in `ALparent[5:7, :]`; and the lower triangle of `AL[4:6, 4:6]` stored transposed in `ALparent[1:3, :]`.
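
Assuming `AL` and `ALparent` are still bound as above, that layout can be verified directly; the comparisons below are illustrative checks using only `LinearAlgebra` functions:

```julia
julia> AL[1:3, 1:3] == tril(ALparent[2:4, :])              # (1,1) block: lower triangle
true

julia> AL[4:6, 1:3] == ALparent[5:7, :]                    # (2,1) block: stored as-is
true

julia> AL[4:6, 4:6] == transpose(triu(ALparent[1:3, :]))   # (2,2) block: stored transposed
true
```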
@@ -64,50 +64,50 @@ For odd values of n, the parent is of size `(n, div(n + 1, 2))` and the non-zero
For example,
```julia
julia> AL = tril!(collect(reshape(1.:25., 5, 5)))
5×5 Matrix{Float64}:
 1.0   0.0   0.0   0.0   0.0
 2.0   7.0   0.0   0.0   0.0
 3.0   8.0  13.0   0.0   0.0
 4.0   9.0  14.0  19.0   0.0
 5.0  10.0  15.0  20.0  25.0

julia> ArfpL = TriangularRFP(AL, :L).data
5×3 Matrix{Float64}:
 1.0  19.0  20.0
 2.0   7.0  25.0
 3.0   8.0  13.0
 4.0   9.0  14.0
 5.0  10.0  15.0
```
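
As an illustrative check of the parent-array shapes (this snippet assumes the standard LAPACK RFP layout, in which even `n` gives an `(n + 1, n ÷ 2)` parent, as in the 6 by 6 example above):

```julia
julia> [size(TriangularRFP(zeros(n, n), :L).data) for n in (5, 6, 7, 8)]
4-element Vector{Tuple{Int64, Int64}}:
 (5, 3)
 (7, 3)
 (7, 4)
 (9, 4)
```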

RFP storage is especially useful for large positive definite Hermitian matrices because the Cholesky factor can be evaluated nearly as quickly (by applying Level-3 BLAS to the blocks) as in full storage mode while requiring only about half the storage.

A trivial example is
```julia
julia> A = [2. 1 2; 1 2 0; 1 0 2]
3×3 Matrix{Float64}:
 2.0  1.0  2.0
 1.0  2.0  0.0
 1.0  0.0  2.0

julia> cholesky(Hermitian(A, :L))
Cholesky{Float64, Matrix{Float64}}
L factor:
3×3 LowerTriangular{Float64, Matrix{Float64}}:
 1.41421
 0.707107   1.22474
 0.707107  -0.408248  1.1547

julia> ArfpL = Hermitian(TriangularRFP(A, :L))
3×3 HermitianRFP{Float64}:
 2.0  1.0  1.0
 1.0  2.0  0.0
 1.0  0.0  2.0

julia> cholesky!(ArfpL).data
3×2 Matrix{Float64}:
 1.41421    1.1547
 0.707107   1.22474
 0.707107  -0.408248
```
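
To make the "about half the storage" point concrete, here is a rough sketch; the size `n = 1_000`, the `randn` test matrix, and the `n * I` shift are illustrative choices, not taken from the README:

```julia
julia> using LinearAlgebra, RectangularFullPacked

julia> n = 1_000; B = randn(n, n); Apd = B'B + n * I;   # a (numerically) positive definite test matrix

julia> ArfpL = TriangularRFP(Apd, :L);                  # RFP copy of the lower triangle

julia> sizeof(ArfpL.data) / sizeof(Apd)                 # the RFP parent stores about half as many elements
0.5005

julia> cholesky!(Hermitian(ArfpL));                     # in-place Cholesky on the packed form
```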
