Commit 7ab3e74

refactor(cli): migrate from argparse to Typer
- Replace argparse with the modern Typer CLI framework
- Add Rich library integration for better table formatting
- Improve command organization with sub-apps
- Add proper type hints and parameter validation
- Enhance error handling with Typer's built-in system
- Improve help text and documentation
- Simplify code structure using decorators

This change modernizes the CLI interface while maintaining all existing functionality, making it more maintainable and user-friendly.
1 parent 4d88454 commit 7ab3e74

File tree

7 files changed (+633, -614 lines)


README-tests.md

Lines changed: 190 additions & 0 deletions
@@ -0,0 +1,190 @@
# Testing Best Practices for the Infactory SDK

Testing a client SDK like Infactory requires a multi-layered approach to ensure that both the Python library and the CLI work correctly. Here's a comprehensive testing strategy:

## 1. Unit Tests

### Python SDK Unit Tests

Unit tests should validate the individual components of your SDK without requiring actual API calls:

- **Test models**: Ensure model serialization/deserialization works correctly
- **Test service classes**: Verify that API calls are constructed properly
- **Test client initialization**: Check that configuration loading works as expected
- **Test error handling**: Validate that API errors are caught and transformed properly

Use mock responses with a library like `unittest.mock` or `pytest-mock`:
```python
from infactory_client.client import Client

def test_projects_list(mocker):
    # Mock the HTTP response
    mock_response = [{"id": "proj-123", "name": "Test Project", "team_id": "team-456"}]
    mock_get = mocker.patch("infactory_client.client.Client._get", return_value=mock_response)

    # Create client and call the method
    client = Client(api_key="test_key")
    projects = client.projects.list(team_id="team-456")

    # Assertions
    mock_get.assert_called_once_with("v1/projects", {"team_id": "team-456"})
    assert len(projects) == 1
    assert projects[0].id == "proj-123"
    assert projects[0].name == "Test Project"
```

### CLI Unit Tests

For the CLI, test that commands properly parse arguments and call the appropriate SDK methods:

```python
from unittest.mock import MagicMock

from infactory_cli import handle_projects_list

def test_projects_list_command(mocker):
    # Mock the projects.list method. Note that `name` must be set after
    # construction: Mock() treats a `name` keyword argument specially,
    # so MagicMock(name="Test Project") would not set the attribute.
    mock_project = MagicMock(id="proj-123")
    mock_project.name = "Test Project"
    mock_client = MagicMock()
    mock_client.projects.list.return_value = [mock_project]
    mocker.patch("infactory_cli.get_client", return_value=mock_client)

    # Call the CLI command handler
    args = MagicMock(team_id="team-456")
    handle_projects_list(args)

    # Assertions
    mock_client.projects.list.assert_called_once_with(team_id="team-456")
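Since this commit moves the CLI to Typer, commands can also be exercised in-process with Typer's `CliRunner` instead of hand-built `args` objects. The sketch below is illustrative: the `app` sub-app, the `list` command signature, and the `get_client` indirection are assumptions standing in for the real CLI module, not the exact objects from this commit.

```python
from types import SimpleNamespace
from unittest.mock import MagicMock

import typer
from typer.testing import CliRunner

# Hypothetical sub-app mirroring `nf projects list`
app = typer.Typer()

def get_client():
    # Stand-in for the real client factory; tests swap this out
    raise NotImplementedError

@app.callback()
def main() -> None:
    """Project commands (the callback keeps `list` as a named subcommand)."""

@app.command("list")
def list_projects(team_id: str = typer.Option(..., "--team-id")) -> None:
    for project in get_client().projects.list(team_id=team_id):
        typer.echo(f"{project.id}  {project.name}")

def test_projects_list_command():
    global get_client
    mock_client = MagicMock()
    mock_client.projects.list.return_value = [
        SimpleNamespace(id="proj-123", name="Test Project")
    ]
    get_client = lambda: mock_client  # inject the mock client

    result = CliRunner().invoke(app, ["list", "--team-id", "team-456"])

    assert result.exit_code == 0
    assert "proj-123" in result.output
    mock_client.projects.list.assert_called_once_with(team_id="team-456")
```

Running commands through `CliRunner` exercises Typer's argument parsing and exit-code handling as well, which a direct call to the handler function would skip.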

## 2. Integration Tests

Integration tests validate that your SDK can properly interact with the API:

### Approach 1: Mock Server

Set up a mock server that mimics the Infactory API responses:

- Use tools like `responses`, `requests-mock`, or `pytest-httpx` to intercept HTTP requests
- Create fixtures with realistic API responses
- Test complete workflows through multiple API calls

```python
from infactory_client.client import Client

def test_create_and_publish_query_program(requests_mock):
    # Mock API responses
    requests_mock.post(
        "https://api.infactory.ai/v1/queryprograms",
        json={"id": "qp-123", "name": "Test Query"},
    )
    requests_mock.patch(
        "https://api.infactory.ai/v1/queryprograms/qp-123/publish",
        json={"id": "qp-123", "published": True},
    )

    # Execute the workflow
    client = Client(api_key="test_key")
    query = client.query_programs.create(name="Test Query", dataline_id="dl-456", code="test code")
    published = client.query_programs.publish(query.id)

    # Assertions
    assert published.id == "qp-123"
    assert published.published is True
```

### Approach 2: VCR-style Tests

Record actual API responses and replay them in tests:

- Use `vcr.py` or `betamax` to record and replay HTTP interactions
- Run tests against the actual API once, then replay the recordings on subsequent runs
- Provides realistic responses without hitting the API repeatedly

```python
import vcr

from infactory_client.client import Client

@vcr.use_cassette("fixtures/vcr_cassettes/project_list.yaml")
def test_list_projects():
    client = Client(api_key="test_key")
    projects = client.projects.list(team_id="team-456")

    assert len(projects) > 0
    assert projects[0].id is not None
```

## 3. End-to-End (E2E) Tests

E2E tests validate complete user workflows against the actual API:

### Approach 1: Test Account

- Create a dedicated test account in the Infactory platform
- Run automated tests against this account with real API calls
- Test full workflows from start to finish

```python
import os

from infactory_client.client import Client

def test_e2e_datasource_workflow():
    # Use a test API key from an environment variable
    client = Client(api_key=os.environ.get("NF_TEST_API_KEY"))

    # Create a project
    project = client.projects.create(name="Test Project", team_id=os.environ.get("NF_TEST_TEAM_ID"))

    # Create a datasource
    datasource = client.datasources.create(name="Test DB", project_id=project.id, type="postgres")

    # List datasources
    datasources = client.datasources.list(project_id=project.id)

    # Assertions
    assert any(ds.id == datasource.id for ds in datasources)

    # Clean up
    client.datasources.delete(datasource.id)
    client.projects.delete(project.id)
```

### Approach 2: CLI E2E Tests

Test the CLI commands against the actual API:

```python
import os
import subprocess

def test_cli_e2e():
    # Run CLI commands using subprocess
    result = subprocess.run(
        ["nf", "login", "--key", os.environ.get("NF_TEST_API_KEY")],
        capture_output=True,
        text=True,
    )
    assert "API key saved successfully" in result.stdout

    result = subprocess.run(
        ["nf", "projects", "list", "--team-id", os.environ.get("NF_TEST_TEAM_ID")],
        capture_output=True,
        text=True,
    )
    assert "ID" in result.stdout
```

## 4. Test Environment Setup

For comprehensive testing, set up:

1. **CI/CD pipeline integration**:
   - Run unit tests on every commit
   - Run integration tests on PRs
   - Run E2E tests on release branches

2. **Test fixtures**:
   - Create reusable test data
   - Set up the environment for realistic workflows
   - Implement automatic cleanup after tests

3. **Testing matrix**:
   - Test across Python versions (3.8, 3.9, 3.10, 3.11, 3.12)
   - Test on different operating systems (Windows, macOS, Linux)

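The automatic-cleanup point above can be sketched with a context manager; the same shape works as a pytest fixture by putting the `yield` inside `try`/`finally`. The client here is a `MagicMock` stand-in, and the method names follow the SDK examples earlier in this document rather than a verified API surface.

```python
import contextlib
from types import SimpleNamespace
from unittest.mock import MagicMock

@contextlib.contextmanager
def temporary_project(client, team_id):
    """Create a project for a test and always delete it afterwards."""
    project = client.projects.create(name="test-temp-project", team_id=team_id)
    try:
        yield project
    finally:
        # Runs even if the test body raises, so no resources leak
        client.projects.delete(project.id)

# Demonstrate the pattern with a mocked client
client = MagicMock()
client.projects.create.return_value = SimpleNamespace(id="proj-tmp")

with temporary_project(client, team_id="team-456") as project:
    assert project.id == "proj-tmp"

# Cleanup happened exactly once, on exit from the `with` block
client.projects.delete.assert_called_once_with("proj-tmp")
```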
## 5. Testing Recommendations

### When to Mock vs. Use Live Endpoints

- **Unit tests**: Always use mocks
- **Integration tests**: Use recorded responses or a mock server
- **E2E tests**: Use a dedicated test account with live endpoints

### Best Practices

1. **Use a dedicated test account**: Don't use production credentials
2. **Clean up test resources**: Delete any created resources after tests
3. **Use fixture data**: Prepare test data for reproducible results
4. **Make tests independent**: Each test should be able to run on its own
5. **Use realistic data**: Test with data that resembles real-world usage
6. **Test edge cases**: Error handling, rate limiting, authentication failures
7. **Test CLI workflows**: Validate common command patterns
8. **Focus on main workflows**: Prioritize testing the most common user flows

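For item 6 in the list above, an edge-case test can assert on the error the SDK surfaces. `ApiError` and its `status_code` field are assumptions here, standing in for whatever exception type the SDK actually raises:

```python
from unittest.mock import MagicMock

import pytest

class ApiError(Exception):
    """Stand-in for the SDK's real API error type (hypothetical)."""
    def __init__(self, status_code, message):
        super().__init__(message)
        self.status_code = status_code

def test_invalid_api_key_raises():
    client = MagicMock()
    # Simulate the API rejecting the credentials
    client.projects.list.side_effect = ApiError(401, "invalid API key")

    with pytest.raises(ApiError) as exc_info:
        client.projects.list(team_id="team-456")

    assert exc_info.value.status_code == 401
```

The same `side_effect` pattern covers rate-limit (429) and server-error (5xx) cases without any live traffic.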
By implementing this testing strategy, you'll build confidence in your Infactory SDK and ensure a quality experience for your users across both the Python library and CLI interfaces.

README.md

Lines changed: 3 additions & 3 deletions

@@ -95,7 +95,7 @@ Results:
 $ nf query publish qp-789ghi
 Publishing query program qp-789ghi...
 Query program published successfully!
-Endpoint URL: https://i7y.dev/v1/live/monthly-sales/v1/data
+Endpoint URL: https://api.infactory.ai/v1/live/monthly-sales/v1/data
 ```

### 7. Display the available endpoints

@@ -117,7 +117,7 @@ $ nf endpoints list --project-id my-project-id
 $ nf endpoints curl-example ep-123abc
 CURL example for endpoint ep-123abc:

-curl -X GET "https://i7y.dev/v1/live/monthly-sales/v1/data" \
+curl -X GET "https://api.infactory.ai/v1/live/monthly-sales/v1/data" \
 -H "Authorization: Bearer YOUR_API_KEY" \
 -H "Content-Type: application/json"
 ```

@@ -250,7 +250,7 @@ endpoints = client.apis.get_endpoints(api.id)

 for ep in endpoints:
     print(f"Endpoint: {ep.name}")
-    print(f"URL: https://i7y.dev/v1/live/{api.slug}/{api.version}/{ep.path}")
+    print(f"URL: https://api.infactory.ai/v1/live/{api.slug}/{api.version}/{ep.path}")
     print(f"Method: {ep.http_method}")
     print(f"Description: {ep.description}")
     print("-" * 50)