Bug Fixes and Add Jupyter book as doc #21

Merged
merged 5 commits on Oct 20, 2023
5 changes: 4 additions & 1 deletion .github/release.yml
@@ -4,7 +4,10 @@ changelog:
categories:
- title: New Features
labels:
- New-Feature
- New Feature
- title: Bug Fixes
labels:
- Bug Fix
- title: Other Changes
labels:
- "*"
1 change: 1 addition & 0 deletions .gitignore
@@ -25,6 +25,7 @@ share/python-wheels/
.installed.cfg
*.egg
MANIFEST
.vscode/

# PyInstaller
# Usually these files are written by a python script from a template
15 changes: 10 additions & 5 deletions README.md
@@ -1,4 +1,4 @@
# CP2KDATA
# CP2KData


[![Python package](https://github.com/robinzyb/cp2kdata/actions/workflows/ci.yml/badge.svg)](https://github.com/robinzyb/cp2kdata/actions/workflows/ci.yml)[![Coverage Status](https://coveralls.io/repos/github/robinzyb/cp2kdata/badge.svg)](https://coveralls.io/github/robinzyb/cp2kdata)
@@ -10,7 +10,7 @@ Python Package to postprocess cp2k data.

including cube file, pdos file, output file

- [CP2KDATA](#cp2kdata)
- [CP2KData](#cp2kdata)
- [Installation](#installation)
- [Generate Standard Test Inputs](#generate-standard-test-inputs)
- [Plot Standard Test Output](#plot-standard-test-output)
@@ -53,11 +53,16 @@ cp2kdata gen hubbardu <template input> <a list of other neccessary files> -ur u
cp2kdata gen hubbardu input.inp coord.xyz cp2k.lsf -ur 0 8.1 1 -e Fe -orb d
```
# Plot Standard Test Output
After you finished the above tests, you readily plot the result using command `cp2kdata plot cutoff`, `cp2kdata plot basis`, `cp2kdata plot hubbardu`
Once you have completed the tests mentioned above, you can easily generate plots of the results using the following commands:

- To plot the cutoff data, use: `cp2kdata plot cutoff`
- To plot the basis data, use: `cp2kdata plot basis`
- To plot the Hubbard U data, use: `cp2kdata plot hubbardu`


# Processing Other Files
[Processing CP2K Cube Files](./docs/cube/README.md)
[Processing CP2K Pdos Files](./docs/pdos/README.md)
- [Processing CP2K Cube Files](./docs/cube/README.md)
- [Processing CP2K Pdos Files](./docs/pdos/README.md)

# Processing Output Files

39 changes: 26 additions & 13 deletions cp2kdata/cube/cube.py
@@ -18,7 +18,7 @@ def square_wave_filter(x, l, cell_z):
return y

# parse cp2kcube
class Cp2kCube:
class Cp2kCubeOld:
"""
timestep: unit ps
"""
@@ -137,7 +137,7 @@ def quick_plot(self, axis="z", interpolate=False, output_dir="./"):
plt.savefig(os.path.join(output_dir, "pav.png"), dpi=100)


class Cp2kCubeNew(MSONable):
class Cp2kCube(MSONable):
# remove useless timestep argument
# add MSONable use as_dict and from_dict
# add copy method
@@ -149,6 +149,13 @@ class Cp2kCubeNew(MSONable):


def __init__(self, file=None, cube_vals=None, grid_size=None, grid_space=None):
print("Warning: This is New Cp2kCube Class, if you want to use old Cp2kCube")
print("try, from cp2kdata.cube.cube import Cp2kCube")
print("New Cp2kCube return raw values in cp2k cube file")
print("that is, length in bohr and energy in hartree for potential file")
print("that is, length in bohr and density in e/bohr^3 for density file")
print("to convert unit: try from cp2kdata.utils import au2A, au2eV")

self.file = file
self.cube_vals = self.read_cube_vals()
self.cell_x = self.grid_size[0]*self.grid_space[0]
@@ -176,9 +183,9 @@ def grid_space(self):
# read grid point and grid size, unit: angstrom
content_list = file_content(self.file, (3,6))
content_list = content_list.split()
step_x = float(content_list[1])*au2A
step_y = float(content_list[6])*au2A
step_z = float(content_list[11])*au2A
step_x = float(content_list[1])
step_y = float(content_list[6])
step_z = float(content_list[11])
return step_x, step_y, step_z

def as_dict(self):
@@ -196,7 +203,7 @@ def as_dict(self):
def __add__(self, others):
"""magic method for adding two Cp2kCube instances"""
self_copy = self.copy()
if isinstance(others, Cp2kCubeNew):
if isinstance(others, Cp2kCube):
other_copy = others.copy()
self_copy.cube_vals += other_copy.cube_vals
else:
@@ -206,7 +213,7 @@ def __add__(self, others):
def __sub__(self, others):
"""magic method for subtracting two Cp2kCube instances"""
self_copy = self.copy()
if isinstance(others, Cp2kCubeNew):
if isinstance(others, Cp2kCube):
other_copy = others.copy()
self_copy.cube_vals -= other_copy.cube_vals
else:
@@ -269,15 +276,21 @@ def get_pav(self, axis="z", interpolate=False):
else:
return points, vals

def get_mav(self, l1, l2=0, ncov=1, interpolate=False):
axis="z"
def get_mav(self, l1, l2=0, ncov=1, interpolate=False, axis="z"):
cell_length = {
"x": self.cell_x,
"y": self.cell_y,
"z": self.cell_z
}
length = cell_length[axis]

pav_x, pav = self.get_pav(axis=axis, interpolate=interpolate)
theta_1_fft = fft.fft(square_wave_filter(pav_x, l1, self.cell_z))
theta_1_fft = fft.fft(square_wave_filter(pav_x, l1, length))
pav_fft = fft.fft(pav)
mav_fft = pav_fft*theta_1_fft*self.cell_z/len(pav_x)
mav_fft = pav_fft*theta_1_fft*length/len(pav_x)
if ncov == 2:
theta_2_fft = fft.fft(square_wave_filter(pav_x, l2, self.cell_z))
mav_fft = mav_fft*theta_2_fft*self.cell_z/len(pav_x)
theta_2_fft = fft.fft(square_wave_filter(pav_x, l2, length))
mav_fft = mav_fft*theta_2_fft*length/len(pav_x)
mav = fft.ifft(mav_fft)
return pav_x, np.real(mav)

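Taken together, the hunks above rename `Cp2kCubeNew` to `Cp2kCube`, keep cube data in raw atomic units (length in bohr, potential in hartree, density in e/bohr^3), and let `get_mav` work along any axis. A minimal usage sketch under those assumptions; the cube file names are hypothetical, and `au2A`/`au2eV` are the conversion factors the class's own warning message points to:

```python
# Sketch of the renamed Cp2kCube API after this PR; file names are hypothetical.
from cp2kdata.cube.cube import Cp2kCube
from cp2kdata.utils import au2A, au2eV

cube = Cp2kCube("cp2k-v_hartree-1_0.cube")   # raw values: bohr / hartree

# planar average along z, returned in raw atomic units
z, pav = cube.get_pav(axis="z")
z_angstrom = z * au2A        # bohr -> angstrom
pav_ev = pav * au2eV         # hartree -> eV

# macro average with a square-wave filter of width l1 (same raw length unit),
# along a user-chosen axis via the new `axis` keyword
z, mav = cube.get_mav(l1=4.5, ncov=1, axis="z")

# cube arithmetic via the __add__/__sub__ magic methods, e.g. a density difference
diff_cube = Cp2kCube("rho_total.cube") - Cp2kCube("rho_fragment.cube")
```
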
1 change: 1 addition & 0 deletions cp2kdata/output.py
@@ -72,6 +72,7 @@ def __init__(self, output_file=None, run_type: str=None, path_prefix=".", **kwar
self.check_run_type(run_type=self.global_info.run_type)

run_type_parser_candidates = {
"ENERGY": self.parse_energy_force,
"ENERGY_FORCE": self.parse_energy_force,
"GEO_OPT": self.parse_geo_opt,
"CELL_OPT": self.parse_cell_opt,
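This hunk adds a parser entry for plain `ENERGY` runs, dispatching them to the same routine as `ENERGY_FORCE`. A hedged sketch of what that enables; the class name `Cp2kOutput` and the output file name are assumptions, not shown in this diff:

```python
# Assumed usage: an ENERGY-only CP2K output is now parsed like ENERGY_FORCE.
from cp2kdata import Cp2kOutput

cp2k_out = Cp2kOutput("output.log", run_type="ENERGY")
```
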
2 changes: 1 addition & 1 deletion docs/cube/README.md
@@ -1,4 +1,4 @@
# cp2kdata: Processing CP2K Cube Files
# Manipulate CP2K Cube Files

The `cp2kdata` Python package provides tools for working with cube files generated by the CP2K quantum chemistry software. One of the standout features of this package is its ability to handle CP2K cube files and perform various analyses.

6 changes: 3 additions & 3 deletions docs/pdos/README.md
@@ -1,11 +1,11 @@
# Processing PDOS File
# Manipulate CP2K Pdos Files

## Processing Single PDOS File

```python
from cp2kdata.pdos import Pdos
from cp2kdata import Cp2kPdos
dosfile = "Universality-ALPHA_k2-1_50.pdos"
mypdos = Pdos(dosfile)
mypdos = Cp2kPdos(dosfile)
dos, ener = mypdos.get_dos()
```

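The arrays returned by `get_dos` in the snippet above can be plotted directly; a short illustrative sketch with matplotlib, which is not part of this PR:

```python
# Continues the Cp2kPdos snippet above; plotting choices are illustrative only.
import matplotlib.pyplot as plt

plt.plot(ener, dos)
plt.xlabel("Energy")
plt.ylabel("PDOS")
plt.savefig("pdos.png", dpi=150)
```
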
43 changes: 43 additions & 0 deletions jupyter-book/_config.yml
@@ -0,0 +1,43 @@
# Book settings
# Learn more at https://jupyterbook.org/customize/config.html

title: CP2KData Reference
author: Yong-Bin Zhuang
copyright: "2023"
#logo: logo.png

# Force re-execution of notebooks on each build.
# See https://jupyterbook.org/content/execute.html
execute:
execute_notebooks: force

# Define the name of the latex output file for PDF builds
# latex:
# latex_documents:
# targetname: book.tex

# Add a bibtex file so that we can create citations
bibtex_bibfiles:
- references.bib

# Information about where the book exists on the web
repository:
url: https://github.com/chenggroup/cp2kdata # Online location of your book
path_to_book: docs # Optional path to your book, relative to the repository root
branch: main # Which branch of the repository should be used when creating links (optional)

# Add GitHub buttons to your book
# See https://jupyterbook.org/customize/config.html#add-a-link-to-your-repository
html:
use_issues_button: true
use_repository_button: true


sphinx:
extra_extensions:
- 'sphinx.ext.autodoc'
- 'sphinx.ext.napoleon'
- 'sphinx.ext.viewcode'
- 'sphinx.ext.autosummary'
config:
autosummary_generate: True
14 changes: 14 additions & 0 deletions jupyter-book/_toc.yml
@@ -0,0 +1,14 @@
# Table of contents
# Learn more at https://jupyterbook.org/customize/toc.html

format: jb-book
root: intro
parts:
- caption: Tools
chapters:
- file: docs/cube/README
- file: docs/pdos/README
- caption: API Reference
maxdepth: 1
chapters:
- file: _api/modules
1 change: 1 addition & 0 deletions jupyter-book/docs
1 change: 1 addition & 0 deletions jupyter-book/intro.md
56 changes: 56 additions & 0 deletions jupyter-book/references.bib
@@ -0,0 +1,56 @@
---
---

@inproceedings{holdgraf_evidence_2014,
address = {Brisbane, Australia, Australia},
title = {Evidence for {Predictive} {Coding} in {Human} {Auditory} {Cortex}},
booktitle = {International {Conference} on {Cognitive} {Neuroscience}},
publisher = {Frontiers in Neuroscience},
author = {Holdgraf, Christopher Ramsay and de Heer, Wendy and Pasley, Brian N. and Knight, Robert T.},
year = {2014}
}

@article{holdgraf_rapid_2016,
title = {Rapid tuning shifts in human auditory cortex enhance speech intelligibility},
volume = {7},
issn = {2041-1723},
url = {http://www.nature.com/doifinder/10.1038/ncomms13654},
doi = {10.1038/ncomms13654},
number = {May},
journal = {Nature Communications},
author = {Holdgraf, Christopher Ramsay and de Heer, Wendy and Pasley, Brian N. and Rieger, Jochem W. and Crone, Nathan and Lin, Jack J. and Knight, Robert T. and Theunissen, Frédéric E.},
year = {2016},
pages = {13654},
file = {Holdgraf et al. - 2016 - Rapid tuning shifts in human auditory cortex enhance speech intelligibility.pdf:C\:\\Users\\chold\\Zotero\\storage\\MDQP3JWE\\Holdgraf et al. - 2016 - Rapid tuning shifts in human auditory cortex enhance speech intelligibility.pdf:application/pdf}
}

@inproceedings{holdgraf_portable_2017,
title = {Portable learning environments for hands-on computational instruction using container-and cloud-based technology to teach data science},
volume = {Part F1287},
isbn = {978-1-4503-5272-7},
doi = {10.1145/3093338.3093370},
abstract = {© 2017 ACM. There is an increasing interest in learning outside of the traditional classroom setting. This is especially true for topics covering computational tools and data science, as both are challenging to incorporate in the standard curriculum. These atypical learning environments offer new opportunities for teaching, particularly when it comes to combining conceptual knowledge with hands-on experience/expertise with methods and skills. Advances in cloud computing and containerized environments provide an attractive opportunity to improve the effciency and ease with which students can learn. This manuscript details recent advances towards using commonly-Available cloud computing services and advanced cyberinfrastructure support for improving the learning experience in bootcamp-style events. We cover the benets (and challenges) of using a server hosted remotely instead of relying on student laptops, discuss the technology that was used in order to make this possible, and give suggestions for how others could implement and improve upon this model for pedagogy and reproducibility.},
booktitle = {{ACM} {International} {Conference} {Proceeding} {Series}},
author = {Holdgraf, Christopher Ramsay and Culich, A. and Rokem, A. and Deniz, F. and Alegro, M. and Ushizima, D.},
year = {2017},
keywords = {Teaching, Bootcamps, Cloud computing, Data science, Docker, Pedagogy}
}

@article{holdgraf_encoding_2017,
title = {Encoding and decoding models in cognitive electrophysiology},
volume = {11},
issn = {16625137},
doi = {10.3389/fnsys.2017.00061},
abstract = {© 2017 Holdgraf, Rieger, Micheli, Martin, Knight and Theunissen. Cognitive neuroscience has seen rapid growth in the size and complexity of data recorded from the human brain as well as in the computational tools available to analyze this data. This data explosion has resulted in an increased use of multivariate, model-based methods for asking neuroscience questions, allowing scientists to investigate multiple hypotheses with a single dataset, to use complex, time-varying stimuli, and to study the human brain under more naturalistic conditions. These tools come in the form of “Encoding” models, in which stimulus features are used to model brain activity, and “Decoding” models, in which neural features are used to generated a stimulus output. Here we review the current state of encoding and decoding models in cognitive electrophysiology and provide a practical guide toward conducting experiments and analyses in this emerging field. Our examples focus on using linear models in the study of human language and audition. We show how to calculate auditory receptive fields from natural sounds as well as how to decode neural recordings to predict speech. The paper aims to be a useful tutorial to these approaches, and a practical introduction to using machine learning and applied statistics to build models of neural activity. The data analytic approaches we discuss may also be applied to other sensory modalities, motor systems, and cognitive systems, and we cover some examples in these areas. In addition, a collection of Jupyter notebooks is publicly available as a complement to the material covered in this paper, providing code examples and tutorials for predictive modeling in python. The aimis to provide a practical understanding of predictivemodeling of human brain data and to propose best-practices in conducting these analyses.},
journal = {Frontiers in Systems Neuroscience},
author = {Holdgraf, Christopher Ramsay and Rieger, J.W. and Micheli, C. and Martin, S. and Knight, R.T. and Theunissen, F.E.},
year = {2017},
keywords = {Decoding models, Encoding models, Electrocorticography (ECoG), Electrophysiology/evoked potentials, Machine learning applied to neuroscience, Natural stimuli, Predictive modeling, Tutorials}
}

@book{ruby,
title = {The Ruby Programming Language},
author = {Flanagan, David and Matsumoto, Yukihiro},
year = {2008},
publisher = {O'Reilly Media}
}