deepmd_utils.model_format package

class deepmd_utils.model_format.DescrptSeA(rcut: float, rcut_smth: float, sel: List[int], neuron: List[int] = [24, 48, 96], axis_neuron: int = 8, resnet_dt: bool = False, trainable: bool = True, type_one_side: bool = True, exclude_types: List[List[int]] = [], set_davg_zero: bool = False, activation_function: str = 'tanh', precision: str = 'float64', spin: Optional[Any] = None)[source]

Bases: NativeOP

DeepPot-SE constructed from all information (both angular and radial) of atomic configurations. The embedding takes the distance between atoms as input.

The descriptor \(\mathcal{D}^i \in \mathbb{R}^{M_1 \times M_2}\) is given by [1]

\[\mathcal{D}^i = (\mathcal{G}^i)^T \mathcal{R}^i (\mathcal{R}^i)^T \mathcal{G}^i_<\]

where \(\mathcal{R}^i \in \mathbb{R}^{N \times 4}\) is the coordinate matrix, and each row of \(\mathcal{R}^i\) can be constructed as follows

\[(\mathcal{R}^i)_j = [ \begin{array}{cccc} s(r_{ji}) & \frac{s(r_{ji})x_{ji}}{r_{ji}} & \frac{s(r_{ji})y_{ji}}{r_{ji}} & \frac{s(r_{ji})z_{ji}}{r_{ji}} \end{array} ]\]

where \(\mathbf{R}_{ji}=\mathbf{R}_j-\mathbf{R}_i = (x_{ji}, y_{ji}, z_{ji})\) is the relative coordinate and \(r_{ji}=\lVert \mathbf{R}_{ji} \rVert\) is its norm. The switching function \(s(r)\) is defined as:

\[\begin{split}s(r)= \begin{cases} \frac{1}{r}, & r<r_s \\ \frac{1}{r} \{ {(\frac{r - r_s}{ r_c - r_s})}^3 (-6 {(\frac{r - r_s}{ r_c - r_s})}^2 +15 \frac{r - r_s}{ r_c - r_s} -10) +1 \}, & r_s \leq r<r_c \\ 0, & r \geq r_c \end{cases}\end{split}\]

Each row of the embedding matrix \(\mathcal{G}^i \in \mathbb{R}^{N \times M_1}\) consists of the outputs of an embedding network \(\mathcal{N}\) applied to \(s(r_{ji})\):

\[(\mathcal{G}^i)_j = \mathcal{N}(s(r_{ji}))\]

\(\mathcal{G}^i_< \in \mathbb{R}^{N \times M_2}\) takes the first \(M_2\) columns of \(\mathcal{G}^i\). The equation of the embedding network \(\mathcal{N}\) can be found at deepmd.utils.network.embedding_net().
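
The switching function and the rows of \(\mathcal{R}^i\) follow directly from the equations above. The short NumPy sketch below only illustrates those formulas; it is not the library implementation (the package provides its own helper for the smooth weight, deepmd_utils.model_format.env_mat.compute_smooth_weight, documented below):

import numpy as np

def switch(r, rs, rc):
    # Illustration of s(r): 1/r below rs, smoothly damped between rs and rc,
    # zero beyond rc, with u = (r - rs) / (rc - rs).
    r = np.asarray(r, dtype=np.float64)
    u = (r - rs) / (rc - rs)
    damped = (1.0 / r) * (u**3 * (-6 * u**2 + 15 * u - 10) + 1.0)
    return np.where(r < rs, 1.0 / r, np.where(r < rc, damped, 0.0))

# One row of R^i for a neighbor j at relative coordinate (x, y, z):
x, y, z = 1.2, 0.3, -0.5
r = np.sqrt(x * x + y * y + z * z)
s = switch(r, rs=0.5, rc=6.0)
row = np.array([s, s * x / r, s * y / r, s * z / r])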

Parameters
rcut

The cut-off radius \(r_c\)

rcut_smth

From where the environment matrix should be smoothed \(r_s\)

sel : list[int]

sel[i] specifies the maximum number of type i atoms in the cut-off radius

neuron : list[int]

Number of neurons in each hidden layer of the embedding net \(\mathcal{N}\)

axis_neuron

Number of the axis neuron \(M_2\) (number of columns of the sub-matrix of the embedding matrix)

resnet_dt

Time-step dt in the resnet construction: y = x + dt * phi (Wx + b)

trainable

If the weights of embedding net are trainable.

type_one_side

If True, build N_types embedding nets; otherwise, build N_types^2 embedding nets

exclude_types : List[List[int]]

The excluded pairs of types which have no interaction with each other. For example, [[0, 1]] means no interaction between type 0 and type 1.

set_davg_zero

Set the shift of embedding net input to zero.

activation_function

The activation function in the embedding net. Supported options are “relu”, “relu6”, “softplus”, “sigmoid”, “tanh”, “gelu”, “gelu_tf”, “None”, “none”.

precision

The precision of the embedding net parameters. Supported options are “default”, “float16”, “float32”, “float64”, “bfloat16”.

multi_task

If the model has multiple fitting nets to train.

spin

The deepspin object.

References

1

Linfeng Zhang, Jiequn Han, Han Wang, Wissam A. Saidi, Roberto Car, and E. Weinan. 2018. End-to-end symmetry preserving inter-atomic potential energy model for finite and extended systems. In Proceedings of the 32nd International Conference on Neural Information Processing Systems (NIPS’18). Curran Associates Inc., Red Hook, NY, USA, 4441-4451.

Methods

__call__(*args, **kwargs)

Forward pass in NumPy implementation.

call(coord_ext, atype_ext, nlist)

Compute the descriptor.

cal_g

deserialize

serialize

cal_g(ss, ll)[source]
call(coord_ext, atype_ext, nlist)[source]

Compute the descriptor.

Parameters
coord_ext

The extended coordinates of atoms. shape: nf x (nall x 3)

atype_ext

The extended atom types. shape: nf x nall

nlist

The neighbor list. shape: nf x nloc x nnei

Returns
descriptor

The descriptor. shape: nf x nloc x ng x axis_neuron

classmethod deserialize(data: dict) DescrptSeA[source]
serialize() dict[source]
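
A minimal usage sketch (the hyper-parameter values below are arbitrary and chosen only for illustration):

from deepmd_utils.model_format import DescrptSeA

# One atom type with at most 46 neighbors inside the cut-off radius.
se_a = DescrptSeA(rcut=6.0, rcut_smth=0.5, sel=[46], neuron=[24, 48, 96], axis_neuron=8)

# The descriptor round-trips through its plain-dict representation.
data = se_a.serialize()
se_a_restored = DescrptSeA.deserialize(data)

# call() takes coord_ext (nf x (nall x 3)), atype_ext (nf x nall) and
# nlist (nf x nloc x nnei) and returns the descriptor, as documented above.
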
deepmd_utils.model_format.EmbeddingNet

alias of EN

class deepmd_utils.model_format.EnvMat(rcut, rcut_smth)[source]

Bases: NativeOP

Methods

__call__(*args, **kwargs)

Forward pass in NumPy implementation.

call(coord_ext, atype_ext, nlist[, davg, dstd])

Compute the environment matrix.

deserialize

serialize

call(coord_ext: ndarray, atype_ext: ndarray, nlist: ndarray, davg: Optional[ndarray] = None, dstd: Optional[ndarray] = None) ndarray[source]

Compute the environment matrix.

Parameters
nlist

The neighbor list. shape: nf x nloc x nnei

coord_ext

The extended coordinates of atoms. shape: nf x (nall x 3)

atype_ext

The extended atom types. shape: nf x nall

davg

The data avg. shape: nt x nnei x 4

dstd

The inverse of data std. shape: nt x nnei x 4

Returns
env_mat

The environment matrix. shape: nf x nloc x nnei x 4

switch

The value of the switch function. shape: nf x nloc x nnei

classmethod deserialize(data: dict) EnvMat[source]
serialize() dict[source]
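
A minimal sketch of computing the environment matrix for one frame with two atoms of a single type; the concrete coordinates and the toy neighbor list are illustrative assumptions, while the shapes follow the call() documentation above:

import numpy as np

from deepmd_utils.model_format import EnvMat

em = EnvMat(rcut=6.0, rcut_smth=0.5)

# One frame, two local atoms, no ghost atoms (nall == nloc == 2); each atom
# sees the other as its single neighbor.
coord_ext = np.array([[0.0, 0.0, 0.0, 1.5, 0.0, 0.0]])  # nf x (nall x 3)
atype_ext = np.array([[0, 0]])                          # nf x nall
nlist = np.array([[[1], [0]]])                          # nf x nloc x nnei

# Per the Returns section above, this yields the environment matrix
# (nf x nloc x nnei x 4) together with the switch-function values.
out = em.call(coord_ext, atype_ext, nlist)
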
deepmd_utils.model_format.FittingNet

alias of FN

class deepmd_utils.model_format.FittingOutputDef(var_defs: List[OutputVariableDef])[source]

Bases: object

Defines the shapes and other properties of the fitting network outputs.

It is assumed that the fitting network outputs variables for each local atom. This class defines all the outputs.

Parameters
var_defs

List of output variable definitions.

Methods

get_data

keys

get_data() Dict[str, OutputVariableDef][source]
keys()[source]
class deepmd_utils.model_format.ModelOutputDef(fit_defs: FittingOutputDef)[source]

Bases: object

Defines the shapes and other properties of the model outputs.

The model reduces and differentiates the fitting outputs if applicable. If a variable is named foo, then the reduced variable is called foo_redu, the derivative w.r.t. coordinates is called foo_derv_r, and the derivative w.r.t. the cell is called foo_derv_c.

Parameters
fit_defs

Definition for the fitting net output

Methods

get_data

keys

keys_derv_c

keys_derv_r

keys_outp

keys_redu

get_data(key: str) Dict[str, OutputVariableDef][source]
keys()[source]
keys_derv_c()[source]
keys_derv_r()[source]
keys_outp()[source]
keys_redu()[source]
class deepmd_utils.model_format.NativeLayer(num_in, num_out, bias: bool = True, use_timestep: bool = False, activation_function: Optional[str] = None, resnet: bool = False, precision: str = 'float64')[source]

Bases: NativeOP

Native representation of a layer.

Parameters
w : np.ndarray, optional

The weights of the layer.

b : np.ndarray, optional

The biases of the layer.

idt : np.ndarray, optional

The identity matrix of the layer.

activation_function : str, optional

The activation function of the layer.

resnet : bool, optional

Whether the layer is a residual layer.

Methods

__call__(*args, **kwargs)

Forward pass in NumPy implementation.

call(x)

Forward pass.

deserialize(data)

Deserialize the layer from a dict.

serialize()

Serialize the layer to a dict.

check_shape_consistency

check_type_consistency

dim_in

dim_out

call(x: ndarray) ndarray[source]

Forward pass.

Parameters
x : np.ndarray

The input.

Returns
np.ndarray

The output.

check_shape_consistency()[source]
check_type_consistency()[source]
classmethod deserialize(data: dict) NativeLayer[source]

Deserialize the layer from a dict.

Parameters
data : dict

The dict to deserialize from.

dim_in() int[source]
dim_out() int[source]
serialize() dict[source]

Serialize the layer to a dict.

Returns
dict

The serialized layer.
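
A minimal sketch, assuming (as the num_in/num_out signature suggests) that the layer initializes its parameters at construction; the sizes and seed are arbitrary:

import numpy as np

from deepmd_utils.model_format import NativeLayer

# A 4 -> 8 layer with a tanh activation.
layer = NativeLayer(4, 8, activation_function="tanh")

x = np.random.default_rng(0).normal(size=(5, 4))
y = layer.call(x)  # forward pass

# serialize()/deserialize() round-trip the layer through a plain dict.
layer_restored = NativeLayer.deserialize(layer.serialize())
y_restored = layer_restored.call(x)  # should reproduce y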

deepmd_utils.model_format.NativeNet

alias of NN

class deepmd_utils.model_format.NativeOP[source]

Bases: ABC

The unit operation of a native model.

Methods

__call__(*args, **kwargs)

Forward pass in NumPy implementation.

call(*args, **kwargs)

Forward pass in NumPy implementation.

call(*args, **kwargs)[source]

Forward pass in NumPy implementation.

class deepmd_utils.model_format.NetworkCollection(ndim: int, ntypes: int, network_type: str = 'network', networks: ~typing.List[~typing.Union[~deepmd_utils.model_format.network.make_multilayer_network.<locals>.NN, dict]] = [])[source]

Bases: object

A collection of networks for multiple elements.

The number of dimensions for types might be 0, 1, or 2.

- 0: embedding or fitting with type embedding, in ()
- 1: embedding with type_one_side, or fitting, in (type_i)
- 2: embedding without type_one_side, in (type_i, type_j)

Parameters
ndim : int

The number of dimensions.

network_type : str, optional

The type of the network.

networks : dict, optional

The networks to initialize with.

Methods

check_completeness()

Check whether the collection is complete.

deserialize(data)

Deserialize the networks from a dict.

serialize()

Serialize the networks to a dict.

NETWORK_TYPE_MAP: ClassVar[Dict[str, type]] = {'embedding_network': <class 'deepmd_utils.model_format.network.make_embedding_network.<locals>.EN'>, 'fitting_network': <class 'deepmd_utils.model_format.network.make_fitting_network.<locals>.FN'>, 'network': <class 'deepmd_utils.model_format.network.make_multilayer_network.<locals>.NN'>}
check_completeness()[source]

Check whether the collection is complete.

Raises
RuntimeError

If the collection is incomplete.

classmethod deserialize(data: dict) NetworkCollection[source]

Deserialize the networks from a dict.

Parameters
data : dict

The dict to deserialize from.

serialize() dict[source]

Serialize the networks to a dict.

Returns
dict

The serialized networks.
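
A minimal sketch of an (intentionally incomplete) collection; how the individual networks are registered is omitted here:

from deepmd_utils.model_format import NetworkCollection

# One type dimension (e.g. one embedding net per atom type) for two atom types.
collection = NetworkCollection(ndim=1, ntypes=2, network_type="embedding_network")

try:
    collection.check_completeness()  # no networks registered yet
except RuntimeError as err:
    print("incomplete collection:", err)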

class deepmd_utils.model_format.OutputVariableDef(name: str, shape: List[int], reduciable: bool = False, differentiable: bool = False, atomic: bool = True)[source]

Bases: object

Defines the shape and other properties of one output variable.

It is assumed that the fitting network outputs variables for each local atom. This class defines one output variable, including its name, shape, reducibility and differentiability.

Parameters
name

Name of the output variable. Notice that the xxxx_redu, xxxx_derv_c, xxxx_derv_r are reserved names that should not be used to define variables.

shape

The shape of the variable, e.g. energy should be [1], dipole should be [3], polarizability should be [3, 3].

reduciable

If the variable is reduced.

differentiable

If the variable is differentiated with respect to the coordinates of atoms and the cell tensor (pbc case). Only reduciable variables are differentiable.
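
A minimal sketch combining OutputVariableDef, FittingOutputDef and ModelOutputDef; the variable name "energy" and its shape are illustrative:

from deepmd_utils.model_format import (
    FittingOutputDef,
    ModelOutputDef,
    OutputVariableDef,
)

# A per-atom energy: scalar shape, reduced over atoms and differentiated.
energy = OutputVariableDef("energy", [1], reduciable=True, differentiable=True)

fit_def = FittingOutputDef([energy])
model_def = ModelOutputDef(fit_def)

# Besides "energy" itself, the model-level keys are expected to include the
# derived names "energy_redu", "energy_derv_r" and "energy_derv_c"
# (see ModelOutputDef above).
print(list(model_def.keys()))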

deepmd_utils.model_format.fitting_check_output(cls)[source]

Check if the output of the Fitting is consistent with the definition.

Two methods are assumed to be provided by the Fitting:

1. Fitting.output_def, which gives the output definition.
2. Fitting.__call__, which defines the forward path of the fitting.
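
A minimal sketch of a class satisfying this contract; the return convention of __call__ (a dict mapping variable names to per-atom arrays) and the toy shapes are assumptions, and the class is only defined, not called:

import numpy as np

from deepmd_utils.model_format import (
    FittingOutputDef,
    OutputVariableDef,
    fitting_check_output,
)

@fitting_check_output
class ToyFitting:
    # Hypothetical fitting with a single scalar per-atom output.

    def output_def(self):
        return FittingOutputDef(
            [OutputVariableDef("energy", [1], reduciable=True, differentiable=True)]
        )

    def __call__(self, nf=1, nloc=4):
        # Assumed convention: variable name -> per-atom array.
        return {"energy": np.zeros((nf, nloc, 1))}

# Calling an instance would run the consistency check against output_def().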

deepmd_utils.model_format.get_deriv_name(name: str) Tuple[str, str][source]
deepmd_utils.model_format.get_reduce_name(name: str) str[source]
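
Both helpers follow the naming convention described for ModelOutputDef above; a short sketch (the expected strings follow that convention):

from deepmd_utils.model_format import get_deriv_name, get_reduce_name

print(get_reduce_name("energy"))  # expected: "energy_redu"
print(get_deriv_name("energy"))   # expected: ("energy_derv_r", "energy_derv_c")
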
deepmd_utils.model_format.load_dp_model(filename: str) dict[source]

Load a DP model from a file in the native format.

Parameters
filename : str

The filename to load from.

Returns
dict

The loaded model dict, including meta information.

deepmd_utils.model_format.make_embedding_network(T_Network, T_NetworkLayer)[source]
deepmd_utils.model_format.make_fitting_network(T_EmbeddingNet, T_Network, T_NetworkLayer)[source]
deepmd_utils.model_format.make_multilayer_network(T_NetworkLayer, ModuleBase)[source]
deepmd_utils.model_format.model_check_output(cls)[source]

Check if the output of the Model is consistent with the definition.

Two methods are assumed to be provided by the Model:

1. Model.output_def, which gives the output definition.
2. Model.__call__, which defines the forward path of the model.

deepmd_utils.model_format.save_dp_model(filename: str, model_dict: dict, extra_info: Optional[dict] = None)[source]

Save a DP model to a file in the native format.

Parameters
filename : str

The filename to save to.

model_dict : dict

The model dict to save.

extra_info : dict, optional

Extra meta information to save.

deepmd_utils.model_format.traverse_model_dict(model_obj, callback: callable, is_variable: bool = False)[source]

Traverse a model dict and call callback on each variable.

Parameters
model_obj : object

The model object to traverse.

callback : callable

The callback function to call on each variable.

is_variable : bool, optional

Whether the current node is a variable.

Returns
object

The model object after traversing.

Submodules

deepmd_utils.model_format.common module

class deepmd_utils.model_format.common.NativeOP[source]

Bases: ABC

The unit operation of a native model.

Methods

__call__(*args, **kwargs)

Forward pass in NumPy implementation.

call(*args, **kwargs)

Forward pass in NumPy implementation.

call(*args, **kwargs)[source]

Forward pass in NumPy implementation.

deepmd_utils.model_format.env_mat module

class deepmd_utils.model_format.env_mat.EnvMat(rcut, rcut_smth)[source]

Bases: NativeOP

Methods

__call__(*args, **kwargs)

Forward pass in NumPy implementation.

call(coord_ext, atype_ext, nlist[, davg, dstd])

Compute the environment matrix.

deserialize

serialize

call(coord_ext: ndarray, atype_ext: ndarray, nlist: ndarray, davg: Optional[ndarray] = None, dstd: Optional[ndarray] = None) ndarray[source]

Compute the environment matrix.

Parameters
nlist

The neighbor list. shape: nf x nloc x nnei

coord_ext

The extended coordinates of atoms. shape: nf x (nall x 3)

atype_ext

The extended atom types. shape: nf x nall

davg

The data avg. shape: nt x nnei x 4

dstd

The inverse of data std. shape: nt x nnei x 4

Returns
env_mat

The environment matrix. shape: nf x nloc x nnei x 4

switch

The value of the switch function. shape: nf x nloc x nnei

classmethod deserialize(data: dict) EnvMat[source]
serialize() dict[source]
deepmd_utils.model_format.env_mat.compute_smooth_weight(distance: ndarray, rmin: float, rmax: float)[source]

Compute smooth weight for descriptor elements.
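
A short usage sketch; the numeric values are arbitrary, and the exact range of the returned weight is not spelled out on this page (it is assumed to switch contributions off smoothly between rmin and rmax, cf. the switching function documented for DescrptSeA):

import numpy as np

from deepmd_utils.model_format.env_mat import compute_smooth_weight

distance = np.linspace(0.1, 8.0, 5)
weight = compute_smooth_weight(distance, rmin=0.5, rmax=6.0)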

deepmd_utils.model_format.network module

Native DP model format for multiple backends.

See issue #2982 for more information.

class deepmd_utils.model_format.network.Counter[source]

Bases: object

A callable counter.

Examples

>>> counter = Counter()
>>> counter()
0
>>> counter()
1

Methods

__call__()

Call self as a function.

deepmd_utils.model_format.network.EmbeddingNet

alias of EN

deepmd_utils.model_format.network.FittingNet

alias of FN

class deepmd_utils.model_format.network.NativeLayer(num_in, num_out, bias: bool = True, use_timestep: bool = False, activation_function: Optional[str] = None, resnet: bool = False, precision: str = 'float64')[source]

Bases: NativeOP

Native representation of a layer.

Parameters
w : np.ndarray, optional

The weights of the layer.

b : np.ndarray, optional

The biases of the layer.

idt : np.ndarray, optional

The identity matrix of the layer.

activation_function : str, optional

The activation function of the layer.

resnet : bool, optional

Whether the layer is a residual layer.

Methods

__call__(*args, **kwargs)

Forward pass in NumPy implementation.

call(x)

Forward pass.

deserialize(data)

Deserialize the layer from a dict.

serialize()

Serialize the layer to a dict.

check_shape_consistency

check_type_consistency

dim_in

dim_out

call(x: ndarray) ndarray[source]

Forward pass.

Parameters
x : np.ndarray

The input.

Returns
np.ndarray

The output.

check_shape_consistency()[source]
check_type_consistency()[source]
classmethod deserialize(data: dict) NativeLayer[source]

Deserialize the layer from a dict.

Parameters
data : dict

The dict to deserialize from.

dim_in() int[source]
dim_out() int[source]
serialize() dict[source]

Serialize the layer to a dict.

Returns
dict

The serialized layer.

deepmd_utils.model_format.network.NativeNet

alias of NN

class deepmd_utils.model_format.network.NetworkCollection(ndim: int, ntypes: int, network_type: str = 'network', networks: ~typing.List[~typing.Union[~deepmd_utils.model_format.network.make_multilayer_network.<locals>.NN, dict]] = [])[source]

Bases: object

A collection of networks for multiple elements.

The number of dimensions for types might be 0, 1, or 2.

- 0: embedding or fitting with type embedding, in ()
- 1: embedding with type_one_side, or fitting, in (type_i)
- 2: embedding without type_one_side, in (type_i, type_j)

Parameters
ndim : int

The number of dimensions.

network_type : str, optional

The type of the network.

networks : dict, optional

The networks to initialize with.

Methods

check_completeness()

Check whether the collection is complete.

deserialize(data)

Deserialize the networks from a dict.

serialize()

Serialize the networks to a dict.

NETWORK_TYPE_MAP: ClassVar[Dict[str, type]] = {'embedding_network': <class 'deepmd_utils.model_format.network.make_embedding_network.<locals>.EN'>, 'fitting_network': <class 'deepmd_utils.model_format.network.make_fitting_network.<locals>.FN'>, 'network': <class 'deepmd_utils.model_format.network.make_multilayer_network.<locals>.NN'>}
check_completeness()[source]

Check whether the collection is complete.

Raises
RuntimeError

If the collection is incomplete.

classmethod deserialize(data: dict) NetworkCollection[source]

Deserialize the networks from a dict.

Parameters
data : dict

The dict to deserialize from.

serialize() dict[source]

Serialize the networks to a dict.

Returns
dict

The serialized networks.

deepmd_utils.model_format.network.load_dp_model(filename: str) dict[source]

Load a DP model from a file in the native format.

Parameters
filename : str

The filename to load from.

Returns
dict

The loaded model dict, including meta information.

deepmd_utils.model_format.network.make_embedding_network(T_Network, T_NetworkLayer)[source]
deepmd_utils.model_format.network.make_fitting_network(T_EmbeddingNet, T_Network, T_NetworkLayer)[source]
deepmd_utils.model_format.network.make_multilayer_network(T_NetworkLayer, ModuleBase)[source]
deepmd_utils.model_format.network.save_dp_model(filename: str, model_dict: dict, extra_info: Optional[dict] = None)[source]

Save a DP model to a file in the native format.

Parameters
filename : str

The filename to save to.

model_dict : dict

The model dict to save.

extra_info : dict, optional

Extra meta information to save.

deepmd_utils.model_format.network.traverse_model_dict(model_obj, callback: callable, is_variable: bool = False)[source]

Traverse a model dict and call callback on each variable.

Parameters
model_obj : object

The model object to traverse.

callback : callable

The callback function to call on each variable.

is_variable : bool, optional

Whether the current node is a variable.

Returns
object

The model object after traversing.

deepmd_utils.model_format.output_def module

class deepmd_utils.model_format.output_def.FittingOutputDef(var_defs: List[OutputVariableDef])[source]

Bases: object

Defines the shapes and other properties of the fitting network outputs.

It is assumed that the fitting network outputs variables for each local atom. This class defines all the outputs.

Parameters
var_defs

List of output variable definitions.

Methods

get_data

keys

get_data() Dict[str, OutputVariableDef][source]
keys()[source]
class deepmd_utils.model_format.output_def.ModelOutputDef(fit_defs: FittingOutputDef)[source]

Bases: object

Defines the shapes and other properties of the model outputs.

The model reduces and differentiates the fitting outputs if applicable. If a variable is named foo, then the reduced variable is called foo_redu, the derivative w.r.t. coordinates is called foo_derv_r, and the derivative w.r.t. the cell is called foo_derv_c.

Parameters
fit_defs

Definition for the fitting net output

Methods

get_data

keys

keys_derv_c

keys_derv_r

keys_outp

keys_redu

get_data(key: str) Dict[str, OutputVariableDef][source]
keys()[source]
keys_derv_c()[source]
keys_derv_r()[source]
keys_outp()[source]
keys_redu()[source]
class deepmd_utils.model_format.output_def.OutputVariableDef(name: str, shape: List[int], reduciable: bool = False, differentiable: bool = False, atomic: bool = True)[source]

Bases: object

Defines the shape and other properties of one output variable.

It is assumed that the fitting network outputs variables for each local atom. This class defines one output variable, including its name, shape, reducibility and differentiability.

Parameters
name

Name of the output variable. Notice that the xxxx_redu, xxxx_derv_c, xxxx_derv_r are reserved names that should not be used to define variables.

shape

The shape of the variable, e.g. energy should be [1], dipole should be [3], polarizability should be [3, 3].

reduciable

If the variable is reduced.

differentiable

If the variable is differentiated with respect to the coordinates of atoms and the cell tensor (pbc case). Only reduciable variables are differentiable.

deepmd_utils.model_format.output_def.check_shape(shape: List[int], def_shape: List[int])[source]

Check if the shape satisfies the defined shape.

deepmd_utils.model_format.output_def.check_var(var, var_def)[source]
deepmd_utils.model_format.output_def.do_derivative(def_outp: FittingOutputDef) Tuple[Dict[str, OutputVariableDef], Dict[str, OutputVariableDef]][source]
deepmd_utils.model_format.output_def.do_reduce(def_outp: FittingOutputDef) Dict[str, OutputVariableDef][source]
deepmd_utils.model_format.output_def.fitting_check_output(cls)[source]

Check if the output of the Fitting is consistent with the definition.

Two methods are assumed to be provided by the Fitting:

1. Fitting.output_def, which gives the output definition.
2. Fitting.__call__, which defines the forward path of the fitting.

deepmd_utils.model_format.output_def.get_deriv_name(name: str) Tuple[str, str][source]
deepmd_utils.model_format.output_def.get_reduce_name(name: str) str[source]
deepmd_utils.model_format.output_def.model_check_output(cls)[source]

Check if the output of the Model is consistent with the definition.

Two methods are assumed to be provided by the Model:

1. Model.output_def, which gives the output definition.
2. Model.__call__, which defines the forward path of the model.

deepmd_utils.model_format.se_e2_a module

class deepmd_utils.model_format.se_e2_a.DescrptSeA(rcut: float, rcut_smth: float, sel: List[int], neuron: List[int] = [24, 48, 96], axis_neuron: int = 8, resnet_dt: bool = False, trainable: bool = True, type_one_side: bool = True, exclude_types: List[List[int]] = [], set_davg_zero: bool = False, activation_function: str = 'tanh', precision: str = 'float64', spin: Optional[Any] = None)[source]

Bases: NativeOP

DeepPot-SE constructed from all information (both angular and radial) of atomic configurations. The embedding takes the distance between atoms as input.

The descriptor \(\mathcal{D}^i \in \mathbb{R}^{M_1 \times M_2}\) is given by [1]

\[\mathcal{D}^i = (\mathcal{G}^i)^T \mathcal{R}^i (\mathcal{R}^i)^T \mathcal{G}^i_<\]

where \(\mathcal{R}^i \in \mathbb{R}^{N \times 4}\) is the coordinate matrix, and each row of \(\mathcal{R}^i\) can be constructed as follows

\[(\mathcal{R}^i)_j = [ \begin{array}{cccc} s(r_{ji}) & \frac{s(r_{ji})x_{ji}}{r_{ji}} & \frac{s(r_{ji})y_{ji}}{r_{ji}} & \frac{s(r_{ji})z_{ji}}{r_{ji}} \end{array} ]\]

where \(\mathbf{R}_{ji}=\mathbf{R}_j-\mathbf{R}_i = (x_{ji}, y_{ji}, z_{ji})\) is the relative coordinate and \(r_{ji}=\lVert \mathbf{R}_{ji} \rVert\) is its norm. The switching function \(s(r)\) is defined as:

\[\begin{split}s(r)= \begin{cases} \frac{1}{r}, & r<r_s \\ \frac{1}{r} \{ {(\frac{r - r_s}{ r_c - r_s})}^3 (-6 {(\frac{r - r_s}{ r_c - r_s})}^2 +15 \frac{r - r_s}{ r_c - r_s} -10) +1 \}, & r_s \leq r<r_c \\ 0, & r \geq r_c \end{cases}\end{split}\]

Each row of the embedding matrix \(\mathcal{G}^i \in \mathbb{R}^{N \times M_1}\) consists of the outputs of an embedding network \(\mathcal{N}\) applied to \(s(r_{ji})\):

\[(\mathcal{G}^i)_j = \mathcal{N}(s(r_{ji}))\]

\(\mathcal{G}^i_< \in \mathbb{R}^{N \times M_2}\) takes the first \(M_2\) columns of \(\mathcal{G}^i\). The equation of the embedding network \(\mathcal{N}\) can be found at deepmd.utils.network.embedding_net().

Parameters
rcut

The cut-off radius \(r_c\)

rcut_smth

From where the environment matrix should be smoothed \(r_s\)

sel : list[int]

sel[i] specifies the maximum number of type i atoms in the cut-off radius

neuron : list[int]

Number of neurons in each hidden layer of the embedding net \(\mathcal{N}\)

axis_neuron

Number of the axis neuron \(M_2\) (number of columns of the sub-matrix of the embedding matrix)

resnet_dt

Time-step dt in the resnet construction: y = x + dt * phi (Wx + b)

trainable

If the weights of embedding net are trainable.

type_one_side

If True, build N_types embedding nets; otherwise, build N_types^2 embedding nets

exclude_types : List[List[int]]

The excluded pairs of types which have no interaction with each other. For example, [[0, 1]] means no interaction between type 0 and type 1.

set_davg_zero

Set the shift of embedding net input to zero.

activation_function

The activation function in the embedding net. Supported options are “relu”, “relu6”, “softplus”, “sigmoid”, “tanh”, “gelu”, “gelu_tf”, “None”, “none”.

precision

The precision of the embedding net parameters. Supported options are “default”, “float16”, “float32”, “float64”, “bfloat16”.

multi_task

If the model has multiple fitting nets to train.

spin

The deepspin object.

References

1

Linfeng Zhang, Jiequn Han, Han Wang, Wissam A. Saidi, Roberto Car, and E. Weinan. 2018. End-to-end symmetry preserving inter-atomic potential energy model for finite and extended systems. In Proceedings of the 32nd International Conference on Neural Information Processing Systems (NIPS’18). Curran Associates Inc., Red Hook, NY, USA, 4441-4451.

Methods

__call__(*args, **kwargs)

Forward pass in NumPy implementation.

call(coord_ext, atype_ext, nlist)

Compute the descriptor.

cal_g

deserialize

serialize

cal_g(ss, ll)[source]
call(coord_ext, atype_ext, nlist)[source]

Compute the descriptor.

Parameters
coord_ext

The extended coordinates of atoms. shape: nf x (nall x 3)

atype_ext

The extended atom types. shape: nf x nall

nlist

The neighbor list. shape: nf x nloc x nnei

Returns
descriptor

The descriptor. shape: nf x nloc x ng x axis_neuron

classmethod deserialize(data: dict) DescrptSeA[source]
serialize() dict[source]