
function.proto 4.1 kB

syntax = "proto3";

package domi.tensorflow;
option cc_enable_arenas = true;
option java_outer_classname = "FunctionProtos";
option java_multiple_files = true;
option java_package = "org.tensorflow.framework";

import "attr_value.proto";
import "node_def.proto";
import "op_def.proto";

// A library is a set of named functions.
message FunctionDefLibrary {
  repeated FunctionDef function = 1;
  repeated GradientDef gradient = 2;
}

// A function can be instantiated when the runtime can bind every attr
// with a value. When a GraphDef has a call to a function, it must
// have binding for every attr defined in the signature.
//   * device spec, etc.
message FunctionDef {
  // The definition of the function's name, arguments, return values,
  // attrs etc.
  OpDef signature = 1;

  // Attributes specific to this function definition.
  map<string, AttrValue> attr = 5;

  // NOTE: field id 2 deleted on Jan 11, 2017, GraphDef version 21.
  reserved 2;

  // In both of the following fields, there is the need to specify an
  // output that is used as either the input to another node (in
  // `node_def`) or as a return value of the function (in `ret`).
  // Unlike the NodeDefs in GraphDef, we need to be able to specify a
  // list in some cases (instead of just single outputs). Also, we
  // need to be able to deal with lists of unknown length (so the
  // output index may not be known at function definition time). So
  // we use the following format instead:
  // * "fun_in" where "fun_in" is the name of a function input arg in
  //   the `signature` field above. This represents that input, whether
  //   it is a single tensor or a list.
  // * "fun_in:0" gives the first element of a function input arg (a
  //   non-list input is considered a list of length 1 for these
  //   purposes).
  // * "node:out" where "node" is the name of a node in `node_def` and
  //   "out" is the name of one of its op's output arguments (the name
  //   comes from the OpDef of the node's op). This represents that
  //   node's output, whether it is a single tensor or a list.
  //   Note: We enforce that an op's output arguments are never
  //   renamed in the backwards-compatibility test.
  // * "node:out:0" gives the first element of a node output arg (a
  //   non-list output is considered a list of length 1 for these
  //   purposes).
  //
  // NOT CURRENTLY SUPPORTED (but may be in the future):
  // * "node:out:-1" gives the last element in a node output list
  // * "node:out:1:" gives a list with all but the first element in a
  //   node output list
  // * "node:out::-1" gives a list with all but the last element in a
  //   node output list

  // The body of the function. Unlike the NodeDefs in a GraphDef, attrs
  // may have values of type `placeholder` and the `input` field uses
  // the "output" format above.
  //
  // By convention, "op" in node_def is resolved by consulting with a
  // user-defined library first. If not resolved, "func" is assumed to
  // be a builtin op.
  repeated NodeDef node_def = 3;

  // A mapping from the output arg names from `signature` to the
  // outputs from `node_def` that should be returned by the function.
  map<string, string> ret = 4;
}
// GradientDef defines the gradient function of a function defined in
// a function library.
//
// A gradient function g (specified by gradient_func) for a function f
// (specified by function_name) must satisfy the following:
//
// The function 'f' must be a numerical function which takes N inputs
// and produces M outputs. Its gradient function 'g' is a function
// taking N + M inputs and producing N outputs.
//
// I.e. if we have
//    (y1, y2, ..., y_M) = f(x1, x2, ..., x_N),
// then g is
//    (dL/dx1, dL/dx2, ..., dL/dx_N) = g(x1, x2, ..., x_N,
//                                       dL/dy1, dL/dy2, ..., dL/dy_M),
// where L is a scalar-valued function of (x1, x2, ..., x_N) (e.g., the
// loss function). dL/dx_i is the partial derivative of L with respect
// to x_i.
message GradientDef {
  string function_name = 1;  // The function name.
  string gradient_func = 2;  // The gradient function's name.
}
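
The output-naming convention described in the FunctionDef comments is easiest to see with a concrete message. The following is a minimal sketch, not taken from this repository, that assembles a FunctionDefLibrary in Python. It assumes a function_pb2 module has been generated from this file with protoc; the "Square" function, the node name "mul", and the gradient name "SquareGrad" are made-up examples.

# Minimal sketch (assumptions: function_pb2 generated from this file with
# protoc; "Square", "mul" and "SquareGrad" are illustrative names only).
import function_pb2

lib = function_pb2.FunctionDefLibrary()

# Square(x) -> y, implemented with a single Mul node.
fdef = lib.function.add()
fdef.signature.name = "Square"
fdef.signature.input_arg.add().name = "x"
fdef.signature.output_arg.add().name = "y"

node = fdef.node_def.add()
node.name = "mul"
node.op = "Mul"
node.input.extend(["x", "x"])  # "fun_in" form: names of signature input args

# "node:out:idx" form: node "mul", output arg "z" (the output name in Mul's
# OpDef), element 0 of that length-1 output list.
fdef.ret["y"] = "mul:z:0"

# Register the gradient function for Square by name.
grad = lib.gradient.add()
grad.function_name = "Square"
grad.gradient_func = "SquareGrad"

print(lib)  # dump the library in protobuf text format

Under the GradientDef contract above, a gradient function for Square would take (x, dL/dy) and return dL/dx = 2 * x * dL/dy, i.e. N + M = 2 inputs and N = 1 output.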

The Graph Engine (GE) module is a submodule of MindSpore. It is implemented in C++ and sits between the front-end module ME and the underlying hardware, bridging the two. GE takes the graph delivered by ME as input, applies a series of deep graph-optimization passes, and outputs a graph that can run efficiently on the underlying hardware. GE performs optimizations tailored to the hardware architecture of the Ascend AI processor in order to fully exploit its compute power. During model training and inference, GE is invoked automatically and is transparent to the user. GE consists of two main parts, GE API and GE Core; the detailed architecture diagram is shown below.