view.go

// Copyright 2017 The Gitea Authors. All rights reserved.
// Copyright 2014 The Gogs Authors. All rights reserved.
// Use of this source code is governed by a MIT-style
// license that can be found in the LICENSE file.

package repo

import (
	"bytes"
	"encoding/base64"
	"fmt"
	gotemplate "html/template"
	"io/ioutil"
	"net/http"
	"net/url"
	"path"
	"sort"
	"strings"
	"time"

	"code.gitea.io/gitea/models"
	"code.gitea.io/gitea/modules/base"
	"code.gitea.io/gitea/modules/cache"
	"code.gitea.io/gitea/modules/charset"
	"code.gitea.io/gitea/modules/context"
	"code.gitea.io/gitea/modules/git"
	"code.gitea.io/gitea/modules/highlight"
	"code.gitea.io/gitea/modules/lfs"
	"code.gitea.io/gitea/modules/log"
	"code.gitea.io/gitea/modules/markup"
	"code.gitea.io/gitea/modules/options"
	"code.gitea.io/gitea/modules/setting"
)

const (
	tplRepoEMPTY    base.TplName = "repo/empty"
	tplRepoHome     base.TplName = "repo/home"
	tplCourseHome   base.TplName = "repo/courseHome"
	tplWatchers     base.TplName = "repo/watchers"
	tplForks        base.TplName = "repo/forks"
	tplMigrating    base.TplName = "repo/migrating"
	tplContributors base.TplName = "repo/contributors"
)

type namedBlob struct {
	name      string
	isSymlink bool
	blob      *git.Blob
}

// FIXME: There has to be a more efficient way of doing this
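// getReadmeFileFromPath looks for a README blob directly under treePath.
// Candidates are ranked by extension (".md", then ".txt", then no extension,
// then any other name accepted by markup.IsReadmeFile); symlinks are followed
// and only regular or executable targets are kept.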
func getReadmeFileFromPath(commit *git.Commit, treePath string) (*namedBlob, error) {
	tree, err := commit.SubTree(treePath)
	if err != nil {
		return nil, err
	}
	entries, err := tree.ListEntries()
	if err != nil {
		return nil, err
	}
	var readmeFiles [4]*namedBlob
	var exts = []string{".md", ".txt", ""} // sorted by priority
	for _, entry := range entries {
		if entry.IsDir() {
			continue
		}
		for i, ext := range exts {
			if markup.IsReadmeFile(entry.Name(), ext) {
				if readmeFiles[i] == nil || base.NaturalSortLess(readmeFiles[i].name, entry.Blob().Name()) {
					name := entry.Name()
					isSymlink := entry.IsLink()
					target := entry
					if isSymlink {
						target, err = entry.FollowLinks()
						if err != nil && !git.IsErrBadLink(err) {
							return nil, err
						}
					}
					if target != nil && (target.IsExecutable() || target.IsRegular()) {
						readmeFiles[i] = &namedBlob{
							name,
							isSymlink,
							target.Blob(),
						}
					}
				}
			}
		}
		if markup.IsReadmeFile(entry.Name()) {
			if readmeFiles[3] == nil || base.NaturalSortLess(readmeFiles[3].name, entry.Blob().Name()) {
				name := entry.Name()
				isSymlink := entry.IsLink()
				if isSymlink {
					entry, err = entry.FollowLinks()
					if err != nil && !git.IsErrBadLink(err) {
						return nil, err
					}
				}
				if entry != nil && (entry.IsExecutable() || entry.IsRegular()) {
					readmeFiles[3] = &namedBlob{
						name,
						isSymlink,
						entry.Blob(),
					}
				}
			}
		}
	}
	var readmeFile *namedBlob
	for _, f := range readmeFiles {
		if f != nil {
			readmeFile = f
			break
		}
	}
	return readmeFile, nil
}

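// renderDirectory renders the file listing for the current tree path. It
// sorts the entries, resolves their last-commit info (optionally through the
// last-commit cache), picks a README (falling back to the docs, .gitea and
// .github directories at the repository root), renders it inline (reading the
// real content from LFS when the README is a pointer file), and fills in the
// latest commit, its signature verification, the commit status and the
// add/upload permission flags.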
func renderDirectory(ctx *context.Context, treeLink string) {
	tree, err := ctx.Repo.Commit.SubTree(ctx.Repo.TreePath)
	if err != nil {
		ctx.NotFoundOrServerError("Repo.Commit.SubTree", git.IsErrNotExist, err)
		return
	}
	entries, err := tree.ListEntries()
	if err != nil {
		ctx.ServerError("ListEntries", err)
		return
	}
	entries.CustomSort(base.NaturalSortLess)
	var c git.LastCommitCache
	if setting.CacheService.LastCommit.Enabled && ctx.Repo.CommitsCount >= setting.CacheService.LastCommit.CommitsCount {
		c = cache.NewLastCommitCache(ctx.Repo.Repository.FullName(), ctx.Repo.GitRepo, int64(setting.CacheService.LastCommit.TTL.Seconds()))
	}
	var latestCommit *git.Commit
	ctx.Data["Files"], latestCommit, err = entries.GetCommitsInfo(ctx.Repo.Commit, ctx.Repo.TreePath, c)
	if err != nil {
		ctx.ServerError("GetCommitsInfo", err)
		return
	}
	// 3 for the extensions in exts[] in order
	// the last one is for a readme that doesn't
	// strictly match an extension
	var readmeFiles [4]*namedBlob
	var docsEntries [3]*git.TreeEntry
	var exts = []string{".md", ".txt", ""} // sorted by priority
	for _, entry := range entries {
		if entry.IsDir() {
			lowerName := strings.ToLower(entry.Name())
			switch lowerName {
			case "docs":
				if entry.Name() == "docs" || docsEntries[0] == nil {
					docsEntries[0] = entry
				}
			case ".gitea":
				if entry.Name() == ".gitea" || docsEntries[1] == nil {
					docsEntries[1] = entry
				}
			case ".github":
				if entry.Name() == ".github" || docsEntries[2] == nil {
					docsEntries[2] = entry
				}
			}
			continue
		}
		for i, ext := range exts {
			if markup.IsReadmeFile(entry.Name(), ext) {
				log.Debug("%s", entry.Name())
				name := entry.Name()
				isSymlink := entry.IsLink()
				target := entry
				if isSymlink {
					target, err = entry.FollowLinks()
					if err != nil && !git.IsErrBadLink(err) {
						ctx.ServerError("FollowLinks", err)
						return
					}
				}
				log.Debug("%t", target == nil)
				if target != nil && (target.IsExecutable() || target.IsRegular()) {
					readmeFiles[i] = &namedBlob{
						name,
						isSymlink,
						target.Blob(),
					}
				}
			}
		}
		if markup.IsReadmeFile(entry.Name()) {
			name := entry.Name()
			isSymlink := entry.IsLink()
			if isSymlink {
				entry, err = entry.FollowLinks()
				if err != nil && !git.IsErrBadLink(err) {
					ctx.ServerError("FollowLinks", err)
					return
				}
			}
			if entry != nil && (entry.IsExecutable() || entry.IsRegular()) {
				readmeFiles[3] = &namedBlob{
					name,
					isSymlink,
					entry.Blob(),
				}
			}
		}
	}
	var readmeFile *namedBlob
	readmeTreelink := treeLink
	for _, f := range readmeFiles {
		if f != nil {
			readmeFile = f
			break
		}
	}
	if ctx.Repo.TreePath == "" && readmeFile == nil {
		for _, entry := range docsEntries {
			if entry == nil {
				continue
			}
			readmeFile, err = getReadmeFileFromPath(ctx.Repo.Commit, entry.GetSubJumpablePathName())
			if err != nil {
				ctx.ServerError("getReadmeFileFromPath", err)
				return
			}
			if readmeFile != nil {
				readmeFile.name = entry.Name() + "/" + readmeFile.name
				readmeTreelink = treeLink + "/" + entry.GetSubJumpablePathName()
				break
			}
		}
	}
	if readmeFile != nil {
		ctx.Data["RawFileLink"] = ""
		ctx.Data["ReadmeInList"] = true
		ctx.Data["ReadmeExist"] = true
		ctx.Data["FileIsSymlink"] = readmeFile.isSymlink
		if ctx.Repo.TreePath == "" {
			ctx.Data["ReadmeRelativePath"] = readmeFile.name
		} else {
			ctx.Data["ReadmeRelativePath"] = ctx.Repo.TreePath + "/" + readmeFile.name
		}
		if ctx.Repo.CanEnableEditor() {
			ctx.Data["CanEditFile"] = true
		}
		dataRc, err := readmeFile.blob.DataAsync()
		if err != nil {
			ctx.ServerError("Data", err)
			return
		}
		defer dataRc.Close()
		buf := make([]byte, 1024)
		n, _ := dataRc.Read(buf)
		buf = buf[:n]
		isTextFile := base.IsTextFile(buf)
		ctx.Data["FileIsText"] = isTextFile
		ctx.Data["FileName"] = readmeFile.name
		fileSize := int64(0)
		isLFSFile := false
		ctx.Data["IsLFSFile"] = false
		// FIXME: what happens when README file is an image?
		if isTextFile && setting.LFS.StartServer {
			meta := lfs.IsPointerFile(&buf)
			if meta != nil {
				meta, err = ctx.Repo.Repository.GetLFSMetaObjectByOid(meta.Oid)
				if err != nil && err != models.ErrLFSObjectNotExist {
					ctx.ServerError("GetLFSMetaObject", err)
					return
				}
			}
			if meta != nil {
				ctx.Data["IsLFSFile"] = true
				isLFSFile = true
				// OK read the lfs object
				var err error
				dataRc, err = lfs.ReadMetaObject(meta)
				if err != nil {
					ctx.ServerError("ReadMetaObject", err)
					return
				}
				defer dataRc.Close()
				buf = make([]byte, 1024)
				n, err = dataRc.Read(buf)
				if err != nil {
					ctx.ServerError("Data", err)
					return
				}
				buf = buf[:n]
				isTextFile = base.IsTextFile(buf)
				ctx.Data["IsTextFile"] = isTextFile
				fileSize = meta.Size
				ctx.Data["FileSize"] = meta.Size
				filenameBase64 := base64.RawURLEncoding.EncodeToString([]byte(readmeFile.name))
				ctx.Data["RawFileLink"] = fmt.Sprintf("%s%s.git/info/lfs/objects/%s/%s", setting.AppURL, ctx.Repo.Repository.FullName(), meta.Oid, filenameBase64)
			}
		}
		if !isLFSFile {
			fileSize = readmeFile.blob.Size()
		}
		if isTextFile {
			if fileSize >= setting.UI.MaxDisplayFileSize {
				// Pretend that this is a normal text file to display 'This file is too large to be shown'
				ctx.Data["IsFileTooLarge"] = true
				ctx.Data["IsTextFile"] = true
				ctx.Data["FileSize"] = fileSize
			} else {
				d, _ := ioutil.ReadAll(dataRc)
				buf = charset.ToUTF8WithFallback(append(buf, d...))
				if markupType := markup.Type(readmeFile.name); markupType != "" {
					ctx.Data["IsMarkup"] = true
					ctx.Data["MarkupType"] = string(markupType)
					ctx.Data["FileContent"] = string(markup.Render(readmeFile.name, buf, readmeTreelink, ctx.Repo.Repository.ComposeMetas()))
				} else {
					ctx.Data["IsRenderedHTML"] = true
					ctx.Data["FileContent"] = strings.Replace(
						gotemplate.HTMLEscapeString(string(buf)), "\n", `<br>`, -1,
					)
				}
			}
		}
	}
	// Show latest commit info of repository in table header,
	// or of directory if not in root directory.
	ctx.Data["LatestCommit"] = latestCommit
	verification := models.ParseCommitWithSignature(latestCommit)
	if err := models.CalculateTrustStatus(verification, ctx.Repo.Repository, nil); err != nil {
		ctx.ServerError("CalculateTrustStatus", err)
		return
	}
	ctx.Data["LatestCommitVerification"] = verification
	ctx.Data["LatestCommitUser"] = models.ValidateCommitWithEmail(latestCommit)
	statuses, err := models.GetLatestCommitStatus(ctx.Repo.Repository, ctx.Repo.Commit.ID.String(), 0)
	if err != nil {
		log.Error("GetLatestCommitStatus: %v", err)
	}
	ctx.Data["LatestCommitStatus"] = models.CalcCommitStatus(statuses)
	// Check permission to add or upload new file.
	if ctx.Repo.CanWrite(models.UnitTypeCode) && ctx.Repo.IsViewBranch {
		ctx.Data["CanAddFile"] = !ctx.Repo.Repository.IsArchived
		ctx.Data["CanUploadFile"] = setting.Repository.Upload.Enabled && !ctx.Repo.Repository.IsArchived
	}
}

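// renderFile renders a single blob at the current tree path. The first 1024
// bytes decide whether the file is text; text files that turn out to be LFS
// pointers are re-read from the LFS content store. Depending on the detected
// type the content is rendered as markup, as escaped HTML, or as a
// server-side line-numbered code view, or flagged as PDF/video/audio/image,
// and the edit, delete and LFS-lock related template variables are set.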
func renderFile(ctx *context.Context, entry *git.TreeEntry, treeLink, rawLink string) {
	ctx.Data["IsViewFile"] = true
	blob := entry.Blob()
	dataRc, err := blob.DataAsync()
	if err != nil {
		ctx.ServerError("DataAsync", err)
		return
	}
	defer dataRc.Close()
	ctx.Data["Title"] = ctx.Data["Title"].(string) + " - " + ctx.Repo.TreePath + " at " + ctx.Repo.BranchName
	fileSize := blob.Size()
	ctx.Data["FileIsSymlink"] = entry.IsLink()
	ctx.Data["FileSize"] = fileSize
	ctx.Data["FileName"] = blob.Name()
	ctx.Data["HighlightClass"] = highlight.FileNameToHighlightClass(blob.Name())
	ctx.Data["RawFileLink"] = rawLink + "/" + ctx.Repo.TreePath
	buf := make([]byte, 1024)
	n, _ := dataRc.Read(buf)
	buf = buf[:n]
	isTextFile := base.IsTextFile(buf)
	isLFSFile := false
	ctx.Data["IsTextFile"] = isTextFile
	// Check for LFS meta file
	if isTextFile && setting.LFS.StartServer {
		meta := lfs.IsPointerFile(&buf)
		if meta != nil {
			meta, err = ctx.Repo.Repository.GetLFSMetaObjectByOid(meta.Oid)
			if err != nil && err != models.ErrLFSObjectNotExist {
				ctx.ServerError("GetLFSMetaObject", err)
				return
			}
		}
		if meta != nil {
			ctx.Data["IsLFSFile"] = true
			isLFSFile = true
			// OK read the lfs object
			var err error
			dataRc, err = lfs.ReadMetaObject(meta)
			if err != nil {
				ctx.ServerError("ReadMetaObject", err)
				return
			}
			defer dataRc.Close()
			buf = make([]byte, 1024)
			n, err = dataRc.Read(buf)
			if err != nil {
				ctx.ServerError("Data", err)
				return
			}
			buf = buf[:n]
			isTextFile = base.IsTextFile(buf)
			ctx.Data["IsTextFile"] = isTextFile
			fileSize = meta.Size
			ctx.Data["FileSize"] = meta.Size
			filenameBase64 := base64.RawURLEncoding.EncodeToString([]byte(blob.Name()))
			ctx.Data["RawFileLink"] = fmt.Sprintf("%s%s.git/info/lfs/objects/%s/%s", setting.AppURL, ctx.Repo.Repository.FullName(), meta.Oid, filenameBase64)
		}
	}
	// Check LFS Lock
	lfsLock, err := ctx.Repo.Repository.GetTreePathLock(ctx.Repo.TreePath)
	ctx.Data["LFSLock"] = lfsLock
	if err != nil {
		ctx.ServerError("GetTreePathLock", err)
		return
	}
	if lfsLock != nil {
		ctx.Data["LFSLockOwner"] = lfsLock.Owner.DisplayName()
		ctx.Data["LFSLockHint"] = ctx.Tr("repo.editor.this_file_locked")
	}
	// Assume file is not editable first.
	if isLFSFile {
		ctx.Data["EditFileTooltip"] = ctx.Tr("repo.editor.cannot_edit_lfs_files")
	} else if !isTextFile {
		ctx.Data["EditFileTooltip"] = ctx.Tr("repo.editor.cannot_edit_non_text_files")
	}
	switch {
	case isTextFile:
		if fileSize >= setting.UI.MaxDisplayFileSize {
			ctx.Data["IsFileTooLarge"] = true
			break
		}
		d, _ := ioutil.ReadAll(dataRc)
		buf = charset.ToUTF8WithFallback(append(buf, d...))
		readmeExist := markup.IsReadmeFile(blob.Name())
		ctx.Data["ReadmeExist"] = readmeExist
		if markupType := markup.Type(blob.Name()); markupType != "" {
			ctx.Data["IsMarkup"] = true
			ctx.Data["MarkupType"] = markupType
			ctx.Data["FileContent"] = string(markup.Render(blob.Name(), buf, path.Dir(treeLink), ctx.Repo.Repository.ComposeMetas()))
		} else if readmeExist {
			ctx.Data["IsRenderedHTML"] = true
			ctx.Data["FileContent"] = strings.Replace(
				gotemplate.HTMLEscapeString(string(buf)), "\n", `<br>`, -1,
			)
		} else {
			// Building code view blocks with line number on server side.
			var fileContent string
			if content, err := charset.ToUTF8WithErr(buf); err != nil {
				log.Error("ToUTF8WithErr: %v", err)
				fileContent = string(buf)
			} else {
				fileContent = content
			}
			var output bytes.Buffer
			lines := strings.Split(fileContent, "\n")
			ctx.Data["NumLines"] = len(lines)
			if len(lines) == 1 && lines[0] == "" {
				// If the file is completely empty, we show zero lines at the line counter
				ctx.Data["NumLines"] = 0
			}
			ctx.Data["NumLinesSet"] = true
			// Remove blank line at the end of file
			if len(lines) > 0 && lines[len(lines)-1] == "" {
				lines = lines[:len(lines)-1]
			}
			for index, line := range lines {
				line = gotemplate.HTMLEscapeString(line)
				if index != len(lines)-1 {
					line += "\n"
				}
				output.WriteString(fmt.Sprintf(`<li class="L%d" rel="L%d">%s</li>`, index+1, index+1, line))
			}
			ctx.Data["FileContent"] = gotemplate.HTML(output.String())
			output.Reset()
			for i := 0; i < len(lines); i++ {
				output.WriteString(fmt.Sprintf(`<span id="L%[1]d" data-line-number="%[1]d"></span>`, i+1))
			}
			ctx.Data["LineNums"] = gotemplate.HTML(output.String())
		}
		if !isLFSFile {
			if ctx.Repo.CanEnableEditor() {
				if lfsLock != nil && lfsLock.OwnerID != ctx.User.ID {
					ctx.Data["CanEditFile"] = false
					ctx.Data["EditFileTooltip"] = ctx.Tr("repo.editor.this_file_locked")
				} else {
					ctx.Data["CanEditFile"] = true
					ctx.Data["EditFileTooltip"] = ctx.Tr("repo.editor.edit_this_file")
				}
			} else if !ctx.Repo.IsViewBranch {
				ctx.Data["EditFileTooltip"] = ctx.Tr("repo.editor.must_be_on_a_branch")
			} else if !ctx.Repo.CanWrite(models.UnitTypeCode) {
				ctx.Data["EditFileTooltip"] = ctx.Tr("repo.editor.fork_before_edit")
			}
		}
	case base.IsPDFFile(buf):
		ctx.Data["IsPDFFile"] = true
	case base.IsVideoFile(buf):
		ctx.Data["IsVideoFile"] = true
	case base.IsAudioFile(buf):
		ctx.Data["IsAudioFile"] = true
	case base.IsImageFile(buf):
		ctx.Data["IsImageFile"] = true
	default:
		if fileSize >= setting.UI.MaxDisplayFileSize {
			ctx.Data["IsFileTooLarge"] = true
			break
		}
		if markupType := markup.Type(blob.Name()); markupType != "" {
			d, _ := ioutil.ReadAll(dataRc)
			buf = append(buf, d...)
			ctx.Data["IsMarkup"] = true
			ctx.Data["MarkupType"] = markupType
			ctx.Data["FileContent"] = string(markup.Render(blob.Name(), buf, path.Dir(treeLink), ctx.Repo.Repository.ComposeMetas()))
		}
	}
	if ctx.Repo.CanEnableEditor() {
		if lfsLock != nil && lfsLock.OwnerID != ctx.User.ID {
			ctx.Data["CanDeleteFile"] = false
			ctx.Data["DeleteFileTooltip"] = ctx.Tr("repo.editor.this_file_locked")
		} else {
			ctx.Data["CanDeleteFile"] = true
			ctx.Data["DeleteFileTooltip"] = ctx.Tr("repo.editor.delete_this_file")
		}
	} else if !ctx.Repo.IsViewBranch {
		ctx.Data["DeleteFileTooltip"] = ctx.Tr("repo.editor.must_be_on_a_branch")
	} else if !ctx.Repo.CanWrite(models.UnitTypeCode) {
		ctx.Data["DeleteFileTooltip"] = ctx.Tr("repo.editor.must_have_write_access")
	}
}

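// safeURL strips any user:password part from a clone address so it can be
// shown on the migration progress page without leaking credentials.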
func safeURL(address string) string {
	u, err := url.Parse(address)
	if err != nil {
		return address
	}
	u.User = nil
	return u.String()
}

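// ContributorInfo is one entry of the contributors list shown on the repo
// home page and returned by ContributorsAPI, aggregated per email address.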
type ContributorInfo struct {
	UserInfo      *models.User // nil for contributor who is not a registered user
	RelAvatarLink string       `json:"rel_avatar_link"`
	UserName      string       `json:"user_name"`
	Email         string       `json:"email"`
	CommitCnt     int          `json:"commit_cnt"`
}

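// GetContributorsInfo is the JSON payload returned by ContributorsAPI.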
type GetContributorsInfo struct {
	ErrorCode       int                `json:"error_code"`
	ErrorMsg        string             `json:"error_msg"`
	Count           int                `json:"count"`
	ContributorInfo []*ContributorInfo `json:"contributor_info"`
}

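// getContributorInfo returns the entry matching the given email, or nil if
// that email has not been collected yet.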
func getContributorInfo(contributorInfos []*ContributorInfo, email string) *ContributorInfo {
	for _, c := range contributorInfos {
		if strings.Compare(c.Email, email) == 0 {
			return c
		}
	}
	return nil
}

// Home renders the repository home page
func Home(ctx *context.Context) {
	if ctx.Repo.CanEnableEditor() {
		ctx.Data["CanEditFile"] = true
	} else {
		ctx.Data["CanEditFile"] = false
	}
	if len(ctx.Repo.Units) > 0 {
		// get repo contributors info
		contributors, err := git.GetContributors(ctx.Repo.Repository.RepoPath(), ctx.Repo.BranchName)
		if err == nil && contributors != nil {
			startTime := time.Now()
			var contributorInfos []*ContributorInfo
			contributorInfoHash := make(map[string]*ContributorInfo)
			count := 0
			for _, c := range contributors {
				if count >= 25 {
					continue
				}
				if strings.Compare(c.Email, "") == 0 {
					continue
				}
				// get user info from committer email
				user, err := models.GetUserByActivateEmail(c.Email)
				if err == nil {
					// committer is system user, get info through user's primary email
					if existedContributorInfo, ok := contributorInfoHash[user.Email]; ok {
						// existed: same primary email, different committer name
						existedContributorInfo.CommitCnt += c.CommitCnt
					} else {
						// new committer info
						var newContributor = &ContributorInfo{
							user, user.RelAvatarLink(), user.Name, user.Email, c.CommitCnt,
						}
						count++
						contributorInfos = append(contributorInfos, newContributor)
						contributorInfoHash[user.Email] = newContributor
					}
				} else {
					// committer is not system user
					if existedContributorInfo, ok := contributorInfoHash[c.Email]; ok {
						// existed: same primary email, different committer name
						existedContributorInfo.CommitCnt += c.CommitCnt
					} else {
						var newContributor = &ContributorInfo{
							user, "", "", c.Email, c.CommitCnt,
						}
						count++
						contributorInfos = append(contributorInfos, newContributor)
						contributorInfoHash[c.Email] = newContributor
					}
				}
			}
			ctx.Data["ContributorInfo"] = contributorInfos
			var duration = time.Since(startTime)
			log.Info("getContributorInfo cost: %v seconds", duration.Seconds())
		}
		if ctx.Repo.Repository.IsBeingCreated() {
			task, err := models.GetMigratingTask(ctx.Repo.Repository.ID)
			if err != nil {
				ctx.ServerError("models.GetMigratingTask", err)
				return
			}
			cfg, err := task.MigrateConfig()
			if err != nil {
				ctx.ServerError("task.MigrateConfig", err)
				return
			}
			ctx.Data["Repo"] = ctx.Repo
			ctx.Data["MigrateTask"] = task
			ctx.Data["CloneAddr"] = safeURL(cfg.CloneAddr)
			ctx.HTML(200, tplMigrating)
			return
		}
		var firstUnit *models.Unit
		for _, repoUnit := range ctx.Repo.Units {
			if repoUnit.Type == models.UnitTypeCode {
				renderCode(ctx)
				return
			}
			unit, ok := models.Units[repoUnit.Type]
			if ok && (firstUnit == nil || !firstUnit.IsLessThan(unit)) {
				firstUnit = &unit
			}
		}
		if firstUnit != nil {
			ctx.Redirect(fmt.Sprintf("%s/%s%s", setting.AppSubURL, ctx.Repo.Repository.FullName(), firstUnit.URI))
			return
		}
	}
	ctx.NotFound("Home", fmt.Errorf(ctx.Tr("units.error.no_unit_allowed_repo")))
}

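// renderLicense reads the LICENSE blob at the repository root and compares it
// byte for byte against the bundled license texts; on a match the license
// name is exposed to the template as "LICENSE".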
func renderLicense(ctx *context.Context) {
	entry, err := ctx.Repo.Commit.GetTreeEntryByPath("LICENSE")
	if err != nil {
		log.Error(err.Error())
		return
	}
	blob := entry.Blob()
	dataRc, err := blob.DataAsync()
	if err != nil {
		log.Error("DataAsync: %v", err)
		return
	}
	defer dataRc.Close()
	buf, err := ioutil.ReadAll(dataRc)
	if err != nil {
		log.Error("ReadAll: %v", err)
		return
	}
	for _, f := range models.Licenses {
		license, err := options.License(f)
		if err != nil {
			log.Error("failed to get license content: %v, err:%v", f, err)
			continue
		}
		if bytes.Compare(buf, license) == 0 {
			log.Info("got matched license:%v", f)
			ctx.Data["LICENSE"] = f
			return
		}
	}
	log.Info("no matching license found, repo: %v", ctx.Repo.Repository.Name)
}

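// renderLanguageStats exposes the repository's top five language statistics
// to the template.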
func renderLanguageStats(ctx *context.Context) {
	langs, err := ctx.Repo.Repository.GetTopLanguageStats(5)
	if err != nil {
		ctx.ServerError("Repo.GetTopLanguageStats", err)
		return
	}
	ctx.Data["LanguageStats"] = langs
}

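// renderRepoTopics loads the topics attached to the repository and exposes
// them to the template.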
func renderRepoTopics(ctx *context.Context) {
	topics, err := models.FindTopics(&models.FindTopicOptions{
		RepoID: ctx.Repo.Repository.ID,
	})
	if err != nil {
		ctx.ServerError("models.FindTopics", err)
		return
	}
	ctx.Data["Topics"] = topics
}

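// renderCode is the entry point for the code view. It builds the branch, tree
// and raw links, loads topics, license and language statistics, renders either
// a directory listing or a single file for the current tree path, builds the
// breadcrumb paths and, for forked repositories, computes the number of
// upstream commits available to fetch (FetchUpstreamCnt).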
func renderCode(ctx *context.Context) {
	ctx.Data["PageIsViewCode"] = true
	if ctx.Repo.Repository.IsEmpty {
		ctx.HTML(200, tplRepoEMPTY)
		return
	}
	title := ctx.Repo.Repository.Owner.Name + "/" + ctx.Repo.Repository.Name
	if len(ctx.Repo.Repository.Description) > 0 {
		title += ": " + ctx.Repo.Repository.Description
	}
	ctx.Data["Title"] = title
	ctx.Data["RequireHighlightJS"] = true
	branchLink := ctx.Repo.RepoLink + "/src/" + ctx.Repo.BranchNameSubURL()
	treeLink := branchLink
	rawLink := ctx.Repo.RepoLink + "/raw/" + ctx.Repo.BranchNameSubURL()
	if len(ctx.Repo.TreePath) > 0 {
		treeLink += "/" + ctx.Repo.TreePath
	}
	// Get Topics of this repo
	renderRepoTopics(ctx)
	// Get license of this repo
	renderLicense(ctx)
	if ctx.Written() {
		return
	}
	// Get current entry user currently looking at.
	entry, err := ctx.Repo.Commit.GetTreeEntryByPath(ctx.Repo.TreePath)
	if err != nil {
		ctx.NotFoundOrServerError("Repo.Commit.GetTreeEntryByPath", git.IsErrNotExist, err)
		return
	}
	renderLanguageStats(ctx)
	if ctx.Written() {
		return
	}
	if entry.IsDir() {
		renderDirectory(ctx, treeLink)
	} else {
		renderFile(ctx, entry, treeLink, rawLink)
	}
	if ctx.Written() {
		return
	}
	var treeNames []string
	paths := make([]string, 0, 5)
	if len(ctx.Repo.TreePath) > 0 {
		treeNames = strings.Split(ctx.Repo.TreePath, "/")
		for i := range treeNames {
			paths = append(paths, strings.Join(treeNames[:i+1], "/"))
		}
		ctx.Data["HasParentPath"] = true
		if len(paths)-2 >= 0 {
			ctx.Data["ParentPath"] = "/" + paths[len(paths)-2]
		}
	}
	// If this is a forked repository
	if ctx.Repo.Repository.IsFork {
		// Get the branch parameters used to fetch from upstream
		/*
			// 1. /{:baseOwner}/{:baseRepoName}/compare/{:baseBranch}...{:headBranch}
			// 2. /{:baseOwner}/{:baseRepoName}/compare/{:baseBranch}...{:headOwner}:{:headBranch}
			// 3. /{:baseOwner}/{:baseRepoName}/compare/{:baseBranch}...{:headOwner}/{:headRepoName}:{:headBranch}
		*/
		baseGitRepo, err := git.OpenRepository(ctx.Repo.Repository.BaseRepo.RepoPath())
		defer baseGitRepo.Close()
		var compareInfo *git.CompareInfo
		if err != nil {
			log.Error("error open baseRepo:%s", ctx.Repo.Repository.BaseRepo.RepoPath())
			ctx.Data["FetchUpstreamCnt"] = -1 // minus value indicates error
		} else {
			if _, error := baseGitRepo.GetBranch(ctx.Repo.BranchName); error == nil {
				// base repo has the same branch, then compare between current repo branch and base repo's branch
				compareInfo, err = baseGitRepo.GetCompareInfo(ctx.Repo.Repository.RepoPath(), ctx.Repo.BranchName, ctx.Repo.BranchName)
				ctx.Data["UpstreamSameBranchName"] = true
			} else {
				// else, compare between current repo branch and base repo's default branch
				compareInfo, err = baseGitRepo.GetCompareInfo(ctx.Repo.Repository.RepoPath(), ctx.Repo.BranchName, ctx.Repo.Repository.BaseRepo.DefaultBranch)
				ctx.Data["UpstreamSameBranchName"] = false
			}
			if err == nil && compareInfo != nil {
				if compareInfo.Commits != nil {
					log.Info("compareInfo commits count: %d", compareInfo.Commits.Len())
					ctx.Data["FetchUpstreamCnt"] = compareInfo.Commits.Len()
				} else {
					log.Info("compareInfo nothing different")
					ctx.Data["FetchUpstreamCnt"] = 0
				}
			} else {
				ctx.Data["FetchUpstreamCnt"] = -1 // minus value indicates error
			}
		}
	}
	ctx.Data["Paths"] = paths
	ctx.Data["TreeLink"] = treeLink
	ctx.Data["TreeNames"] = treeNames
	ctx.Data["BranchLink"] = branchLink
	if ctx.Repo.Repository.RepoType == models.RepoCourse {
		ctx.HTML(200, tplCourseHome)
	} else {
		ctx.HTML(200, tplRepoHome)
	}
}

// RenderUserCards renders a page showing users according to the input template
func RenderUserCards(ctx *context.Context, total int, getter func(opts models.ListOptions) ([]*models.User, error), tpl base.TplName) {
	page := ctx.QueryInt("page")
	if page <= 0 {
		page = 1
	}
	pager := context.NewPagination(total, models.ItemsPerPage, page, 5)
	ctx.Data["Page"] = pager
	items, err := getter(models.ListOptions{Page: pager.Paginater.Current()})
	if err != nil {
		ctx.ServerError("getter", err)
		return
	}
	ctx.Data["Cards"] = items
	ctx.HTML(200, tpl)
}

// Watchers renders the repository's watchers
func Watchers(ctx *context.Context) {
	ctx.Data["Title"] = ctx.Tr("repo.watchers")
	ctx.Data["CardsTitle"] = ctx.Tr("repo.watchers")
	ctx.Data["PageIsWatchers"] = true
	RenderUserCards(ctx, ctx.Repo.Repository.NumWatches, ctx.Repo.Repository.GetWatchers, tplWatchers)
}

// Stars renders the users who starred the repository
func Stars(ctx *context.Context) {
	ctx.Data["Title"] = ctx.Tr("repo.stargazers")
	ctx.Data["CardsTitle"] = ctx.Tr("repo.stargazers")
	ctx.Data["PageIsStargazers"] = true
	RenderUserCards(ctx, ctx.Repo.Repository.NumStars, ctx.Repo.Repository.GetStargazers, tplWatchers)
}

// Forks renders the repository's forks
func Forks(ctx *context.Context) {
	ctx.Data["Title"] = ctx.Tr("repos.forks")
	forks, err := ctx.Repo.Repository.GetForks(models.ListOptions{})
	if err != nil {
		ctx.ServerError("GetForks", err)
		return
	}
	for _, fork := range forks {
		if err = fork.GetOwner(); err != nil {
			ctx.ServerError("GetOwner", err)
			return
		}
	}
	ctx.Data["Forks"] = forks
	ctx.HTML(200, tplForks)
}

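// Contributors renders the contributors page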
func Contributors(ctx *context.Context) {
	ctx.Data["PageIsViewCode"] = true
	ctx.HTML(http.StatusOK, tplContributors)
}

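// ContributorsAPI returns the contributors of the branch or tag given in the
// "name" query parameter as JSON, aggregated per email address and sorted by
// commit count in descending order; ErrorCode is -1 when the underlying git
// call fails.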
func ContributorsAPI(ctx *context.Context) {
	count := 0
	errorCode := 0
	errorMsg := ""
	branchOrTag := ctx.Query("name")
	contributors, err := git.GetContributors(ctx.Repo.Repository.RepoPath(), branchOrTag)
	var contributorInfos []*ContributorInfo
	if err == nil && contributors != nil {
		contributorInfoHash := make(map[string]*ContributorInfo)
		for _, c := range contributors {
			if strings.Compare(c.Email, "") == 0 {
				continue
			}
			// get user info from committer email
			user, err := models.GetUserByActivateEmail(c.Email)
			if err == nil {
				// committer is system user, get info through user's primary email
				if existedContributorInfo, ok := contributorInfoHash[user.Email]; ok {
					// existed: same primary email, different committer name
					existedContributorInfo.CommitCnt += c.CommitCnt
				} else {
					// new committer info
					var newContributor = &ContributorInfo{
						user, user.RelAvatarLink(), user.Name, user.Email, c.CommitCnt,
					}
					count++
					contributorInfos = append(contributorInfos, newContributor)
					contributorInfoHash[user.Email] = newContributor
				}
			} else {
				// committer is not system user
				if existedContributorInfo, ok := contributorInfoHash[c.Email]; ok {
					// existed: same primary email, different committer name
					existedContributorInfo.CommitCnt += c.CommitCnt
				} else {
					var newContributor = &ContributorInfo{
						user, "", "", c.Email, c.CommitCnt,
					}
					count++
					contributorInfos = append(contributorInfos, newContributor)
					contributorInfoHash[c.Email] = newContributor
				}
			}
		}
		sort.Slice(contributorInfos, func(i, j int) bool {
			return contributorInfos[i].CommitCnt > contributorInfos[j].CommitCnt
		})
	} else {
		log.Error("GetContributors failed: %v", err)
		errorCode = -1
		errorMsg = err.Error()
	}
	ctx.JSON(http.StatusOK, GetContributorsInfo{
		ErrorCode:       errorCode,
		ErrorMsg:        errorMsg,
		Count:           count,
		ContributorInfo: contributorInfos,
	})
}
  897. }