Should you shut down your computer every day? (2019-09-07)

Clocking out is a happy moment, but it often comes with a dilemma: should I shut down the computer today or not?


If the computer runs all five workdays and is only shut down on the weekend, does that do much harm?


Conclusion first: leaving the computer on for long stretches is indeed harmful, in two ways: it wastes energy, and it damages the computer.


What exactly are the harms, and what should you do about them? The answers, one by one, below.


01

What harm does never shutting down cause?

If the computer is left on just to keep QQ online or a browser open, it's not too bad; but if it's left running a game, the host and the hard disk stay under constant load, which takes a serious toll on their lifespan!


Second, a computer that never shuts down also tends to run hot. Electronic components are especially sensitive to temperature, so slowdowns and component wear become unavoidable. Writing this, I can almost smell renminbi burning……

A face full of sorrow.


But shutting down and rebooting every day is such a hassle! If I'm only stepping away for a few hours, must I really shut down? Not at all: a computer has more states than just "on" and "off"~


02

I just don't want to shut down; do I have other options?


So which mode should the computer use in which usage scenario?


1. Standby

"Standby" saves the current state to RAM and then exits the system. Power consumption drops, while the CPU, RAM, and hard disk keep running at a bare minimum. Pressing the computer's power button reactivates the system, which quickly reloads the pre-standby state from RAM. This is the fastest way to get going again, but the system never truly shuts down, so it suits brief absences.



2. Hibernate and Sleep


"Hibernate" and "Sleep" sound nearly identical, which often confuses people about how the two states actually differ. Their pros and cons are as follows:


"Sleep" is an energy-saving state provided by Windows. When you need to step away from the computer, you can choose this command from the shutdown menu. The system saves the data it is working on to RAM and cuts power to every device except RAM. When you return, moving the mouse or pressing any key wakes the system and restores power to the other devices; within a few seconds the computer is back in the state you left it in. The hallmark of this state is a fast wake-up (usually just a few seconds), but RAM must stay powered, or the data is lost.


"Hibernate" is a power-saving state Microsoft designed for laptops, which is why it does not appear in a desktop's shutdown menu. It writes all the data in RAM to a dedicated area of the hard disk and then cuts power entirely. When you later press the power button, the memory image stored on disk is loaded back into RAM, returning you to the state you left. So this state can restore everything even after a full power-off, but resuming takes longer and it needs a fair amount of disk space.


3. Hybrid Sleep

"Hybrid Sleep" is a sleep-plus-hibernate scheme introduced for desktop computers.


The technique runs sleep and hibernate together: the state is first written to the hiberfil.sys file on disk, and then the machine sleeps. If external power is never cut, it behaves like sleep; if power is cut, the next boot reads hiberfil.sys to restore the state. It combines the strengths of both: unlike sleep it survives a power loss, and unlike hibernate it resumes faster as long as external power stays on. You can adjust this under Advanced power settings by turning on "Allow hybrid sleep" (screenshot omitted).
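
As a side note (my own addition, not from the original post): on Windows, the built-in powercfg tool can report which of these states your machine supports and enable hibernation from the command line. A minimal sketch, wrapped in Python; run it from an elevated prompt for the toggle to take effect:

import subprocess

# List the sleep states (Standby, Hibernate, Hybrid Sleep, ...) this
# machine supports.
print(subprocess.run(["powercfg", "/a"], capture_output=True, text=True).stdout)

# Enable hibernation, a prerequisite for Hybrid Sleep (needs admin rights).
subprocess.run(["powercfg", "/hibernate", "on"], check=True)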


4. Store it properly if it will sit unused

If a computer goes unused for a long time and is simply shelved, it may well not work when you dig it out again: a dead display, a dust-filled keyboard, an aged battery, cosmetic damage……any one of these is enough to ruin your day.


How to avoid that? See: "How should you store a computer that won't be used for a long time?"


Hopefully these practical tips help us take care of our computers, extend their lifespan, and cut repair costs~


Of course, if like this editor you love the "shut down, clock out" style of use, or your machine has already developed odd noises, slowdowns, sluggish response, or unexpected blue screens, you can also shop online for Lenovo software services, where Lenovo device experts offer one-on-one consultation to help fix your computer.

Reference

https://mp.weixin.qq.com/s/pYPtMrmtdh454VEEcXIG9A


杰哥 2019-09-07 11:09

[zz] Learning English by reading China Daily (2019-06-27)

Many people hold a bias against China Daily: its articles are written by Chinese authors, the thinking goes, so the English can't be idiomatic. That bias doesn't hold up at all: China Daily's reporters are seasoned English writers whose skill far exceeds that of the vast majority of readers, and before publication the articles are also polished and reviewed by foreign experts.

Reference: https://zhuanlan.zhihu.com/p/49847636

杰哥 2019-06-27 01:41

Jensen-Shannon divergence (2019-06-20)

https://www.mathworks.com/matlabcentral/fileexchange/20689-jensen-shannon-divergence
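
For readers without MATLAB, here is a minimal Python sketch of the same quantity (my own example, not the File Exchange code): JSD(p, q) = 0.5*KL(p||m) + 0.5*KL(q||m), where m = (p + q)/2.

import numpy as np

def js_divergence(p, q, eps=1e-12):
    # Normalize both discrete distributions, form the mixture m, and
    # average the two KL divergences against it.
    p = np.asarray(p, float) / np.sum(p)
    q = np.asarray(q, float) / np.sum(q)
    m = 0.5 * (p + q)
    kl_pm = np.sum(p * np.log((p + eps) / (m + eps)))
    kl_qm = np.sum(q * np.log((q + eps) / (m + eps)))
    return 0.5 * kl_pm + 0.5 * kl_qm

print(js_divergence([0.5, 0.5], [0.9, 0.1]))  # symmetric, bounded by log 2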

杰哥 2019-06-20 01:44

[zz] Why the low frequencies of an image are its contours, and the high frequencies are noise and detail (2019-06-19)

The frequency of an image is a measure of how sharply its gray levels change: the gradient of the gray values over the image plane.

(1) What is low frequency?
Low frequency means the color, i.e. the gray level, changes slowly, marking a region of smooth, continuous variation; that part is the low frequency. For an image, whatever remains after removing the high frequencies is the low frequency, i.e. the content inside the edges. That content makes up most of the image's information: its overall appearance and contours, the approximate information of the image.

(2) What is high frequency?
Conversely, high frequency means rapid change. When do an image's gray levels change fast? When adjacent regions differ greatly in gray level; that is fast change. In an image, the boundary between an object and the background usually shows an obvious difference, meaning the gray level changes quickly along that edge line; that is where the frequency of change is high. So rapidly changing gray values at image edges correspond to high frequency, i.e. the high frequencies render the image's edges. An image's fine details are likewise regions where the gray level changes sharply; it is precisely this sharp change that produces detail.
Noise (i.e. noisy pixels) behaves the same way: a pixel is a noise point because its color differs from the normal points around it, that is, its gray value is clearly different, a rapid change in gray level. It therefore belongs to the high frequencies, hence the saying that noise lives in the high frequencies.

At root, this is simply how we humans recognize objects. If you wear red clothes and pose in front of a red backdrop, can you be distinguished clearly? No, because the clothes merge with the background: there is no change, so nothing stands out, unless light falls on the person from some angle, creating highlights and shadows at the edges. Then we can see some contour lines, and those lines are exactly where the color (i.e. gray level) differs sharply.
---------------------
Author: charlene_bo
Source: CSDN
Original: https://blog.csdn.net/charlene_bo/article/details/70877999
Copyright notice: this is the blogger's original article; please attach a link to the original when reposting!
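
To make the article above concrete, here is a minimal numpy sketch (my own illustration, not from the original post): keeping only the low frequencies of an image preserves its rough contours, while the high-frequency remainder carries edges, detail, and noise.

import numpy as np

img = np.random.rand(128, 128)            # stand-in for any grayscale image
spec = np.fft.fftshift(np.fft.fft2(img))  # spectrum, DC moved to the centre

# Circular low-pass mask around the centre of the spectrum.
h, w = img.shape
y, x = np.ogrid[:h, :w]
mask = (y - h // 2) ** 2 + (x - w // 2) ** 2 <= 16 ** 2

low = np.fft.ifft2(np.fft.ifftshift(spec * mask)).real  # smooth outline
high = img - low                                        # edges, detail, noise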

杰哥 2019-06-19 01:01

Use QQ, not WeChat, to transfer files (2019-06-19)

Many times, sending files over desktop WeChat has frozen or blue-screened my computer; sending the same files over QQ has never been a problem. https://www.zhihu.com/question/265609875

杰哥 2019-06-19 00:01

How to write script letters A, B, C in a formula editor (2019-05-29)

https://zhidao.baidu.com/question/98221541.html

 

The formula editor I use is MathType. In the editor, open the "Edit" menu, which has an "Insert Symbol (S)" command; in the dialog, change "Font" to "Euclid Math One", and there you will find the script letters A, B, C. Select one and click the "Embed" button and you're done. I'm not sure whether "Microsoft Equation 3.0" works the same way. For reference only.
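
If you typeset in LaTeX rather than MathType, the same script capitals are available directly; a minimal sketch (\mathcal is built in, \mathscr comes from the mathrsfs package):

\documentclass{article}
\usepackage{mathrsfs}  % provides \mathscr
\begin{document}
Calligraphic capitals: $\mathcal{A}, \mathcal{B}, \mathcal{C}$;
script capitals: $\mathscr{A}, \mathscr{B}, \mathscr{C}$.
\end{document}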



杰哥 2019-05-29 09:33

Google Scholar impact rankings (2019 Scholar Metrics) (2019-05-17)

The 2019 Google Scholar Top Publications list has been updated: CVPR has risen to second place across all of Engineering & Computer Science. In artificial intelligence, NeurIPS, ICLR, and ICML lead the pack. The natural language processing list is unchanged: still ACL, EMNLP, and NAACL.
Reference:
https://scholar.google.com/citations?view_op=top_venues&hl=en&vq=eng
Weibo of Weilian Wang on July 20, 2019. See my favourite on July 23, 2019. 

https://m.sohu.com/a/245182179_473283/?pvid=000115_3w_a
 

The 2018 Google Scholar impact rankings are out: CVPR enters the top 20, and ResNet is the most-cited paper, with over 10,000 citations!


Source: scholar.google.com

Author: 闻菲

[新智元 digest] Google Scholar yesterday published its latest 2018 impact rankings of academic journals and conferences, with CVPR and NIPS ranked 20th and 54th respectively. In first-ranked Nature, the most-cited paper of the past 5 years is "Deep Learning", written by the three deep-learning giants Hinton, LeCun, and Bengio; in CVPR, the most-cited paper is ResNet, with over 10,000 citations.

Yesterday, Google Scholar released its latest 2018 impact rankings of academic journals and conferences. In the overall list, unsurprisingly, Nature is first and Science third; but notably, the flagship computer vision conference CVPR ranks 20th, and NIPS, another top AI conference, ranks 54th, both big jumps from last year.

Within first-ranked Nature, the most-cited paper of the past 5 years is likewise "Deep Learning", co-authored by the "three giants of deep learning": Hinton, LeCun, and Bengio.

Not only that: within CVPR, the most-cited paper of the past 5 years is ResNet, written by Jian Sun, Kaiming He, Xiangyu Zhang, and Shaoqing Ren, then all at Microsoft Research Asia; its citation count has already passed 10,000.

2018 Google Scholar journal and conference impact rankings: CVPR 20th, NIPS 54th

First, the results across all fields.

Nature and Science, the ones people care about most, are first and third; the famous medical journals The New England Journal of Medicine and The Lancet are second and fourth. Cell, which in China is habitually grouped with Nature and Science under the label "CNS", ranks 6th this time.

Next come the AI-related journals and conferences that 新智元 readers care more about. This time, the computer vision flagship CVPR lives up to expectations at 20th, so a computer science conference has finally entered the Top 20.

 

Meanwhile NIPS, another closely watched AI conference, also makes the overall list at 54th, a strong showing.

Nature Neuroscience, of interest to neuroscience, ranks 44th.

 

As for the journals ranked 21st to 40th, AI-related papers in fact appear in them regularly too; the rankings (screenshot omitted) are worth a look.

Worth noting: PLoS ONE sits at 23rd and Scientific Reports at 39th, both decent publication venues.

 

Among ranks 61 to 80, IEEE journals appear in a cluster. ICCV, regarded as the other top computer vision conference, ranks 78th.

 

The journals/conferences ranked 81st to 100th are as follows (screenshot omitted): TPAMI sits at 92nd; clearly the good papers go to conferences first.

 

Engineering & Computer Science Top 20: CVPR ranks 5th

 

Google Scholar Metrics methodology: the "h5-index" over papers from the past 5 years

Google Scholar's journal and conference rankings are based mainly on the h-index. In fact, since 2012, Google Scholar Metrics (GSM) has published GSM rankings of academic journals and conferences every year.

Compared with the Journal Citation Report (JCR) that Clarivate Analytics publishes from the Web of Science database, GSM is not only free to search but also covers a far wider range of journals and conferences than Web of Science.

One more point: a journal's or conference's "h5-index" (its h-index over the past 5 years) is fairly hard to game. It does not jump visibly because of one extra ultra-highly-cited paper, and deliberately publishing fewer papers does nothing to raise it either.

The h5-index therefore reflects the overall strength of a journal or conference, and has gradually become an important reference for evaluating the impact of academic publications and conferences.

Overall, GSM relies mainly on the following 3 indicators:

Correspondingly, the h5-index, h5-core, and h5-median are computed from the papers a journal or conference indexed in Google Scholar published in the most recent 5 years and the citation counts of those papers.

For example, if at least h of the papers a journal published in the past 5 years have each been cited at least h times, then the journal's h5-index is h. The h5-core is that set of top-cited papers, and the h5-median is the median citation count within that core.
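
As a sanity check of that definition, a minimal Python sketch (hypothetical citation counts) that computes an h5-style index:

def h5_index(citations):
    # Largest h such that at least h papers have at least h citations each.
    counts = sorted(citations, reverse=True)
    h = 0
    for i, c in enumerate(counts, start=1):
        if c >= i:
            h = i
        else:
            break
    return h

print(h5_index([17, 9, 6, 3, 2, 1]))  # -> 3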

Learn more:

https://scholar.google.com/citations?view_op=top_venues&hl=zh-CN&vq=en




杰哥 2019-05-17 22:59

Uploading a LaTeX article to arXiv (2019-04-07)
Per a flagged email of 2019-04-05, I uploaded two papers to arXiv. I first compressed them into a zip file and used Upload file, and got the following error:
contained a.bib file, but no a.bbl file (include a.bbl, or submit without a.bib; and remember to verify references).
In the uploaded files, just delete a.bib. Do not delete it from your original files, or you may lose it, since you will sometimes still need to compile locally. https://arxiv.org/help/submit#availability


[zz] How to upload a LaTeX article to arXiv, and the pitfalls
If you want to post an article on arXiv, you can usually post it in either PDF or LaTeX form; but if the PDF was generated from LaTeX, generally you can only upload the LaTeX source, and uploading the PDF is not supported.
Uploading LaTeX to arXiv involves the following steps; the file-upload and online-compile step in particular has some pitfalls to watch for.
Step 1: register an account, using an email with a university suffix to avoid a possible review of upload permissions.
Steps 2 through 6: fill in some basic information and settings; reference screenshots omitted.
Create a new submission:
Fill in the information.
Step 7: upload the files. This is the important one; it decides whether compilation succeeds.
The folder produced by a local LaTeX compile usually looks like this (screenshot omitted):
There is a pile of stuff inside, but three things matter: the source file (the .tex file), the bbl file with the same base name (the .bbl file), and the various figures used in the article, including jpg, pdf, and other image files.
Another point worth noting: image files cannot be uploaded as a folder, only one file at a time. For example, the images in the figures folder above must be uploaded one by one; if there are nested folders, keep opening them and uploading. After everything is uploaded it looks like this (screenshot omitted):
Note that when the online compile system builds the .tex file, every image path referenced in the .tex must be a top-level path, because the images now sit at the top level. Locally, however, the image paths in the .tex usually carry one or more directory levels for convenience; for instance, an image path in the .tex would have at least a "figures/xxx.jpg" prefix. If you upload the local .tex without changing the paths, the online compile cannot find the figures folder and fails. So every image reference in the local .tex must be changed to a single-level reference, i.e. just the image name, such as xxx.jpg.
For comparison, the local .tex might look like this (screenshot omitted):
Since there is no figures folder online, the corresponding image directory must be removed, like this (screenshot omitted):
Every image reference in the original .tex must be converted to a single-level reference before the online compile will pass.
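
That rewrite can be automated. A minimal sketch (hypothetical file names, assuming the usual \includegraphics syntax) that flattens "figures/xxx.jpg"-style paths to "xxx.jpg":

import re

with open("main.tex", encoding="utf-8") as f:
    tex = f.read()

# \includegraphics[...]{figures/sub/foo.pdf} -> \includegraphics[...]{foo.pdf}
flat = re.sub(r"(\\includegraphics(?:\[[^\]]*\])?\{)(?:[^{}/]+/)+([^{}]+\})",
              r"\1\2", tex)

with open("main_arxiv.tex", "w", encoding="utf-8") as f:
    f.write(flat)
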
Once it compiles, fill in the basic output information, such as title, author, abstract, comments, and so on, then submit. Before submitting you can preview the generated pdf; if the online-compiled format looks as expected, you can publish.
--------------------- 
Author: 我i
Source: CSDN
Original: https://blog.csdn.net/on2way/article/details/85940768
Copyright notice: this is the blogger's original article; please attach a link to the original when reposting!

杰哥 2019-04-07 08:49

[zz] How to Train a GAN? Tips and tricks to make GANs work (2019-04-02)

While research in Generative Adversarial Networks (GANs) continues to improve the fundamental stability of these models, we use a bunch of tricks to train them and make them stable day to day.

Here is a summary of some of the tricks.

The authors of this document are listed at the end.

If you find a trick that is particularly useful in practice, please open a Pull Request to add it to the document. If we find it to be reasonable and verified, we will merge it in.

1. Normalize the inputs

  • normalize the images between -1 and 1
  • Tanh as the last layer of the generator output (see the sketch below)
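
A minimal sketch of this trick (assuming PyTorch; the names are my own): map uint8 images from [0, 255] to [-1, 1] so they match a generator whose last layer is tanh.

import torch

def normalize_images(batch_uint8):
    # (N, C, H, W) uint8 in [0, 255] -> float in [-1, 1]
    return batch_uint8.float() / 127.5 - 1.0

x = torch.randint(0, 256, (4, 3, 64, 64), dtype=torch.uint8)
y = normalize_images(x)
print(y.min().item(), y.max().item())  # close to -1.0 and 1.0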

2: A modified loss function

In GAN papers, the loss function to optimize G is min log(1 - D), but in practice folks use max log D

  • because the first formulation has vanishing gradients early on
  • Goodfellow et. al (2014)

In practice, works well:

  • Flip labels when training generator: real = fake, fake = real (see the sketch below)
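
A minimal sketch of trick 2 (assuming PyTorch; the names are my own): instead of minimizing log(1 - D(G(z))), train G against "real" labels, which maximizes log D(G(z)).

import torch
import torch.nn.functional as F

d_out_fake = torch.rand(8, 1)              # stand-in for D(G(z)) probabilities
real_labels = torch.ones_like(d_out_fake)  # pretend the fakes are real

# Non-saturating generator loss: -log D(G(z))
g_loss = F.binary_cross_entropy(d_out_fake, real_labels)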

3: Use a spherical Z

  • Don't sample from a uniform distribution


  • Sample from a Gaussian distribution


4: BatchNorm

  • Construct different mini-batches for real and fake, i.e. each mini-batch needs to contain only all real images or all generated images.
  • when batchnorm is not an option, use instance normalization (for each sample, subtract mean and divide by standard deviation).


5: Avoid Sparse Gradients: ReLU, MaxPool

  • the stability of the GAN game suffers if you have sparse gradients
  • LeakyReLU = good (in both G and D)
  • For Downsampling, use: Average Pooling, Conv2d + stride
  • For Upsampling, use: PixelShuffle, ConvTranspose2d + stride

6: Use Soft and Noisy Labels

  • Label Smoothing, i.e. if you have two target labels: Real=1 and Fake=0, then for each incoming sample, if it is real, then replace the label with a random number between 0.7 and 1.2, and if it is a fake sample, replace it with a random number between 0.0 and 0.3 (for example).
    • Salimans et. al. 2016
  • make the labels noisy for the discriminator: occasionally flip the labels when training the discriminator (see the sketch below)
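
A minimal sketch of both ideas (assuming PyTorch; the helper names are my own): smoothed targets plus occasional label flips for the discriminator.

import torch

def soft_labels(batch_size, real=True):
    if real:
        return 0.7 + 0.5 * torch.rand(batch_size)  # in [0.7, 1.2) instead of 1
    return 0.3 * torch.rand(batch_size)            # in [0.0, 0.3) instead of 0

def maybe_flip(labels, p=0.05):
    # Occasionally flip labels so the discriminator sees noisy targets.
    flip = torch.rand_like(labels) < p
    return torch.where(flip, 1.0 - labels, labels)

d_targets = maybe_flip(soft_labels(8, real=True))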

7: DCGAN / Hybrid Models

  • Use DCGAN when you can. It works!
  • if you can't use DCGANs and no model is stable, use a hybrid model: KL + GAN or VAE + GAN

8: Use stability tricks from RL

  • Experience Replay
    • Keep a replay buffer of past generations and occasionally show them
    • Keep checkpoints from the past of G and D and occasionally swap them out for a few iterations
  • All stability tricks that work for deep deterministic policy gradients
  • See Pfau & Vinyals (2016)

9: Use the ADAM Optimizer

  • optim.Adam rules!
    • See Radford et. al. 2015
  • Use SGD for discriminator and ADAM for generator

10: Track failures early

  • D loss goes to 0: failure mode
  • check norms of gradients: if they are over 100 things are screwing up
  • when things are working, D loss has low variance and goes down over time vs having huge variance and spiking
  • if loss of generator steadily decreases, then it's fooling D with garbage (says martin)

11: Don't balance loss via statistics (unless you have a good reason to)

  • Don't try to find a (number of G / number of D) schedule to uncollapse training
  • It's hard and we've all tried it.
  • If you do try it, have a principled approach to it, rather than intuition

For example

while lossD > A:
    train D
while lossG > B:
    train G

12: If you have labels, use them

  • if you have labels available, training the discriminator to also classify the samples: auxiliary GANs

13: Add noise to inputs, decay over time

14: [notsure] Train discriminator more (sometimes)

  • especially when you have noise
  • hard to find a schedule of number of D iterations vs G iterations

15: [notsure] Batch Discrimination

  • Mixed results

16: Discrete variables in Conditional GANs

  • Use an Embedding layer
  • Add as additional channels to images
  • Keep embedding dimensionality low and upsample to match image channel size

17: Use Dropouts in G in both train and test phase

Authors

  • Soumith Chintala
  • Emily Denton
  • Martin Arjovsky
  • Michael Mathieu
Reference:
https://github.com/soumith/ganhacks#authors



A few small tricks for GANs

Training GANs recently, I've hit a lot of pitfalls; GAN training really is a vexing problem. If you're only running applications from other people's papers it's fine, but if you design new architectures and do new research, you need to know these tricks. So many tears~

This doc, soumith/ganhacks, is practically the Nine Yin Manual of the GAN world; after reading it I feel I've gone up a level.

Some notes for myself:

1. Normalize the input to [-1,1]. The generator's output uses tanh, which is also [-1,1], so the two match up.

2. Papers optimize G with min log(1 - D), but in actual training you can use max log(D).

3. For the noise z, don't use a uniform distribution; use a Gaussian.

4. You can use instance norm in place of batch norm. Also, batch the real samples together and the generated samples together (this one feels like stating the obvious, QAQ).

5. Avoid sparse gradients: ReLU, MaxPool, and the like. My take on the reason: unlike a discriminative network, which can extract the important information while ignoring details that barely affect the prediction, a GAN is a generative model and must render as much detail as possible, so avoid these sparse ops:

  • LeakyReLU
  • For Downsampling, use: Average Pooling, Conv2d + stride
  • For Upsampling, use: PixelShuffle, ConvTranspose2d + stride

6. You can turn labels of 1 (real) into 0.7~1.2 and labels of 0 into 0~0.3. Worth mulling over.

7. Use DCGAN when you can; if you can't, use a hybrid model, such as KL+GAN or VAE+GAN.

8. Borrow training tricks from RL:

  • Keep a replay buffer of past generations and occasionally show them
  • Keep checkpoints from the past of G and D and occasionally swap them out for a few iterations

9. Use ADAM; or, D can use SGD while G uses ADAM.

10. Watch the training process and catch failure early, so you don't discover it only after training for ages and wasting time.

11. Better not to try balancing the training of G and D with hand-set constants. (They say this is hard to make work. I think it's still worth a try if you have time.)

12. If you have labels for the real data, use them: AC-GAN. Adding label information lowers the difficulty of generation, which should make intuitive sense.

13. Add noise. The purpose is to improve the diversity of the generated content?

  • Add some artificial noise to inputs to D (Arjovsky et. al., Huszar, 2016)
  • adding gaussian noise to every layer of generator (Zhao et. al. EBGAN)

14. [not sure] Train D more, especially when adding noise.

15. [not sure] Batch discrimination; it feels somewhat similar to the patchGAN in pix2pix?

16. CGAN: I've always felt the CGAN approach is the one that matches how humans learn. The original GAN is too brute-force, as if knowing nothing, with D and G hashing out a method between themselves; everything it produces is work nobody has done before, pioneering work, so it's harder. CGAN, by contrast, starts from some givens, i.e. accumulated technique, so it's easier. A bit like research, where the big names dig the pits and open new directions (GAN), and the smaller ones fill them in (CGAN)?

17. Use dropout (50%) in several layers of G. There's a paper on this that I haven't read yet.

Having read all this, I feel that if I want to design a GAN I now have a systematic picture, without that uneasy sense that there are important things I still don't know. For someone with OCD like me that feeling is really unpleasant!! After reading I instantly felt much better~~~

https://zhuanlan.zhihu.com/p/27725664

杰哥 2019-04-02 05:42

logits (2019-03-27)

What is the meaning of the word logits in TensorFlow?
In the following TensorFlow function, we must feed the activation of artificial neurons in the final layer. That I understand. But I don't understand why it is called logits? Isn't that a mathematical function?
loss_function = tf.nn.softmax_cross_entropy_with_logits(
     logits = last_layer,
     labels = target_output
)

For example, in the last layer of the discriminator of a generative adversarial network (GAN), we use sigmoid(logits) to get the output of D. This was discussed with Zhengxia.
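
A minimal sketch of the idea (assuming TensorFlow 1.x, to match the snippet above): the logits are the raw pre-activation outputs of the last layer, and the loss applies the squashing function internally for numerical stability.

import tensorflow as tf

logits = tf.constant([[2.0, -1.0, 0.3]])   # raw last-layer outputs
labels = tf.constant([[1.0, 0.0, 0.0]])

# Equivalent to softmax followed by cross-entropy, but numerically stabler:
loss = tf.nn.softmax_cross_entropy_with_logits(logits=logits, labels=labels)

# For a GAN discriminator, D's probability output is sigmoid(logits):
d_prob = tf.sigmoid(logits)
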
Reference:
https://stackoverflow.com/questions/41455101/what-is-the-meaning-of-the-word-logits-in-tensorflow


杰哥 2019-03-27 05:01