Data clustering: 50 years beyond k-means (Translation, Part 8)

2019-03-22 10:25

[99] Newman, M., Girvan, M., 2004. Finding and evaluating community structure in networks. Phys. Rev. E 69, 026113.

[100] Ng, Andrew Y., Jordan, Michael I., Weiss, Yair, 2001. On spectral clustering: Analysis and an algorithm. Adv. Neural Inform. Process. Systems, vol. 14. MIT Press, pp. 849–856.

[101] Pampalk, Elias, Dixon, Simon, Widmer, Gerhard, 2003. On the evaluation of perceptual similarity measures for music. In: Proc. Sixth Internat. Conf. on Digital Audio Effects (DAFx-03), pp. 7–12.

[102] Pavan, Massimiliano, Pelillo, Marcello, 2007. Dominant sets and pairwise clustering. IEEE Trans. Pattern Anal. Machine Intell. 29 (1), 167–172.

[103] Pelleg, Dan, Moore, Andrew, 1999. Accelerating exact k-means algorithms with geometric reasoning. In: Chaudhuri, Surajit, Madigan, David (Eds.), Proc. Fifth Internat. Conf. on Knowledge Discovery in Databases. AAAI Press, pp. 277–281.

[104] Pelleg, Dan, Moore, Andrew, 2000. X-means: Extending k-means with efficient estimation of the number of clusters. In: Proc. Seventeenth Internat. Conf. on Machine Learning, pp. 727–734.

[105] Philbin, J., Chum, O., Isard, M., Sivic, J., Zisserman, A., 2007. Object retrieval with large vocabularies and fast spatial matching. In: Proc. IEEE Conf. on Computer Vision and Pattern Recognition.

[106] Rasmussen, Carl, 2000. The infinite Gaussian mixture model. Adv. Neural Inform. Process. Systems 12, 554–560.

[107] Roberts, Stephen J., Holmes, Christopher, Denison, Dave, 2001. Minimum-entropy data clustering using reversible jump Markov chain Monte Carlo. In: Proc. Internat. Conf. Artificial Neural Networks, pp. 103–110.

[108] Sahami, Mehran, 1998. Using Machine Learning to Improve Information Access. Ph.D. Thesis, Computer Science Department, Stanford University.

Sammon Jr., J.W., 1969. A nonlinear mapping for data structure analysis. IEEE Trans. Comput. 18, 401–409.

[109] Schölkopf, Bernhard, Smola, Alexander, Müller, Klaus-Robert, 1998. Nonlinear component analysis as a kernel eigenvalue problem. Neural Comput. 10 (5), 1299–1319.

[110] Shamir, Ohad, Tishby, Naftali, 2008. Cluster stability for finite samples. Adv. Neural Inform. Process. Systems 20, 1297–1304.

[111] Shi, Jianbo, Malik, Jitendra, 2000. Normalized cuts and image segmentation. IEEE Trans. Pattern Anal. Machine Intell. 22, 888–905.

[112] Sindhwani, V., Hu, J., Mojsilovic, A., 2008. Regularized co-clustering with dual supervision. In: Advances in Neural Information Processing Systems.

[113] Slonim, Noam, Tishby, Naftali, 2000. Document clustering using word clusters via the information bottleneck method. In: ACM SIGIR 2000, pp. 208–215.

[114] Smith, Stephen P., Jain, Anil K., 1984. Testing for uniformity in multidimensional data. IEEE Trans. Pattern Anal. Machine Intell. 6 (1), 73–81.

[115] Sokal, Robert R., Sneath, Peter H.A., 1963. Principles of Numerical Taxonomy. W.H. Freeman, San Francisco.

[116] Steinbach, M., Karypis, G., Kumar, V., 2000. A comparison of document clustering techniques. In: KDD Workshop on Text Mining.

[117] Steinbach, Michael, Tan, Pang-Ning, Kumar, Vipin, Klooster, Steve, Potter, Christopher, 2003. Discovery of climate indices using clustering. In: Proc. Ninth ACM SIGKDD Internat. Conf. on Knowledge Discovery and Data Mining.

[118] Steinhaus, H., 1956. Sur la division des corps matériels en parties. Bull. Acad. Polon. Sci. IV (C1.III), 801–804.

[119] Strehl, Alexander, Ghosh, Joydeep, 2003. Cluster ensembles: A knowledge reuse framework for combining multiple partitions. J. Machine Learn. Res. 3, 583–617.

[120] Tabachnick, B.G., Fidell, L.S., 2007. Using Multivariate Statistics, fifth ed. Allyn and Bacon, Boston.

[121] Tan, Pang-Ning, Steinbach, Michael, Kumar, Vipin, 2005. Introduction to Data Mining, first ed. Addison-Wesley Longman Publishing Co. Inc., Boston, MA, USA.

[122] Taskar, B., Segal, E., Koller, D., 2001. Probabilistic clustering in relational data. In: Proc. Seventeenth Internat. Joint Conf. on Artificial Intelligence (IJCAI), pp. 870–887.

[123] Tibshirani, R., Walther, G., Hastie, T., 2001. Estimating the number of clusters in a data set via the gap statistic. J. Roy. Statist. Soc. B, 411–423.

[124] Tishby, Naftali, Pereira, Fernando C., Bialek, William, 1999. The information bottleneck method. In: Proc. 37th Allerton Conf. on Communication, Control and Computing, pp. 368–377.

[125] Tsuda, Koji, Kudo, Taku, 2006. Clustering graphs by weighted substructure mining. In: Proc. 23rd Internat. Conf. on Machine Learning, pp. 953–960.

[126] Tukey, John Wilder, 1977. Exploratory Data Analysis. Addison-Wesley.

[127] Umeyama, S., 1988. An eigendecomposition approach to weighted graph matching problems. IEEE Trans. Pattern Anal. Machine Intell. 10 (5), 695–703.

[128] von Luxburg, U., Ben-David, S., 2005. Towards a statistical theory of clustering. In: PASCAL Workshop on Statistics and Optimization of Clustering.

[129] Wallace, C.S., Boulton, D.M., 1968. An information measure for classification. Comput. J. 11, 185–195.

[130] Wallace, C.S., Freeman, P.R., 1987. Estimation and inference by compact coding (with discussions). J. Roy. Statist. Soc. B 49, 240–251.

[131] Wasserman, S., Faust, K., 1994. Social Network Analysis: Methods and Applications. Cambridge University Press.

[132] Welling, M., Rosen-Zvi, M., Hinton, G., 2005. Exponential family harmoniums with an application to information retrieval. Adv. Neural Inform. Process. Systems 17, 1481–1488.

[133] White, Scott, Smyth, Padhraic, 2005. A spectral clustering approach to finding communities in graphs. In: Proc. SIAM Data Mining.

[134] Yu, Stella X., Shi, Jianbo, 2003. Multiclass spectral clustering. In: Proc. Internat. Conf. on Computer Vision, pp. 313–319.

[135] Zhang, Tian, Ramakrishnan, Raghu, Livny, Miron, 1996. BIRCH: An efficient data clustering method for very large databases. In: Proc. 1996 ACM SIGMOD Internat. Conf. on Management of Data, vol. 25, pp. 103–114.

