Scala Access Modifiers
2014-01-21 20:35
Scala's access modifiers work much like Java's; the main difference is that Scala adds scope-of-protection qualifiers.
Scope of protection
Access modifiers in Scala can be augmented with qualifiers. A modifier of the form private[X] or protected[X] means that access is private or protected "up to" X, where X designates some enclosing package, class, or singleton object. Consider the following example:
package society {
  package professional {
    class Executive {
      private[professional] var workDetails = null
      private[society] var friends = null
      private[this] var secrets = null

      def help(another: Executive) {
        println(another.workDetails)
        println(another.secrets) // ERROR
      }
    }
  }
}
Note the following points from the above example:
Variable workDetails will be accessible to any class within the enclosing package professional.
Variable friends will be accessible to any class within the enclosing package society.
Variable secrets will be accessible only on the same instance within instance methods (this), which is why accessing another.secrets is an error.
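To make the difference between plain private and object-private private[this] concrete, here is a minimal sketch (the Counter class and its members are illustrative, not from the original article). A plain private member is visible from any instance of the same class, while a private[this] member is visible only through this:

```scala
class Counter {
  private var total = 0        // class-private: any Counter can read it
  private[this] var local = 0  // object-private: only this instance can read it

  def addFrom(other: Counter): Unit = {
    total += other.total       // compiles: private allows access across instances
    // local += other.local    // would NOT compile: private[this] forbids other.local
  }
}
```

This distinction also matters for variance checking: the compiler exempts private[this] members from some variance rules precisely because they can never leak to another instance.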
Scope-of-protection qualifiers appear throughout the Spark codebase, for example in the following snippet:
// Create the Spark execution environment (cache, map output tracker, etc)
private[spark] val env = SparkEnv.create(
  conf,
  "<driver>",
  conf.get("spark.driver.host"),
  conf.get("spark.driver.port").toInt,
  isDriver = true,
  isLocal = isLocal)
SparkEnv.set(env)

// Used to store a URL for each static file/jar together with the file's local timestamp
private[spark] val addedFiles = HashMap[String, Long]()
private[spark] val addedJars = HashMap[String, Long]()

// Keeps track of all persisted RDDs
private[spark] val persistentRdds = new TimeStampedHashMap[Int, RDD[_]]
private[spark] val metadataCleaner = new MetadataCleaner(MetadataCleanerType.SPARK_CONTEXT, this.cleanup, conf)

// Initialize the Spark UI
private[spark] val ui = new SparkUI(this)