Allow named tuple syntax for case class constructors. #22400

Draft · wants to merge 16 commits into main

15 changes: 14 additions & 1 deletion compiler/src/dotty/tools/dotc/ast/Desugar.scala
@@ -1717,7 +1717,20 @@ object desugar {
cpy.Tuple(tree)(elemValues)
val names = elems.collect:
case NamedArg(name, arg) => name
if names.isEmpty || ctx.mode.is(Mode.Pattern) then
val targetClassType = pt.underlyingClassRef(refinementOK = true)
val targetClass = targetClassType.typeSymbol
val isCaseClassConstr =
ctx.mode.isExpr
&& targetClass.is(Case)
&& !defn.isTupleClass(targetClass)
&& targetClass != defn.TupleXXLClass
&& names.hasSameLengthAs(elems) // true iff all arguments are named
if isCaseClassConstr then
import tpd.TreeOps
val companion = targetClass.companionModule.termRef
.withPrefix(targetClassType.asInstanceOf[TypeRef].prefix).asInstanceOf[TermRef]
Apply(TypedSplice(tpd.ref(companion).select(nme.apply)), elems).withSpan(tree.span)
else if names.isEmpty || ctx.mode.is(Mode.Pattern) then
tup
else
def namesTuple = withModeBits(ctx.mode &~ Mode.Pattern | Mode.Type):
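
For illustration, a hedged sketch of what the new desugaring branch above enables (the `Person` class and values are made up for this example): when every argument is named and the expected type is a case class, the tuple-like expression is rewritten to a call of the companion's `apply`.

```scala
// Illustrative only; assumes the named tuple syntax available at this source level.
case class Person(name: String, age: Int)

// With expected type `Person`, the expression below is desugared to
// `Person.apply(name = "Ann", age = 31)` by the new branch in `desugar`.
val p: Person = (name = "Ann", age = 31)
```
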
1 change: 1 addition & 0 deletions compiler/src/dotty/tools/dotc/config/Feature.scala
@@ -39,6 +39,7 @@ object Feature:
val betterMatchTypeExtractors = experimental("betterMatchTypeExtractors")
val quotedPatternsWithPolymorphicFunctions = experimental("quotedPatternsWithPolymorphicFunctions")
val betterFors = experimental("betterFors")
val collectionLiterals = experimental("collectionLiterals")

def experimentalAutoEnableFeatures(using Context): List[TermName] =
defn.languageExperimentalFeatures
4 changes: 3 additions & 1 deletion compiler/src/dotty/tools/dotc/core/Definitions.scala
@@ -563,7 +563,7 @@ class Definitions {
@tu lazy val Seq_lengthCompare: Symbol = SeqClass.requiredMethod(nme.lengthCompare, List(IntType))
@tu lazy val Seq_length : Symbol = SeqClass.requiredMethod(nme.length)
@tu lazy val Seq_toSeq : Symbol = SeqClass.requiredMethod(nme.toSeq)

@tu lazy val MapModule: Symbol = requiredModule("scala.collection.immutable.Map")

@tu lazy val StringOps: Symbol = requiredClass("scala.collection.StringOps")
@tu lazy val StringOps_format: Symbol = StringOps.requiredMethod(nme.format)
@@ -582,6 +582,8 @@
@tu lazy val IArrayModule: Symbol = requiredModule("scala.IArray")
def IArrayModuleClass: Symbol = IArrayModule.moduleClass

@tu lazy val ExpressibleAsCollectionLiteralClass: ClassSymbol = requiredClass("scala.compiletime.ExpressibleAsCollectionLiteral")

@tu lazy val UnitType: TypeRef = valueTypeRef("scala.Unit", java.lang.Void.TYPE, UnitEnc, nme.specializedTypeNames.Void)
def UnitClass(using Context): ClassSymbol = UnitType.symbol.asClass
def UnitModuleClass(using Context): Symbol = UnitType.symbol.asClass.linkedClass
1 change: 1 addition & 0 deletions compiler/src/dotty/tools/dotc/core/StdNames.scala
@@ -500,6 +500,7 @@ object StdNames {
val foreach: N = "foreach"
val format: N = "format"
val fromDigits: N = "fromDigits"
val fromLiteral: N = "fromLiteral"
val fromProduct: N = "fromProduct"
val genericArrayOps: N = "genericArrayOps"
val genericClass: N = "genericClass"
13 changes: 12 additions & 1 deletion compiler/src/dotty/tools/dotc/parsing/Parsers.scala
@@ -2390,7 +2390,7 @@ object Parsers {
in.token match
case IMPLICIT =>
closure(start, location, modifiers(BitSet(IMPLICIT)))
case LBRACKET =>
case LBRACKET if followingIsArrow() =>
val start = in.offset
val tparams = typeParamClause(ParamOwner.Type)
val arrowOffset = accept(ARROW)
@@ -2710,6 +2710,7 @@
* | xmlLiteral
* | SimpleRef
* | `(` [ExprsInParens] `)`
* | `[` ExprsInBrackets `]`
* | SimpleExpr `.` id
* | SimpleExpr `.` MatchClause
* | SimpleExpr (TypeArgs | NamedTypeArgs)
@@ -2745,6 +2746,10 @@
case LBRACE | INDENT =>
canApply = false
blockExpr()
case LBRACKET if in.featureEnabled(Feature.collectionLiterals) =>
atSpan(in.offset):
inBrackets:
SeqLiteral(exprsInBrackets(), TypeTree())
case QUOTE =>
quote(location.inPattern)
case NEW =>
@@ -2840,6 +2845,12 @@
commaSeparatedRest(exprOrBinding(), exprOrBinding)
}

/** ExprsInBrackets ::= ExprInParens {`,' ExprInParens} */
def exprsInBrackets(): List[Tree] =
if in.token == RBRACKET then Nil
else in.currentRegion.withCommasExpected:
commaSeparatedRest(exprInParens(), exprInParens)

/** ParArgumentExprs ::= `(' [‘using’] [ExprsInParens] `)'
* | `(' [ExprsInParens `,'] PostfixExpr `*' ')'
*/
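
A hedged sketch of the distinction the parser changes above draw (assuming the feature import and source level described in the docs page of this PR): a bracket followed by a type parameter clause and an arrow still begins a polymorphic function literal, while other brackets begin a collection literal.

```scala
import scala.language.experimental.collectionLiterals

// Parsed via the new LBRACKET case in simpleExpr: a SeqLiteral with elements 1, 2, 3.
val xs = [1, 2, 3]

// Still parsed as a polymorphic function literal, because the bracket is
// followed by a type parameter clause and `=>` (the new `followingIsArrow()` guard).
val id: [T] => T => T = [T] => (x: T) => x
```
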
2 changes: 1 addition & 1 deletion compiler/src/dotty/tools/dotc/transform/Erasure.scala
@@ -891,7 +891,7 @@ object Erasure {
// The following four methods take as the proto-type the erasure of the pre-existing type,
// if the original proto-type is not a value type.
// This makes all branches be adapted to the correct type.
override def typedSeqLiteral(tree: untpd.SeqLiteral, pt: Type)(using Context): SeqLiteral =
override def typedSeqLiteral(tree: untpd.SeqLiteral, pt: Type)(using Context): Tree =
super.typedSeqLiteral(tree, erasure(tree.typeOpt))
// proto type of typed seq literal is original type;

35 changes: 22 additions & 13 deletions compiler/src/dotty/tools/dotc/typer/Implicits.scala
@@ -616,6 +616,28 @@ object Implicits:
def msg(using Context): Message =
em"${errors.map(_.msg).mkString("\n")}"
}

private def isUnderSpecifiedArgument(tp: Type)(using Context): Boolean =
tp.isRef(defn.NothingClass) || tp.isRef(defn.NullClass) || (tp eq NoPrefix)

/** Is `tp` not specific enough to warrant an implicit search for it?
* This is the case for
* - `?`, `Any`, `AnyRef`,
* - conversions from a bottom type, or to an underspecified type, or to `Unit`
* - bounded wildcard types with underspecified upper bound
* The method is usually called after transforming a type with `wildApprox`,
* which means that type variables with underspecified upper constraints are also
* underspecified.
*/
def isUnderspecified(tp: Type)(using Context): Boolean = tp.stripTypeVar match
case tp: WildcardType =>
!tp.optBounds.exists || isUnderspecified(tp.optBounds.hiBound)
case tp: ViewProto =>
isUnderspecified(tp.resType)
|| tp.resType.isRef(defn.UnitClass)
|| isUnderSpecifiedArgument(tp.argType.widen)
case _ =>
tp.isAny || tp.isAnyRef
end Implicits

import Implicits.*
@@ -1649,19 +1671,6 @@ trait Implicits:
res
end searchImplicit

def isUnderSpecifiedArgument(tp: Type): Boolean =
tp.isRef(defn.NothingClass) || tp.isRef(defn.NullClass) || (tp eq NoPrefix)

private def isUnderspecified(tp: Type): Boolean = tp.stripTypeVar match
case tp: WildcardType =>
!tp.optBounds.exists || isUnderspecified(tp.optBounds.hiBound)
case tp: ViewProto =>
isUnderspecified(tp.resType)
|| tp.resType.isRef(defn.UnitClass)
|| isUnderSpecifiedArgument(tp.argType.widen)
case _ =>
tp.isAny || tp.isAnyRef

/** Search implicit in context `ctxImplicits` or else in implicit scope
* of expected type if `ctxImplicits == null`.
*/
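
The doc comment on `isUnderspecified` above describes when no implicit search is attempted; here is a hedged illustration of its effect on collection literals (behaviour as described in the docs page included in this PR):

```scala
import scala.language.experimental.collectionLiterals

// `Any` is underspecified, so no ExpressibleAsCollectionLiteral search is made
// and the default maker (Seq) applies:
val a: Any = [1, 2, 3]            // Seq[Int]

// An unconstrained type variable is underspecified as well:
def f[T](x: T): T = x
val b = f([1, 2, 3])              // Seq[Int]
```
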
3 changes: 1 addition & 2 deletions compiler/src/dotty/tools/dotc/typer/Inferencing.scala
@@ -58,8 +58,7 @@ object Inferencing {
* The method is called to instantiate type variables before an implicit search.
*/
def instantiateSelected(tp: Type, tvars: List[Type])(using Context): Unit =
if (tvars.nonEmpty)
IsFullyDefinedAccumulator(
IsFullyDefinedAccumulator(
new ForceDegree.Value(IfBottom.flip):
override def appliesTo(tvar: TypeVar) = tvars.contains(tvar),
minimizeSelected = true
68 changes: 60 additions & 8 deletions compiler/src/dotty/tools/dotc/typer/Typer.scala
@@ -2394,7 +2394,7 @@ class Typer(@constructorOnly nestingLevel: Int = 0) extends Namer
Annotation(defn.RequiresCapabilityAnnot, cap, tree.span))))
res.withNotNullInfo(expr1.notNullInfo.terminatedInfo)

def typedSeqLiteral(tree: untpd.SeqLiteral, pt: Type)(using Context): SeqLiteral = {
def typedSeqLiteral(tree: untpd.SeqLiteral, pt: Type)(using Context): Tree =
val elemProto = pt.stripNull().elemType match {
case NoType => WildcardType
case bounds: TypeBounds => WildcardType(bounds)
@@ -2404,24 +2404,76 @@
def assign(elems1: List[Tree], elemtpt1: Tree) =
assignType(cpy.SeqLiteral(tree)(elems1, elemtpt1), elems1, elemtpt1)

if (!tree.elemtpt.isEmpty) {
// Seq literal used in varargs: elem type is given
def varargSeqLiteral =
val elemtpt1 = typed(tree.elemtpt, elemProto)
val elems1 = tree.elems.mapconserve(typed(_, elemtpt1.tpe))
assign(elems1, elemtpt1)
}
else {

// Seq literal used in Java annotations: elem type needs to be computed
def javaSeqLiteral =
val elems1 = tree.elems.mapconserve(typed(_, elemProto))
val elemtptType =
if (isFullyDefined(elemProto, ForceDegree.none))
if isFullyDefined(elemProto, ForceDegree.none) then
elemProto
else if (tree.elems.isEmpty && tree.isInstanceOf[Trees.JavaSeqLiteral[?]])
else if tree.elems.isEmpty then
defn.ObjectType // generic empty Java varargs are of type Object[]
else
TypeComparer.lub(elems1.tpes)
val elemtpt1 = typed(tree.elemtpt, elemtptType)
assign(elems1, elemtpt1)
}
}

// Stand-alone collection literal [x1, ..., xN]
def collectionLiteral =
def isArrow(tree: untpd.Tree) = tree match
case untpd.InfixOp(_, Ident(nme.PUREARROW), _) => true
case _ => false

// The default maker if no typeclass is searched or found
def defaultMaker =
if tree.elems.nonEmpty && tree.elems.forall(isArrow)
then untpd.ref(defn.MapModule)
else untpd.ref(defn.SeqModule)

// We construct and typecheck a term `maker(tree.elems)`, where `maker`
// is either a given instance of type ExpressibleAsCollectionLiteralClass
// or a default instance. The default instance is either Seq or Map,
// depending on the forms of `tree.elems`. We search for a type class if
// the expected type is a value type that is not underspecified for implicit search.
val maker = pt match
case pt: ValueType if !Implicits.isUnderspecified(wildApprox(pt)) =>
val tc = defn.ExpressibleAsCollectionLiteralClass.typeRef.appliedTo(pt)
val nestedCtx = ctx.fresh.setNewTyperState()
// Find given instance `witness` of type `ExpressibleAsCollectionLiteral[<pt>]`
val witness = inferImplicitArg(tc, tree.span.startPos)
def errMsg = missingArgMsg(witness, pt, "")
typr.println(i"infer for $tree with $tc = $witness, ${ctx.typerState.constraint}")
witness.tpe match
case _: AmbiguousImplicits =>
report.error(errMsg, tree.srcPos)
defaultMaker
case _: SearchFailureType =>
typr.println(i"failed collection literal witness: ${errMsg.toString}")
defaultMaker
case _ =>
// Continue with typing `witness.fromLiteral` as the constructor
untpd.TypedSplice(witness.select(nme.fromLiteral))
case _ =>
defaultMaker
// When expected type is a Seq or Array, propagate the `elemProto` as expected
// type of the elements.
val elems = elemProto match
case WildcardType(_) => tree.elems
case _ => tree.elems.map(untpd.Typed(_, untpd.TypeTree(elemProto)))
typed(
untpd.Apply(maker, elems).withSpan(tree.span)
.showing(i"typed collection literal $tree ---> $result", typr)
, pt)

if !tree.elemtpt.isEmpty then varargSeqLiteral
else if tree.isInstanceOf[Trees.JavaSeqLiteral[?]] then javaSeqLiteral
else collectionLiteral
end typedSeqLiteral

def typedInlined(tree: untpd.Inlined, pt: Type)(using Context): Tree =
throw new UnsupportedOperationException("cannot type check a Inlined node")
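
To illustrate the `collectionLiteral` branch added above, a hedged sketch that mirrors the docs page included in this PR (no user-defined witnesses in scope):

```scala
import scala.language.experimental.collectionLiterals

// Expected type Vector[Int]: the standard `vectorFromLiteral` witness is found,
// so the literal is typed as `vectorFromLiteral.fromLiteral(1, 2, 3)`.
val v: Vector[Int] = [1, 2, 3]

// Underspecified expected type: no implicit search, the default maker applies.
val s = [1, 2, 3]                 // Seq[Int]
val m = [1 -> "one", 2 -> "two"]  // Map[Int, String], since all elements are `a -> b`
```
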
1 change: 1 addition & 0 deletions docs/_docs/internals/syntax.md
@@ -278,6 +278,7 @@ SimpleExpr ::= SimpleRef
| ‘new’ ConstrApp {‘with’ ConstrApp} [TemplateBody] New(constr | templ)
| ‘new’ TemplateBody
| ‘(’ ExprsInParens ‘)’ Parens(exprs)
| ‘[’ ExprInParens {‘,’ ExprInParens} ‘]’ SeqLiteral(exprs, TypeTree())
| SimpleExpr ‘.’ id Select(expr, id)
| SimpleExpr ‘.’ MatchClause
| SimpleExpr TypeArgs TypeApply(expr, args)
96 changes: 96 additions & 0 deletions docs/_docs/reference/experimental/collection-literals.md
@@ -0,0 +1,96 @@
---
layout: doc-page
title: "Collection Literals"
redirectFrom: /docs/reference/other-new-features/collection-literals.html
nightlyOf: https://docs.scala-lang.org/scala3/reference/experimental/collection-literals.html
---


Support for collection literals is enabled by the experimental language import
```scala
import scala.language.experimental.collectionLiterals
```
This feature requires source version 3.7 or higher. One can specify both the import and the source version on the command line with these settings:
```
-source 3.7 -language:experimental.collectionLiterals
```
Collection literals are comma-separated sequences of expressions enclosed in brackets, like these:
```scala
val oneTwoThree = [1, 2, 3]
val anotherLit = [math.Pi, math.cos(2.0), math.E * 3.0]
val diag = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
val empty = []
val mapy = [1 -> "one", 2 -> "two", 3 -> "three"]
```
The type of a collection literal depends on the expected type. If there is no expected type (as in the examples above), a collection literal is of type `Seq`, unless it consists exclusively of elements of the form `a -> b`, in which case it is of type `Map`. For instance, the literals above get the following inferred types:
```scala
val oneTwoThree: Seq[Int] = [1, 2, 3]
val anotherLit: Seq[Double] = [math.Pi, math.cos(2.0), math.E * 3.0]
val diag: Seq[Seq[Int]] = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
val empty: Seq[Nothing] = []
val mapy: Map[Int, String] = [1 -> "one", 2 -> "two", 3 -> "three"]
```
If there is an expected type `E`, the compiler will search for a given instance of the
type class `ExpressibleAsCollectionLiteral[E]`. This type class is defined in package `scala.compiletime` as follows:
```scala
trait ExpressibleAsCollectionLiteral[+Coll]:

/** The element type of the created collection */
type Elem

/** The inline method that creates the collection */
inline def fromLiteral(inline xs: Elem*): Coll
```
If a best matching instance `ecl` is found, its `fromLiteral` method is used to convert
the elements of the literal to the expected type. If the search is ambiguous, an error is
reported. If no matching instance is found, the literal is typed by the default scheme, as if there were no expected type.

The companion object of `ExpressibleAsCollectionLiteral` contains a number of given instances for standard collection types. For instance, there is:
```scala
given vectorFromLiteral: [T] => ExpressibleAsCollectionLiteral[Vector[T]]:
type Elem = T
inline def fromLiteral(inline xs: T*) = Vector[Elem](xs*)
```
Hence, the definition
```scala
val v: Vector[Int] = [1, 2, 3]
```
would be expanded by the compiler to
```scala
val v: Vector[Int] = vectorFromLiteral.fromLiteral(1, 2, 3)
```
After inlining, this produces
```scala
val v: Vector[Int] = Vector[Int](1, 2, 3)
```
Using this scheme, the literals we have seen earlier could also be given alternative types like these:
```scala
val oneTwoThree: Vector[Int] = [1, 2, 3]
val anotherLit: IArray[Double] = [math.Pi, math.cos(2.0), math.E * 3.0]
val diag: Array[Array[Int]] = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
val empty: ArrayBuffer[Object] = []
val mapy: HashMap[Int, String] = [1 -> "one", 2 -> "two", 3 -> "three"]
```
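
Beyond the standard instances, a library type can opt in by defining its own given. The following is a hedged sketch; the `Bag` class and the instance name are hypothetical and not part of the standard library:

```scala
import scala.compiletime.ExpressibleAsCollectionLiteral

final case class Bag[A](elems: Seq[A])   // hypothetical example type

given bagFromLiteral: [A] => ExpressibleAsCollectionLiteral[Bag[A]]:
  type Elem = A
  inline def fromLiteral(inline xs: A*): Bag[A] = Bag(Seq(xs*))

// With this instance in scope,
//   val b: Bag[Int] = [1, 2, 3]
// expands to `bagFromLiteral.fromLiteral(1, 2, 3)` and, after inlining, to `Bag(Seq(1, 2, 3))`.
```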

**Notes**

- Since the `fromLiteral` method in `ExpressibleAsCollectionLiteral` is an inline method with inline arguments, given instances can implement it as a macro.

- The precise meaning of "is there an expected type?" is as follows: there is no expected
type if the expected type known from the context is _under-specified_, in the sense defined for
implicit search. That is, an implicit search for a given of that type would not be
attempted because the type is not specific enough. Concretely, this is the case for the wildcard type `?`, `Any`, `AnyRef`, unconstrained type variables, and type variables constrained from above by an under-specified type.

- If the expected type is a subtype of `Seq` or an array type, we typecheck the
elements against the element type of the expected type. This means we get the same
precision in propagated expected types as if the constructor were written explicitly.
Hence, we do not regress by going from `Seq(...)` or `Array(...)` to a
collection literal (see the sketch after this list).
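
For example, a hedged illustration of this last point (it relies on the usual narrowing of integer literals against an expected numeric element type):

```scala
// Each element is checked against the expected element type, so these compile
// just as `Array[Byte](1, 2, 3)` and `Seq[Short](1, 2, 3)` do:
val bytes: Array[Byte] = [1, 2, 3]
val shorts: Seq[Short] = [1, 2, 3]
```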

**Syntax**

```
SimpleExpr ::= ...
| ‘[’ ExprInParens {‘,’ ExprInParens} ‘]’
```
1 change: 1 addition & 0 deletions docs/sidebar.yml
@@ -163,6 +163,7 @@ subsection:
- page: reference/experimental/typeclasses.md
- page: reference/experimental/runtimeChecked.md
- page: reference/experimental/better-fors.md
- page: reference/experimental/collection-literals.md
- page: reference/syntax.md
- title: Language Versions
index: reference/language-versions/language-versions.md