Apr 17 15:14:37.865182 ip-10-0-131-29 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 15:14:37.865199 ip-10-0-131-29 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 15:14:37.865209 ip-10-0-131-29 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 15:14:37.865516 ip-10-0-131-29 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 15:14:47.972159 ip-10-0-131-29 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 15:14:47.972174 ip-10-0-131-29 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 333e7c8eb9e943ef9d215e09ccc1a932 --
Apr 17 15:17:15.341317 ip-10-0-131-29 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 15:17:15.792352 ip-10-0-131-29 kubenswrapper[2577]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 15:17:15.792352 ip-10-0-131-29 kubenswrapper[2577]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 15:17:15.792352 ip-10-0-131-29 kubenswrapper[2577]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 15:17:15.792352 ip-10-0-131-29 kubenswrapper[2577]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 15:17:15.792811 ip-10-0-131-29 kubenswrapper[2577]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 15:17:15.795252 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.795189 2577 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 15:17:15.802488 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802468 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 15:17:15.802488 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802484 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 15:17:15.802488 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802488 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 15:17:15.802488 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802491 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 15:17:15.802488 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802494 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 15:17:15.802653 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802498 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 15:17:15.802653 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802501 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 15:17:15.802653 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802503 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 15:17:15.802653 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802506 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 15:17:15.802653 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802509 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 15:17:15.802653 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802511 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 15:17:15.802653 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802514 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 15:17:15.802653 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802516 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 15:17:15.802653 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802519 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 15:17:15.802653 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802521 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 15:17:15.802653 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802524 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 15:17:15.802653 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802526 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 15:17:15.802653 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802528 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 15:17:15.802653 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802531 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 15:17:15.802653 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802533 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 15:17:15.802653 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802535 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 15:17:15.802653 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802538 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 15:17:15.802653 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802540 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 15:17:15.802653 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802543 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 15:17:15.802653 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802545 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 15:17:15.803145 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802552 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 15:17:15.803145 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802555 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 15:17:15.803145 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802557 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 15:17:15.803145 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802559 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 15:17:15.803145 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802562 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 15:17:15.803145 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802564 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 15:17:15.803145 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802567 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 15:17:15.803145 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802569 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 15:17:15.803145 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802571 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 15:17:15.803145 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802574 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 15:17:15.803145 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802576 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 15:17:15.803145 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802579 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 15:17:15.803145 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802581 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 15:17:15.803145 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802585 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 15:17:15.803145 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802588 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 15:17:15.803145 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802591 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 15:17:15.803145 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802593 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 15:17:15.803145 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802596 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 15:17:15.803145 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802600 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 15:17:15.803591 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802603 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 15:17:15.803591 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802605 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 15:17:15.803591 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802608 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 15:17:15.803591 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802610 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 15:17:15.803591 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802612 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 15:17:15.803591 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802616 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 15:17:15.803591 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802620 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 15:17:15.803591 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802623 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 15:17:15.803591 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802625 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 15:17:15.803591 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802628 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 15:17:15.803591 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802630 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 15:17:15.803591 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802633 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 15:17:15.803591 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802636 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 15:17:15.803591 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802638 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 15:17:15.803591 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802640 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 15:17:15.803591 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802643 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 15:17:15.803591 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802645 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 15:17:15.803591 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802648 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 15:17:15.803591 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802651 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 15:17:15.804059 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802653 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 15:17:15.804059 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802656 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 15:17:15.804059 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802658 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 15:17:15.804059 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802661 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 15:17:15.804059 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802664 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 15:17:15.804059 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802668 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 15:17:15.804059 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802670 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 15:17:15.804059 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802673 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 15:17:15.804059 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802676 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 15:17:15.804059 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802678 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 15:17:15.804059 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802681 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 15:17:15.804059 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802684 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 15:17:15.804059 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802686 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 15:17:15.804059 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802688 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 15:17:15.804059 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802691 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 15:17:15.804059 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802695 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 15:17:15.804059 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802697 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 15:17:15.804059 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802699 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 15:17:15.804059 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802703 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 15:17:15.804059 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802707 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 15:17:15.804590 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802710 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 15:17:15.804590 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802713 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 15:17:15.804590 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.802716 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 15:17:15.804590 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803097 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 15:17:15.804590 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803102 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 15:17:15.804590 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803105 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 15:17:15.804590 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803108 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 15:17:15.804590 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803111 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 15:17:15.804590 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803113 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 15:17:15.804590 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803116 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 15:17:15.804590 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803119 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 15:17:15.804590 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803122 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 15:17:15.804590 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803125 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 15:17:15.804590 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803128 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 15:17:15.804590 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803130 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 15:17:15.804590 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803133 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 15:17:15.804590 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803135 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 15:17:15.804590 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803138 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 15:17:15.804590 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803140 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 15:17:15.804590 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803143 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 15:17:15.805074 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803145 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 15:17:15.805074 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803148 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 15:17:15.805074 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803151 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 15:17:15.805074 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803153 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 15:17:15.805074 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803155 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 15:17:15.805074 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803158 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 15:17:15.805074 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803160 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 15:17:15.805074 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803163 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 15:17:15.805074 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803166 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 15:17:15.805074 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803168 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 15:17:15.805074 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803170 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 15:17:15.805074 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803173 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 15:17:15.805074 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803175 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 15:17:15.805074 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803177 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 15:17:15.805074 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803180 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 15:17:15.805074 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803183 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 15:17:15.805074 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803186 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 15:17:15.805074 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803188 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 15:17:15.805074 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803190 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 15:17:15.805074 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803193 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 15:17:15.805580 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803195 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 15:17:15.805580 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803198 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 15:17:15.805580 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803201 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 15:17:15.805580 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803203 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 15:17:15.805580 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803206 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 15:17:15.805580 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803208 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 15:17:15.805580 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803210 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 15:17:15.805580 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803213 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 15:17:15.805580 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803215 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 15:17:15.805580 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803218 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 15:17:15.805580 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803221 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 15:17:15.805580 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803224 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 15:17:15.805580 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803226 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 15:17:15.805580 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803228 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 15:17:15.805580 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803232 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 15:17:15.805580 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803235 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 15:17:15.805580 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803238 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 15:17:15.805580 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803241 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 15:17:15.805580 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803244 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 15:17:15.806024 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803246 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 15:17:15.806024 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803249 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 15:17:15.806024 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803252 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 15:17:15.806024 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803255 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 15:17:15.806024 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803257 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 15:17:15.806024 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803260 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 15:17:15.806024 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803262 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 15:17:15.806024 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803265 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 15:17:15.806024 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803268 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 15:17:15.806024 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803270 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 15:17:15.806024 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803272 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 15:17:15.806024 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803274 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 15:17:15.806024 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803277 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 15:17:15.806024 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803279 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 15:17:15.806024 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803282 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 15:17:15.806024 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803284 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 15:17:15.806024 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803287 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 15:17:15.806024 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803290 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 15:17:15.806024 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803293 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 15:17:15.806024 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803295 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 15:17:15.806523 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803297 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 15:17:15.806523 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803300 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 15:17:15.806523 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803303 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 15:17:15.806523 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803305 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 15:17:15.806523 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803307 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 15:17:15.806523 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803310 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 15:17:15.806523 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803312 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 15:17:15.806523 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803314 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 15:17:15.806523 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803317 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 15:17:15.806523 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.803319 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 15:17:15.806523 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804071 2577 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 15:17:15.806523 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804080 2577 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 15:17:15.806523 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804086 2577 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 15:17:15.806523 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804091 2577 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 15:17:15.806523 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804097 2577 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 15:17:15.806523 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804100 2577 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 15:17:15.806523 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804104 2577 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 15:17:15.806523 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804108 2577 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 15:17:15.806523 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804111 2577 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 15:17:15.806523 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804114 2577 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 15:17:15.806523 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804118 2577 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 15:17:15.807018 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804121 2577 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 15:17:15.807018 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804124 2577 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 15:17:15.807018 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804127 2577 flags.go:64] FLAG: --cgroup-root=""
Apr 17 15:17:15.807018 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804129 2577 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 15:17:15.807018 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804132 2577 flags.go:64] FLAG: --client-ca-file=""
Apr 17 15:17:15.807018 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804135 2577 flags.go:64] FLAG: --cloud-config=""
Apr 17 15:17:15.807018 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804137 2577 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 15:17:15.807018 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804140 2577 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 15:17:15.807018 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804143 2577 flags.go:64] FLAG: --cluster-domain=""
Apr 17 15:17:15.807018 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804146 2577 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 15:17:15.807018 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804149 2577 flags.go:64] FLAG: --config-dir=""
Apr 17 15:17:15.807018 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804151 2577 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 15:17:15.807018 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804155 2577 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 15:17:15.807018 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804159 2577 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 15:17:15.807018 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804161 2577 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 15:17:15.807018 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804164 2577 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 15:17:15.807018 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804168 2577 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 15:17:15.807018 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804170 2577 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 15:17:15.807018 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804173 2577 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 15:17:15.807018 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804176 2577 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 15:17:15.807018 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804179 2577 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 15:17:15.807018 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804182 2577 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 15:17:15.807018 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804186 2577 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 15:17:15.807018 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804189 2577 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 15:17:15.807018 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804192 2577 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 15:17:15.807622 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804194 2577 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 15:17:15.807622 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804198 2577 flags.go:64] FLAG: --enable-server="true"
Apr 17 15:17:15.807622 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804201 2577 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 15:17:15.807622 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804206 2577 flags.go:64] FLAG: --event-burst="100"
Apr 17 15:17:15.807622 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804209 2577 flags.go:64] FLAG: --event-qps="50"
Apr 17 15:17:15.807622 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804212 2577 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 15:17:15.807622 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804215 2577 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 15:17:15.807622 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804218 2577 flags.go:64] FLAG: --eviction-hard=""
Apr 17 15:17:15.807622 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804226 2577 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 15:17:15.807622 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804229 2577 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 15:17:15.807622 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804232 2577 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 15:17:15.807622 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804235 2577 flags.go:64] FLAG: --eviction-soft=""
Apr 17 15:17:15.807622 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804238 2577 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 15:17:15.807622 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804240 2577 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 15:17:15.807622 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804243 2577 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 15:17:15.807622 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804246 2577 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 15:17:15.807622 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804249 2577 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 15:17:15.807622 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804251 2577 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 15:17:15.807622 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804254 2577 flags.go:64] FLAG: --feature-gates=""
Apr 17 15:17:15.807622 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804258 2577 flags.go:64] FLAG:
--file-check-frequency="20s" Apr 17 15:17:15.807622 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804261 2577 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 15:17:15.807622 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804264 2577 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 15:17:15.807622 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804267 2577 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 15:17:15.807622 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804270 2577 flags.go:64] FLAG: --healthz-port="10248" Apr 17 15:17:15.807622 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804273 2577 flags.go:64] FLAG: --help="false" Apr 17 15:17:15.808221 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804275 2577 flags.go:64] FLAG: --hostname-override="ip-10-0-131-29.ec2.internal" Apr 17 15:17:15.808221 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804278 2577 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 15:17:15.808221 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804281 2577 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 15:17:15.808221 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804284 2577 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 15:17:15.808221 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804287 2577 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 15:17:15.808221 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804290 2577 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 15:17:15.808221 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804293 2577 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 15:17:15.808221 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804295 2577 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 15:17:15.808221 ip-10-0-131-29 kubenswrapper[2577]: I0417 
15:17:15.804298 2577 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 15:17:15.808221 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804301 2577 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 15:17:15.808221 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804304 2577 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 15:17:15.808221 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804307 2577 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 15:17:15.808221 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804310 2577 flags.go:64] FLAG: --kube-reserved="" Apr 17 15:17:15.808221 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804312 2577 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 15:17:15.808221 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804315 2577 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 15:17:15.808221 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804318 2577 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 15:17:15.808221 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804321 2577 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 15:17:15.808221 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804323 2577 flags.go:64] FLAG: --lock-file="" Apr 17 15:17:15.808221 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804326 2577 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 15:17:15.808221 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804329 2577 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 15:17:15.808221 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804331 2577 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 15:17:15.808221 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804336 2577 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 15:17:15.808221 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804339 2577 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 15:17:15.808748 
ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804341 2577 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 15:17:15.808748 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804344 2577 flags.go:64] FLAG: --logging-format="text" Apr 17 15:17:15.808748 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804346 2577 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 15:17:15.808748 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804350 2577 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 15:17:15.808748 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804353 2577 flags.go:64] FLAG: --manifest-url="" Apr 17 15:17:15.808748 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804356 2577 flags.go:64] FLAG: --manifest-url-header="" Apr 17 15:17:15.808748 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804360 2577 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 15:17:15.808748 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804363 2577 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 15:17:15.808748 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804366 2577 flags.go:64] FLAG: --max-pods="110" Apr 17 15:17:15.808748 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804369 2577 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 15:17:15.808748 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804372 2577 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 15:17:15.808748 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804375 2577 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 15:17:15.808748 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804378 2577 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 15:17:15.808748 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804380 2577 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 15:17:15.808748 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804383 2577 
flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 15:17:15.808748 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804386 2577 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 15:17:15.808748 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804393 2577 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 15:17:15.808748 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804396 2577 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 15:17:15.808748 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804398 2577 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 15:17:15.808748 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804402 2577 flags.go:64] FLAG: --pod-cidr="" Apr 17 15:17:15.808748 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804405 2577 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 15:17:15.808748 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804412 2577 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 15:17:15.808748 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804415 2577 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 15:17:15.808748 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804418 2577 flags.go:64] FLAG: --pods-per-core="0" Apr 17 15:17:15.809326 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804421 2577 flags.go:64] FLAG: --port="10250" Apr 17 15:17:15.809326 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804423 2577 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 15:17:15.809326 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804426 2577 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-01850aa30e4861e40" Apr 17 15:17:15.809326 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804429 2577 flags.go:64] FLAG: --qos-reserved="" Apr 17 15:17:15.809326 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804432 
2577 flags.go:64] FLAG: --read-only-port="10255" Apr 17 15:17:15.809326 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804435 2577 flags.go:64] FLAG: --register-node="true" Apr 17 15:17:15.809326 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804438 2577 flags.go:64] FLAG: --register-schedulable="true" Apr 17 15:17:15.809326 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804440 2577 flags.go:64] FLAG: --register-with-taints="" Apr 17 15:17:15.809326 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804443 2577 flags.go:64] FLAG: --registry-burst="10" Apr 17 15:17:15.809326 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804446 2577 flags.go:64] FLAG: --registry-qps="5" Apr 17 15:17:15.809326 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804449 2577 flags.go:64] FLAG: --reserved-cpus="" Apr 17 15:17:15.809326 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804451 2577 flags.go:64] FLAG: --reserved-memory="" Apr 17 15:17:15.809326 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804457 2577 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 15:17:15.809326 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804460 2577 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 15:17:15.809326 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804463 2577 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 15:17:15.809326 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804465 2577 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 15:17:15.809326 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804468 2577 flags.go:64] FLAG: --runonce="false" Apr 17 15:17:15.809326 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804471 2577 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 15:17:15.809326 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804474 2577 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 15:17:15.809326 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804476 2577 
flags.go:64] FLAG: --seccomp-default="false" Apr 17 15:17:15.809326 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804479 2577 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 15:17:15.809326 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804482 2577 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 15:17:15.809326 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804484 2577 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 15:17:15.809326 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804487 2577 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 15:17:15.809326 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804490 2577 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 15:17:15.809326 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804493 2577 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 15:17:15.809933 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804495 2577 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 15:17:15.809933 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804498 2577 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 15:17:15.809933 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804501 2577 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 15:17:15.809933 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804505 2577 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 15:17:15.809933 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804508 2577 flags.go:64] FLAG: --system-cgroups="" Apr 17 15:17:15.809933 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804511 2577 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 15:17:15.809933 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804515 2577 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 15:17:15.809933 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804518 2577 flags.go:64] FLAG: --tls-cert-file="" Apr 17 
15:17:15.809933 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804521 2577 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 15:17:15.809933 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804524 2577 flags.go:64] FLAG: --tls-min-version="" Apr 17 15:17:15.809933 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804527 2577 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 15:17:15.809933 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804530 2577 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 15:17:15.809933 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804532 2577 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 15:17:15.809933 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804535 2577 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 15:17:15.809933 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804538 2577 flags.go:64] FLAG: --v="2" Apr 17 15:17:15.809933 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804541 2577 flags.go:64] FLAG: --version="false" Apr 17 15:17:15.809933 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804545 2577 flags.go:64] FLAG: --vmodule="" Apr 17 15:17:15.809933 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804549 2577 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 15:17:15.809933 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.804554 2577 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 15:17:15.809933 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804633 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 15:17:15.809933 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804636 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 15:17:15.809933 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804639 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 17 15:17:15.809933 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804641 
2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 15:17:15.809933 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804643 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 15:17:15.810530 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804646 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 15:17:15.810530 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804648 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 15:17:15.810530 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804651 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 15:17:15.810530 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804654 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 15:17:15.810530 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804656 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 15:17:15.810530 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804658 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 15:17:15.810530 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804661 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 15:17:15.810530 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804663 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 15:17:15.810530 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804665 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 15:17:15.810530 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804668 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 15:17:15.810530 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804670 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 15:17:15.810530 
ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804674 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 15:17:15.810530 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804676 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 15:17:15.810530 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804679 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 15:17:15.810530 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804681 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 15:17:15.810530 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804684 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 15:17:15.810530 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804686 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 15:17:15.810530 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804689 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 15:17:15.810530 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804691 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 15:17:15.810530 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804693 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 15:17:15.811004 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804696 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 15:17:15.811004 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804698 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 15:17:15.811004 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804700 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 15:17:15.811004 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804703 2577 feature_gate.go:328] unrecognized 
feature gate: ConsolePluginContentSecurityPolicy Apr 17 15:17:15.811004 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804705 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 15:17:15.811004 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804708 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 15:17:15.811004 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804711 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 15:17:15.811004 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804714 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 15:17:15.811004 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804716 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 15:17:15.811004 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804719 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 15:17:15.811004 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804723 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 15:17:15.811004 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804726 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 15:17:15.811004 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804729 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 15:17:15.811004 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804731 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 15:17:15.811004 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804734 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 15:17:15.811004 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804736 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 15:17:15.811004 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804739 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 15:17:15.811004 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804741 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 15:17:15.811004 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804743 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 15:17:15.811488 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804746 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 15:17:15.811488 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804748 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 15:17:15.811488 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804750 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 15:17:15.811488 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804752 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 15:17:15.811488 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804756 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure 
Apr 17 15:17:15.811488 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804759 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 15:17:15.811488 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804761 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 15:17:15.811488 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804763 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 15:17:15.811488 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804766 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 15:17:15.811488 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804768 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 15:17:15.811488 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804770 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 15:17:15.811488 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804773 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 15:17:15.811488 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804775 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 15:17:15.811488 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804777 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 15:17:15.811488 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804780 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 15:17:15.811488 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804782 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 15:17:15.811488 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804785 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 15:17:15.811488 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804787 2577 feature_gate.go:328] unrecognized feature gate: 
NewOLMPreflightPermissionChecks Apr 17 15:17:15.811488 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804789 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 15:17:15.811488 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804793 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 15:17:15.811974 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804795 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 15:17:15.811974 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804797 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 15:17:15.811974 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804800 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 15:17:15.811974 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804802 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 15:17:15.811974 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804804 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 15:17:15.811974 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804807 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 15:17:15.811974 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804809 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 15:17:15.811974 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804812 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 15:17:15.811974 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804815 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 15:17:15.811974 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804818 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 15:17:15.811974 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804821 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 15:17:15.811974 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804823 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 15:17:15.811974 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804826 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 15:17:15.811974 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804828 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 15:17:15.811974 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804830 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 15:17:15.811974 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804833 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 15:17:15.811974 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804837 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 15:17:15.811974 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804839 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 15:17:15.811974 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804842 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 15:17:15.812479 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804844 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 15:17:15.812479 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804846 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 15:17:15.812479 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.804848 2577 feature_gate.go:328] 
unrecognized feature gate: AlibabaPlatform Apr 17 15:17:15.812479 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.805793 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 15:17:15.812479 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.811235 2577 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 17 15:17:15.812479 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.811248 2577 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 17 15:17:15.812479 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.811289 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 15:17:15.812845 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:15.811325 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 15:17:15.816600 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.812584 2577 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 15:17:15.816964 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.814437 2577 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 15:17:15.816964 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.815337 2577 server.go:1019] "Starting client certificate rotation"
Apr 17 15:17:15.816964 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.815432 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 15:17:15.816964 ip-10-0-131-29
kubenswrapper[2577]: I0417 15:17:15.815471 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 15:17:15.840319 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.840304 2577 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 15:17:15.844266 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.844249 2577 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 15:17:15.858474 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.858455 2577 log.go:25] "Validated CRI v1 runtime API"
Apr 17 15:17:15.863949 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.863935 2577 log.go:25] "Validated CRI v1 image API"
Apr 17 15:17:15.866520 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.866499 2577 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 15:17:15.872882 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.872864 2577 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 8240a72b-fd13-462d-b144-11ed5981b3ea:/dev/nvme0n1p4 ef7b4fa2-6d48-44c5-b2ed-31b184571c8b:/dev/nvme0n1p3]
Apr 17 15:17:15.872951 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.872881 2577 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 15:17:15.876553 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.876536 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet"
type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 15:17:15.878108 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.877997 2577 manager.go:217] Machine: {Timestamp:2026-04-17 15:17:15.876206766 +0000 UTC m=+0.417871623 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3199244 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec282ff0bc9101366930bff5df4576de SystemUUID:ec282ff0-bc91-0136-6930-bff5df4576de BootID:333e7c8e-b9e9-43ef-9d21-5e09ccc1a932 Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:45:53:b7:2f:c3 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:45:53:b7:2f:c3 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:fe:6e:5b:bf:bf:e5 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 15:17:15.878558 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.878549 2577 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 15:17:15.878721 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.878640 2577 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 15:17:15.878972 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.878956 2577 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 15:17:15.879116 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.878974 2577 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-131-29.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 17 15:17:15.879157 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.879122 2577 topology_manager.go:138] "Creating topology manager with none policy"
Apr 17 15:17:15.879157 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.879130 2577 container_manager_linux.go:306] "Creating device plugin manager"
Apr 17 15:17:15.879157 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.879146 2577 
manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 15:17:15.879910 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.879901 2577 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 15:17:15.881163 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.881154 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 17 15:17:15.881268 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.881260 2577 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 15:17:15.884764 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.884755 2577 kubelet.go:491] "Attempting to sync node with API server" Apr 17 15:17:15.884808 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.884772 2577 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 15:17:15.884808 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.884789 2577 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 15:17:15.884808 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.884798 2577 kubelet.go:397] "Adding apiserver pod source" Apr 17 15:17:15.884808 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.884806 2577 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 17 15:17:15.885809 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.885795 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 15:17:15.885809 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.885811 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 15:17:15.888446 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.888428 2577 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 15:17:15.889761 ip-10-0-131-29 
kubenswrapper[2577]: I0417 15:17:15.889748 2577 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 15:17:15.891648 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.891632 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 15:17:15.891705 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.891656 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 15:17:15.891705 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.891666 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 15:17:15.891705 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.891673 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 15:17:15.891705 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.891682 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 15:17:15.891705 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.891690 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 15:17:15.891705 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.891699 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 15:17:15.891859 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.891709 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 15:17:15.891859 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.891719 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 15:17:15.891859 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.891728 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 15:17:15.891859 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.891752 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 15:17:15.891859 
ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.891765 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 15:17:15.892729 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.892719 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 15:17:15.892777 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.892731 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 15:17:15.893229 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.893208 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-m6zlh" Apr 17 15:17:15.895940 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.895927 2577 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 15:17:15.896009 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.895963 2577 server.go:1295] "Started kubelet" Apr 17 15:17:15.896090 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.896015 2577 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 15:17:15.896144 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.896080 2577 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 15:17:15.896144 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.896138 2577 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 15:17:15.896549 ip-10-0-131-29 systemd[1]: Started Kubernetes Kubelet. 
Apr 17 15:17:15.896637 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.896605 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-29.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 15:17:15.897017 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:15.896892 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 15:17:15.897017 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:15.896892 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-131-29.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 15:17:15.898689 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.898674 2577 server.go:317] "Adding debug handlers to kubelet server" Apr 17 15:17:15.899492 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.899470 2577 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 15:17:15.901027 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.901007 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-m6zlh" Apr 17 15:17:15.904158 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:15.904140 2577 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 15:17:15.904453 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:15.903482 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-29.ec2.internal.18a72de423b6fd62 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-29.ec2.internal,UID:ip-10-0-131-29.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-131-29.ec2.internal,},FirstTimestamp:2026-04-17 15:17:15.895938402 +0000 UTC m=+0.437603258,LastTimestamp:2026-04-17 15:17:15.895938402 +0000 UTC m=+0.437603258,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-29.ec2.internal,}" Apr 17 15:17:15.904769 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.904758 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 15:17:15.905297 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.905282 2577 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 15:17:15.907394 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.907282 2577 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 15:17:15.907473 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.907403 2577 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 15:17:15.907620 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.907580 2577 reconstruct.go:97] "Volume reconstruction finished" Apr 17 15:17:15.907620 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.907592 2577 reconciler.go:26] "Reconciler: start 
to sync state" Apr 17 15:17:15.907796 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:15.907781 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-29.ec2.internal\" not found" Apr 17 15:17:15.908194 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.908134 2577 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 15:17:15.910118 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.910100 2577 factory.go:153] Registering CRI-O factory Apr 17 15:17:15.910191 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.910172 2577 factory.go:223] Registration of the crio container factory successfully Apr 17 15:17:15.910237 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.910225 2577 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 15:17:15.910270 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.910240 2577 factory.go:55] Registering systemd factory Apr 17 15:17:15.910270 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.910249 2577 factory.go:223] Registration of the systemd container factory successfully Apr 17 15:17:15.910270 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.910268 2577 factory.go:103] Registering Raw factory Apr 17 15:17:15.910358 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.910277 2577 manager.go:1196] Started watching for new ooms in manager Apr 17 15:17:15.910622 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.910607 2577 manager.go:319] Starting recovery of all containers Apr 17 15:17:15.915051 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.915009 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 15:17:15.918363 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:15.918337 2577 
nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-131-29.ec2.internal\" not found" node="ip-10-0-131-29.ec2.internal" Apr 17 15:17:15.919941 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.919925 2577 manager.go:324] Recovery completed Apr 17 15:17:15.923642 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.923632 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 15:17:15.925886 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.925873 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-29.ec2.internal" event="NodeHasSufficientMemory" Apr 17 15:17:15.925938 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.925899 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-29.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 15:17:15.925938 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.925911 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-29.ec2.internal" event="NodeHasSufficientPID" Apr 17 15:17:15.926345 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.926333 2577 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 15:17:15.926345 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.926344 2577 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 15:17:15.926424 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.926358 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 17 15:17:15.928589 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.928577 2577 policy_none.go:49] "None policy: Start" Apr 17 15:17:15.928632 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.928594 2577 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 15:17:15.928632 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.928603 2577 state_mem.go:35] "Initializing new in-memory state store" Apr 17 15:17:15.981675 
ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.967338 2577 manager.go:341] "Starting Device Plugin manager" Apr 17 15:17:15.981675 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:15.967363 2577 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 15:17:15.981675 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.967373 2577 server.go:85] "Starting device plugin registration server" Apr 17 15:17:15.981675 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.967580 2577 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 15:17:15.981675 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.967607 2577 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 15:17:15.981675 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.967692 2577 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 15:17:15.981675 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.967759 2577 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 15:17:15.981675 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:15.967766 2577 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 15:17:15.981675 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:15.968311 2577 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 17 15:17:15.981675 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:15.968342 2577 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-29.ec2.internal\" not found" Apr 17 15:17:16.055435 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.055377 2577 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv4" Apr 17 15:17:16.056606 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.056591 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 15:17:16.056696 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.056616 2577 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 15:17:16.056696 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.056630 2577 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 17 15:17:16.056696 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.056637 2577 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 15:17:16.056696 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:16.056665 2577 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 15:17:16.058737 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.058723 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 15:17:16.068068 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.068054 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 15:17:16.068821 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.068808 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-29.ec2.internal" event="NodeHasSufficientMemory" Apr 17 15:17:16.068894 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.068831 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-29.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 15:17:16.068894 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.068840 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-29.ec2.internal" event="NodeHasSufficientPID" Apr 17 15:17:16.068894 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.068858 2577 
kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-29.ec2.internal" Apr 17 15:17:16.074163 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.074149 2577 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-131-29.ec2.internal" Apr 17 15:17:16.074227 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:16.074167 2577 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-131-29.ec2.internal\": node \"ip-10-0-131-29.ec2.internal\" not found" Apr 17 15:17:16.085025 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:16.085010 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-29.ec2.internal\" not found" Apr 17 15:17:16.157451 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.157398 2577 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-29.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-131-29.ec2.internal"] Apr 17 15:17:16.157536 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.157493 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 15:17:16.158257 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.158233 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-29.ec2.internal" event="NodeHasSufficientMemory" Apr 17 15:17:16.158335 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.158264 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-29.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 15:17:16.158335 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.158277 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-29.ec2.internal" event="NodeHasSufficientPID" Apr 17 15:17:16.160423 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.160410 2577 kubelet_node_status.go:413] "Setting node annotation to 
enable volume controller attach/detach" Apr 17 15:17:16.160547 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.160534 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-29.ec2.internal" Apr 17 15:17:16.160596 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.160567 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 15:17:16.161020 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.161004 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-29.ec2.internal" event="NodeHasSufficientMemory" Apr 17 15:17:16.161110 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.161048 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-29.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 15:17:16.161110 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.161004 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-29.ec2.internal" event="NodeHasSufficientMemory" Apr 17 15:17:16.161110 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.161063 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-29.ec2.internal" event="NodeHasSufficientPID" Apr 17 15:17:16.161110 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.161075 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-29.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 15:17:16.161110 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.161087 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-29.ec2.internal" event="NodeHasSufficientPID" Apr 17 15:17:16.163156 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.163142 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-29.ec2.internal" Apr 17 15:17:16.163223 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.163165 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 15:17:16.163763 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.163749 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-29.ec2.internal" event="NodeHasSufficientMemory" Apr 17 15:17:16.163815 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.163773 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-29.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 15:17:16.163815 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.163782 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-29.ec2.internal" event="NodeHasSufficientPID" Apr 17 15:17:16.185940 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:16.185920 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-29.ec2.internal\" not found" Apr 17 15:17:16.191569 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:16.191555 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-29.ec2.internal\" not found" node="ip-10-0-131-29.ec2.internal" Apr 17 15:17:16.195773 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:16.195759 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-29.ec2.internal\" not found" node="ip-10-0-131-29.ec2.internal" Apr 17 15:17:16.210155 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.210131 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/15b0834749502abab65480e68ca33cd4-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-131-29.ec2.internal\" (UID: \"15b0834749502abab65480e68ca33cd4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-29.ec2.internal" Apr 17 15:17:16.210245 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.210164 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/15b0834749502abab65480e68ca33cd4-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-29.ec2.internal\" (UID: \"15b0834749502abab65480e68ca33cd4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-29.ec2.internal" Apr 17 15:17:16.210245 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.210188 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1e1f1a3ec57b3b66a90c747576fdf8e1-config\") pod \"kube-apiserver-proxy-ip-10-0-131-29.ec2.internal\" (UID: \"1e1f1a3ec57b3b66a90c747576fdf8e1\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-29.ec2.internal" Apr 17 15:17:16.286329 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:16.286307 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-29.ec2.internal\" not found" Apr 17 15:17:16.310845 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.310781 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/15b0834749502abab65480e68ca33cd4-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-29.ec2.internal\" (UID: \"15b0834749502abab65480e68ca33cd4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-29.ec2.internal" Apr 17 15:17:16.310845 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.310822 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/15b0834749502abab65480e68ca33cd4-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-29.ec2.internal\" (UID: \"15b0834749502abab65480e68ca33cd4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-29.ec2.internal" Apr 17 15:17:16.310964 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.310847 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1e1f1a3ec57b3b66a90c747576fdf8e1-config\") pod \"kube-apiserver-proxy-ip-10-0-131-29.ec2.internal\" (UID: \"1e1f1a3ec57b3b66a90c747576fdf8e1\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-29.ec2.internal" Apr 17 15:17:16.310964 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.310880 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/15b0834749502abab65480e68ca33cd4-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-29.ec2.internal\" (UID: \"15b0834749502abab65480e68ca33cd4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-29.ec2.internal" Apr 17 15:17:16.310964 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.310892 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1e1f1a3ec57b3b66a90c747576fdf8e1-config\") pod \"kube-apiserver-proxy-ip-10-0-131-29.ec2.internal\" (UID: \"1e1f1a3ec57b3b66a90c747576fdf8e1\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-29.ec2.internal" Apr 17 15:17:16.310964 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.310888 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/15b0834749502abab65480e68ca33cd4-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-29.ec2.internal\" (UID: \"15b0834749502abab65480e68ca33cd4\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-29.ec2.internal"
Apr 17 15:17:16.387072 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:16.387052 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-29.ec2.internal\" not found"
Apr 17 15:17:16.487683 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:16.487658 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-29.ec2.internal\" not found"
Apr 17 15:17:16.493815 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.493802 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-29.ec2.internal"
Apr 17 15:17:16.498106 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.498090 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-29.ec2.internal"
Apr 17 15:17:16.588174 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:16.588130 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-29.ec2.internal\" not found"
Apr 17 15:17:16.688715 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:16.688694 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-29.ec2.internal\" not found"
Apr 17 15:17:16.733782 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.733762 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 15:17:16.789619 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:16.789600 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-29.ec2.internal\" not found"
Apr 17 15:17:16.815150 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.815127 2577 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 17 15:17:16.815566 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.815221 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 15:17:16.815566 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.815249 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 15:17:16.815566 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.815269 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 15:17:16.827109 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.827091 2577 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 15:17:16.885807 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.885765 2577 apiserver.go:52] "Watching apiserver"
Apr 17 15:17:16.895917 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.895897 2577 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 17 15:17:16.896920 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.896898 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-j8tkm","openshift-network-diagnostics/network-check-target-xz6xx","kube-system/konnectivity-agent-ljhqf","openshift-image-registry/node-ca-cx2v6","openshift-multus/network-metrics-daemon-j7zl6","openshift-network-operator/iptables-alerter-xc7q5","openshift-ovn-kubernetes/ovnkube-node-6m42k","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4x45m","openshift-cluster-node-tuning-operator/tuned-9v7gz","openshift-dns/node-resolver-8hx9p","openshift-multus/multus-additional-cni-plugins-7jdw8"]
Apr 17 15:17:16.901542 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.901527 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-j8tkm"
Apr 17 15:17:16.902563 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.902528 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 15:12:15 +0000 UTC" deadline="2028-01-11 05:43:01.956333294 +0000 UTC"
Apr 17 15:17:16.902632 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.902563 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15206h25m45.053773039s"
Apr 17 15:17:16.903471 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.903455 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 17 15:17:16.903617 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.903599 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xz6xx"
Apr 17 15:17:16.903712 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:16.903660 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xz6xx" podUID="2f57efa0-9b15-4e70-9d38-74a517201d53"
Apr 17 15:17:16.903778 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.903712 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 17 15:17:16.903882 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.903871 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 17 15:17:16.903924 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.903910 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 17 15:17:16.903962 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.903930 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-q5mqj\""
Apr 17 15:17:16.904856 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.904840 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 17 15:17:16.905465 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.905441 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-29.ec2.internal"
Apr 17 15:17:16.905759 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.905734 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-ljhqf"
Apr 17 15:17:16.907499 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.907484 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 17 15:17:16.907569 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.907499 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 17 15:17:16.907569 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.907528 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-7xl9g\""
Apr 17 15:17:16.907832 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.907820 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-cx2v6"
Apr 17 15:17:16.907940 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.907924 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j7zl6"
Apr 17 15:17:16.908020 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:16.907993 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j7zl6" podUID="4445020e-d73c-4a2d-9f40-1c3fc286490e"
Apr 17 15:17:16.910174 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.910155 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 17 15:17:16.910628 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.910270 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 17 15:17:16.910628 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.910285 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 17 15:17:16.910628 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.910271 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-f72dc\""
Apr 17 15:17:16.912597 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.912580 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-xc7q5"
Apr 17 15:17:16.913157 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.913134 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 15:17:16.913252 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.913203 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-29.ec2.internal"
Apr 17 15:17:16.914347 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.914330 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/879a1087-ad81-4931-a7fc-1d30c4f2539d-serviceca\") pod \"node-ca-cx2v6\" (UID: \"879a1087-ad81-4931-a7fc-1d30c4f2539d\") " pod="openshift-image-registry/node-ca-cx2v6"
Apr 17 15:17:16.914443 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.914353 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/859c9d28-c95a-461d-841e-f476f3fb6fb7-cni-binary-copy\") pod \"multus-j8tkm\" (UID: \"859c9d28-c95a-461d-841e-f476f3fb6fb7\") " pod="openshift-multus/multus-j8tkm"
Apr 17 15:17:16.914443 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.914369 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/859c9d28-c95a-461d-841e-f476f3fb6fb7-hostroot\") pod \"multus-j8tkm\" (UID: \"859c9d28-c95a-461d-841e-f476f3fb6fb7\") " pod="openshift-multus/multus-j8tkm"
Apr 17 15:17:16.914443 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.914382 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/859c9d28-c95a-461d-841e-f476f3fb6fb7-multus-daemon-config\") pod \"multus-j8tkm\" (UID: \"859c9d28-c95a-461d-841e-f476f3fb6fb7\") " pod="openshift-multus/multus-j8tkm"
Apr 17 15:17:16.914443 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.914396 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/859c9d28-c95a-461d-841e-f476f3fb6fb7-host-run-multus-certs\") pod \"multus-j8tkm\" (UID: \"859c9d28-c95a-461d-841e-f476f3fb6fb7\") " pod="openshift-multus/multus-j8tkm"
Apr 17 15:17:16.914651 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.914453 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/859c9d28-c95a-461d-841e-f476f3fb6fb7-host-run-netns\") pod \"multus-j8tkm\" (UID: \"859c9d28-c95a-461d-841e-f476f3fb6fb7\") " pod="openshift-multus/multus-j8tkm"
Apr 17 15:17:16.914651 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.914486 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/859c9d28-c95a-461d-841e-f476f3fb6fb7-system-cni-dir\") pod \"multus-j8tkm\" (UID: \"859c9d28-c95a-461d-841e-f476f3fb6fb7\") " pod="openshift-multus/multus-j8tkm"
Apr 17 15:17:16.914651 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.914536 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/859c9d28-c95a-461d-841e-f476f3fb6fb7-multus-socket-dir-parent\") pod \"multus-j8tkm\" (UID: \"859c9d28-c95a-461d-841e-f476f3fb6fb7\") " pod="openshift-multus/multus-j8tkm"
Apr 17 15:17:16.914651 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.914558 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/859c9d28-c95a-461d-841e-f476f3fb6fb7-host-var-lib-cni-bin\") pod \"multus-j8tkm\" (UID: \"859c9d28-c95a-461d-841e-f476f3fb6fb7\") " pod="openshift-multus/multus-j8tkm"
Apr 17 15:17:16.914651 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.914576 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/131fc208-4c4f-4581-b543-a9f317f71657-konnectivity-ca\") pod \"konnectivity-agent-ljhqf\" (UID: \"131fc208-4c4f-4581-b543-a9f317f71657\") " pod="kube-system/konnectivity-agent-ljhqf"
Apr 17 15:17:16.914651 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.914595 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84pjt\" (UniqueName: \"kubernetes.io/projected/4445020e-d73c-4a2d-9f40-1c3fc286490e-kube-api-access-84pjt\") pod \"network-metrics-daemon-j7zl6\" (UID: \"4445020e-d73c-4a2d-9f40-1c3fc286490e\") " pod="openshift-multus/network-metrics-daemon-j7zl6"
Apr 17 15:17:16.914651 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.914620 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a55a9866-dc90-424d-aefb-be85c6ce02cb-iptables-alerter-script\") pod \"iptables-alerter-xc7q5\" (UID: \"a55a9866-dc90-424d-aefb-be85c6ce02cb\") " pod="openshift-network-operator/iptables-alerter-xc7q5"
Apr 17 15:17:16.914651 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.914654 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/859c9d28-c95a-461d-841e-f476f3fb6fb7-os-release\") pod \"multus-j8tkm\" (UID: \"859c9d28-c95a-461d-841e-f476f3fb6fb7\") " pod="openshift-multus/multus-j8tkm"
Apr 17 15:17:16.915001 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.914696 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/859c9d28-c95a-461d-841e-f476f3fb6fb7-host-var-lib-kubelet\") pod \"multus-j8tkm\" (UID: \"859c9d28-c95a-461d-841e-f476f3fb6fb7\") " pod="openshift-multus/multus-j8tkm"
Apr 17 15:17:16.915001 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.914739 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/859c9d28-c95a-461d-841e-f476f3fb6fb7-etc-kubernetes\") pod \"multus-j8tkm\" (UID: \"859c9d28-c95a-461d-841e-f476f3fb6fb7\") " pod="openshift-multus/multus-j8tkm"
Apr 17 15:17:16.915001 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.914773 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/859c9d28-c95a-461d-841e-f476f3fb6fb7-cnibin\") pod \"multus-j8tkm\" (UID: \"859c9d28-c95a-461d-841e-f476f3fb6fb7\") " pod="openshift-multus/multus-j8tkm"
Apr 17 15:17:16.915001 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.914784 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 17 15:17:16.915001 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.914794 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv897\" (UniqueName: \"kubernetes.io/projected/859c9d28-c95a-461d-841e-f476f3fb6fb7-kube-api-access-rv897\") pod \"multus-j8tkm\" (UID: \"859c9d28-c95a-461d-841e-f476f3fb6fb7\") " pod="openshift-multus/multus-j8tkm"
Apr 17 15:17:16.915001 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.914816 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4445020e-d73c-4a2d-9f40-1c3fc286490e-metrics-certs\") pod \"network-metrics-daemon-j7zl6\" (UID: \"4445020e-d73c-4a2d-9f40-1c3fc286490e\") " pod="openshift-multus/network-metrics-daemon-j7zl6"
Apr 17 15:17:16.915001 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.914864 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwctl\" (UniqueName: \"kubernetes.io/projected/a55a9866-dc90-424d-aefb-be85c6ce02cb-kube-api-access-kwctl\") pod \"iptables-alerter-xc7q5\" (UID: \"a55a9866-dc90-424d-aefb-be85c6ce02cb\") " pod="openshift-network-operator/iptables-alerter-xc7q5"
Apr 17 15:17:16.915001 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.914892 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 17 15:17:16.915001 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.914897 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 17 15:17:16.915001 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.914894 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/879a1087-ad81-4931-a7fc-1d30c4f2539d-host\") pod \"node-ca-cx2v6\" (UID: \"879a1087-ad81-4931-a7fc-1d30c4f2539d\") " pod="openshift-image-registry/node-ca-cx2v6"
Apr 17 15:17:16.915001 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.914946 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/859c9d28-c95a-461d-841e-f476f3fb6fb7-multus-cni-dir\") pod \"multus-j8tkm\" (UID: \"859c9d28-c95a-461d-841e-f476f3fb6fb7\") " pod="openshift-multus/multus-j8tkm"
Apr 17 15:17:16.915001 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.914981 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/859c9d28-c95a-461d-841e-f476f3fb6fb7-host-run-k8s-cni-cncf-io\") pod \"multus-j8tkm\" (UID: \"859c9d28-c95a-461d-841e-f476f3fb6fb7\") " pod="openshift-multus/multus-j8tkm"
Apr 17 15:17:16.915575 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.915051 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/859c9d28-c95a-461d-841e-f476f3fb6fb7-host-var-lib-cni-multus\") pod \"multus-j8tkm\" (UID: \"859c9d28-c95a-461d-841e-f476f3fb6fb7\") " pod="openshift-multus/multus-j8tkm"
Apr 17 15:17:16.915575 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.915093 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/859c9d28-c95a-461d-841e-f476f3fb6fb7-multus-conf-dir\") pod \"multus-j8tkm\" (UID: \"859c9d28-c95a-461d-841e-f476f3fb6fb7\") " pod="openshift-multus/multus-j8tkm"
Apr 17 15:17:16.915575 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.915118 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhg8b\" (UniqueName: \"kubernetes.io/projected/2f57efa0-9b15-4e70-9d38-74a517201d53-kube-api-access-zhg8b\") pod \"network-check-target-xz6xx\" (UID: \"2f57efa0-9b15-4e70-9d38-74a517201d53\") " pod="openshift-network-diagnostics/network-check-target-xz6xx"
Apr 17 15:17:16.915575 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.915140 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/131fc208-4c4f-4581-b543-a9f317f71657-agent-certs\") pod \"konnectivity-agent-ljhqf\" (UID: \"131fc208-4c4f-4581-b543-a9f317f71657\") " pod="kube-system/konnectivity-agent-ljhqf"
Apr 17 15:17:16.915575 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.915170 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjzl5\" (UniqueName: \"kubernetes.io/projected/879a1087-ad81-4931-a7fc-1d30c4f2539d-kube-api-access-vjzl5\") pod \"node-ca-cx2v6\" (UID: \"879a1087-ad81-4931-a7fc-1d30c4f2539d\") " pod="openshift-image-registry/node-ca-cx2v6"
Apr 17 15:17:16.915575 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.915190 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-lstsb\""
Apr 17 15:17:16.915575 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.915192 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a55a9866-dc90-424d-aefb-be85c6ce02cb-host-slash\") pod \"iptables-alerter-xc7q5\" (UID: \"a55a9866-dc90-424d-aefb-be85c6ce02cb\") " pod="openshift-network-operator/iptables-alerter-xc7q5"
Apr 17 15:17:16.915899 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.915740 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6m42k"
Apr 17 15:17:16.918097 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.917835 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 17 15:17:16.918097 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.918014 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 17 15:17:16.918260 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.918083 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 17 15:17:16.918260 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.918188 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 17 15:17:16.918375 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.918261 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-pr6fw\""
Apr 17 15:17:16.918375 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.918279 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 17 15:17:16.918375 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.918282 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 17 15:17:16.918696 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.918680 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 15:17:16.918988 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.918976 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4x45m"
Apr 17 15:17:16.920734 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.920718 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 17 15:17:16.920811 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.920725 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 17 15:17:16.920904 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.920890 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 17 15:17:16.920952 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.920943 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-xm52c\""
Apr 17 15:17:16.921114 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.921102 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-9v7gz"
Apr 17 15:17:16.922328 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.922315 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 15:17:16.922961 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.922943 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 17 15:17:16.923067 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.922973 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 17 15:17:16.923139 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.923101 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-mwlrr\""
Apr 17 15:17:16.923567 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.923554 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-8hx9p"
Apr 17 15:17:16.925356 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.925337 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 17 15:17:16.925455 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.925395 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 17 15:17:16.925455 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.925412 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-znk6p\""
Apr 17 15:17:16.926153 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.926137 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-131-29.ec2.internal"]
Apr 17 15:17:16.926223 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.926160 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-29.ec2.internal"]
Apr 17 15:17:16.926266 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.926237 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7jdw8"
Apr 17 15:17:16.928260 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.928244 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 17 15:17:16.928511 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.928496 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 17 15:17:16.928562 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.928537 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-kpt76\""
Apr 17 15:17:16.939866 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.939849 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-964bd"
Apr 17 15:17:16.947133 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:16.947119 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-964bd"
Apr 17 15:17:17.009000 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.008980 2577 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 17 15:17:17.015341 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.015319 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/859c9d28-c95a-461d-841e-f476f3fb6fb7-cnibin\") pod \"multus-j8tkm\" (UID: \"859c9d28-c95a-461d-841e-f476f3fb6fb7\") " pod="openshift-multus/multus-j8tkm"
Apr 17 15:17:17.015421 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.015357 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4445020e-d73c-4a2d-9f40-1c3fc286490e-metrics-certs\") pod \"network-metrics-daemon-j7zl6\" (UID: \"4445020e-d73c-4a2d-9f40-1c3fc286490e\") " pod="openshift-multus/network-metrics-daemon-j7zl6"
Apr 17 15:17:17.015421 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.015385 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kwctl\" (UniqueName: \"kubernetes.io/projected/a55a9866-dc90-424d-aefb-be85c6ce02cb-kube-api-access-kwctl\") pod \"iptables-alerter-xc7q5\" (UID: \"a55a9866-dc90-424d-aefb-be85c6ce02cb\") " pod="openshift-network-operator/iptables-alerter-xc7q5"
Apr 17 15:17:17.015421 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.015414 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/859c9d28-c95a-461d-841e-f476f3fb6fb7-cnibin\") pod \"multus-j8tkm\" (UID: \"859c9d28-c95a-461d-841e-f476f3fb6fb7\") " pod="openshift-multus/multus-j8tkm"
Apr 17 15:17:17.015532 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.015415 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/14d69252-8d9f-46ec-8e05-0b0a8f1b3b07-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7jdw8\" (UID: \"14d69252-8d9f-46ec-8e05-0b0a8f1b3b07\") " pod="openshift-multus/multus-additional-cni-plugins-7jdw8"
Apr 17 15:17:17.015532 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.015480 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-var-lib-openvswitch\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k"
Apr 17 15:17:17.015532 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.015510 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-node-log\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k"
Apr 17 15:17:17.015750 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.015543 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-log-socket\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k"
Apr 17 15:17:17.015750 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:17.015592 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 15:17:17.015750 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.015595 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/32385a04-2774-4f48-af2e-36e3bb20d368-registration-dir\") pod \"aws-ebs-csi-driver-node-4x45m\" (UID: \"32385a04-2774-4f48-af2e-36e3bb20d368\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4x45m"
Apr 17 15:17:17.015750 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.015645 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/859c9d28-c95a-461d-841e-f476f3fb6fb7-multus-cni-dir\") pod \"multus-j8tkm\" (UID: \"859c9d28-c95a-461d-841e-f476f3fb6fb7\") " pod="openshift-multus/multus-j8tkm"
Apr 17 15:17:17.015750 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.015679 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/859c9d28-c95a-461d-841e-f476f3fb6fb7-host-var-lib-cni-multus\") pod \"multus-j8tkm\" (UID: \"859c9d28-c95a-461d-841e-f476f3fb6fb7\") " pod="openshift-multus/multus-j8tkm"
Apr 17 15:17:17.015750 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.015714 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/859c9d28-c95a-461d-841e-f476f3fb6fb7-multus-conf-dir\") pod \"multus-j8tkm\" (UID: \"859c9d28-c95a-461d-841e-f476f3fb6fb7\") " pod="openshift-multus/multus-j8tkm"
Apr 17 15:17:17.015750 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.015743 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zhg8b\" (UniqueName: \"kubernetes.io/projected/2f57efa0-9b15-4e70-9d38-74a517201d53-kube-api-access-zhg8b\") pod \"network-check-target-xz6xx\" (UID: \"2f57efa0-9b15-4e70-9d38-74a517201d53\") " pod="openshift-network-diagnostics/network-check-target-xz6xx"
Apr 17 15:17:17.016087 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:17.015772 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4445020e-d73c-4a2d-9f40-1c3fc286490e-metrics-certs podName:4445020e-d73c-4a2d-9f40-1c3fc286490e nodeName:}" failed. No retries permitted until 2026-04-17 15:17:17.515740657 +0000 UTC m=+2.057405505 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4445020e-d73c-4a2d-9f40-1c3fc286490e-metrics-certs") pod "network-metrics-daemon-j7zl6" (UID: "4445020e-d73c-4a2d-9f40-1c3fc286490e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 15:17:17.016087 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.015790 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/859c9d28-c95a-461d-841e-f476f3fb6fb7-host-var-lib-cni-multus\") pod \"multus-j8tkm\" (UID: \"859c9d28-c95a-461d-841e-f476f3fb6fb7\") " pod="openshift-multus/multus-j8tkm"
Apr 17 15:17:17.016087 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.015803 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-run-openvswitch\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k"
Apr 17 15:17:17.016087 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.015830 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-run-ovn\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k"
Apr 17 15:17:17.016087 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.015853 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/859c9d28-c95a-461d-841e-f476f3fb6fb7-multus-cni-dir\") pod \"multus-j8tkm\" (UID: \"859c9d28-c95a-461d-841e-f476f3fb6fb7\") " pod="openshift-multus/multus-j8tkm"
Apr 17 15:17:17.016087 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.015855 2577
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/14c11c01-24f2-4908-a3cb-5c90f9ec8d35-hosts-file\") pod \"node-resolver-8hx9p\" (UID: \"14c11c01-24f2-4908-a3cb-5c90f9ec8d35\") " pod="openshift-dns/node-resolver-8hx9p" Apr 17 15:17:17.016087 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.015895 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/32385a04-2774-4f48-af2e-36e3bb20d368-device-dir\") pod \"aws-ebs-csi-driver-node-4x45m\" (UID: \"32385a04-2774-4f48-af2e-36e3bb20d368\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4x45m" Apr 17 15:17:17.016087 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.015924 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a55a9866-dc90-424d-aefb-be85c6ce02cb-host-slash\") pod \"iptables-alerter-xc7q5\" (UID: \"a55a9866-dc90-424d-aefb-be85c6ce02cb\") " pod="openshift-network-operator/iptables-alerter-xc7q5" Apr 17 15:17:17.016087 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.015961 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7aa4aa82-d5f1-423a-b9ac-13669e2b1804-lib-modules\") pod \"tuned-9v7gz\" (UID: \"7aa4aa82-d5f1-423a-b9ac-13669e2b1804\") " pod="openshift-cluster-node-tuning-operator/tuned-9v7gz" Apr 17 15:17:17.016087 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.015984 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/859c9d28-c95a-461d-841e-f476f3fb6fb7-multus-conf-dir\") pod \"multus-j8tkm\" (UID: \"859c9d28-c95a-461d-841e-f476f3fb6fb7\") " pod="openshift-multus/multus-j8tkm" Apr 17 15:17:17.016087 ip-10-0-131-29 
kubenswrapper[2577]: I0417 15:17:17.016003 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/32385a04-2774-4f48-af2e-36e3bb20d368-etc-selinux\") pod \"aws-ebs-csi-driver-node-4x45m\" (UID: \"32385a04-2774-4f48-af2e-36e3bb20d368\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4x45m" Apr 17 15:17:17.016087 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.016072 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a55a9866-dc90-424d-aefb-be85c6ce02cb-host-slash\") pod \"iptables-alerter-xc7q5\" (UID: \"a55a9866-dc90-424d-aefb-be85c6ce02cb\") " pod="openshift-network-operator/iptables-alerter-xc7q5" Apr 17 15:17:17.016644 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.016106 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7aa4aa82-d5f1-423a-b9ac-13669e2b1804-etc-kubernetes\") pod \"tuned-9v7gz\" (UID: \"7aa4aa82-d5f1-423a-b9ac-13669e2b1804\") " pod="openshift-cluster-node-tuning-operator/tuned-9v7gz" Apr 17 15:17:17.016644 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.016132 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/859c9d28-c95a-461d-841e-f476f3fb6fb7-multus-daemon-config\") pod \"multus-j8tkm\" (UID: \"859c9d28-c95a-461d-841e-f476f3fb6fb7\") " pod="openshift-multus/multus-j8tkm" Apr 17 15:17:17.016644 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.016161 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/14d69252-8d9f-46ec-8e05-0b0a8f1b3b07-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7jdw8\" (UID: 
\"14d69252-8d9f-46ec-8e05-0b0a8f1b3b07\") " pod="openshift-multus/multus-additional-cni-plugins-7jdw8" Apr 17 15:17:17.016644 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.016210 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-run-systemd\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k" Apr 17 15:17:17.016644 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.016235 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k" Apr 17 15:17:17.016644 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.016255 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/32385a04-2774-4f48-af2e-36e3bb20d368-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4x45m\" (UID: \"32385a04-2774-4f48-af2e-36e3bb20d368\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4x45m" Apr 17 15:17:17.016644 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.016270 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frq8b\" (UniqueName: \"kubernetes.io/projected/32385a04-2774-4f48-af2e-36e3bb20d368-kube-api-access-frq8b\") pod \"aws-ebs-csi-driver-node-4x45m\" (UID: \"32385a04-2774-4f48-af2e-36e3bb20d368\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4x45m" Apr 17 15:17:17.016644 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.016286 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7aa4aa82-d5f1-423a-b9ac-13669e2b1804-tmp\") pod \"tuned-9v7gz\" (UID: \"7aa4aa82-d5f1-423a-b9ac-13669e2b1804\") " pod="openshift-cluster-node-tuning-operator/tuned-9v7gz" Apr 17 15:17:17.016644 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.016312 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/859c9d28-c95a-461d-841e-f476f3fb6fb7-multus-socket-dir-parent\") pod \"multus-j8tkm\" (UID: \"859c9d28-c95a-461d-841e-f476f3fb6fb7\") " pod="openshift-multus/multus-j8tkm" Apr 17 15:17:17.016644 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.016328 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/131fc208-4c4f-4581-b543-a9f317f71657-konnectivity-ca\") pod \"konnectivity-agent-ljhqf\" (UID: \"131fc208-4c4f-4581-b543-a9f317f71657\") " pod="kube-system/konnectivity-agent-ljhqf" Apr 17 15:17:17.016644 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.016343 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-host-cni-netd\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k" Apr 17 15:17:17.016644 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.016364 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7aa4aa82-d5f1-423a-b9ac-13669e2b1804-etc-sysconfig\") pod \"tuned-9v7gz\" (UID: \"7aa4aa82-d5f1-423a-b9ac-13669e2b1804\") " pod="openshift-cluster-node-tuning-operator/tuned-9v7gz" Apr 17 15:17:17.016644 ip-10-0-131-29 
kubenswrapper[2577]: I0417 15:17:17.016394 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/859c9d28-c95a-461d-841e-f476f3fb6fb7-multus-socket-dir-parent\") pod \"multus-j8tkm\" (UID: \"859c9d28-c95a-461d-841e-f476f3fb6fb7\") " pod="openshift-multus/multus-j8tkm" Apr 17 15:17:17.016644 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.016402 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7aa4aa82-d5f1-423a-b9ac-13669e2b1804-var-lib-kubelet\") pod \"tuned-9v7gz\" (UID: \"7aa4aa82-d5f1-423a-b9ac-13669e2b1804\") " pod="openshift-cluster-node-tuning-operator/tuned-9v7gz" Apr 17 15:17:17.016644 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.016443 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7aa4aa82-d5f1-423a-b9ac-13669e2b1804-host\") pod \"tuned-9v7gz\" (UID: \"7aa4aa82-d5f1-423a-b9ac-13669e2b1804\") " pod="openshift-cluster-node-tuning-operator/tuned-9v7gz" Apr 17 15:17:17.016644 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.016467 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/859c9d28-c95a-461d-841e-f476f3fb6fb7-os-release\") pod \"multus-j8tkm\" (UID: \"859c9d28-c95a-461d-841e-f476f3fb6fb7\") " pod="openshift-multus/multus-j8tkm" Apr 17 15:17:17.016644 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.016487 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-host-kubelet\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k" Apr 17 15:17:17.017247 
ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.016512 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7aa4aa82-d5f1-423a-b9ac-13669e2b1804-etc-modprobe-d\") pod \"tuned-9v7gz\" (UID: \"7aa4aa82-d5f1-423a-b9ac-13669e2b1804\") " pod="openshift-cluster-node-tuning-operator/tuned-9v7gz" Apr 17 15:17:17.017247 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.016536 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rv897\" (UniqueName: \"kubernetes.io/projected/859c9d28-c95a-461d-841e-f476f3fb6fb7-kube-api-access-rv897\") pod \"multus-j8tkm\" (UID: \"859c9d28-c95a-461d-841e-f476f3fb6fb7\") " pod="openshift-multus/multus-j8tkm" Apr 17 15:17:17.017247 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.016542 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/859c9d28-c95a-461d-841e-f476f3fb6fb7-os-release\") pod \"multus-j8tkm\" (UID: \"859c9d28-c95a-461d-841e-f476f3fb6fb7\") " pod="openshift-multus/multus-j8tkm" Apr 17 15:17:17.017247 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.016565 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/14d69252-8d9f-46ec-8e05-0b0a8f1b3b07-system-cni-dir\") pod \"multus-additional-cni-plugins-7jdw8\" (UID: \"14d69252-8d9f-46ec-8e05-0b0a8f1b3b07\") " pod="openshift-multus/multus-additional-cni-plugins-7jdw8" Apr 17 15:17:17.017247 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.016594 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-ovnkube-script-lib\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-6m42k" Apr 17 15:17:17.017247 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.016617 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7aa4aa82-d5f1-423a-b9ac-13669e2b1804-etc-systemd\") pod \"tuned-9v7gz\" (UID: \"7aa4aa82-d5f1-423a-b9ac-13669e2b1804\") " pod="openshift-cluster-node-tuning-operator/tuned-9v7gz" Apr 17 15:17:17.017247 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.016639 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/879a1087-ad81-4931-a7fc-1d30c4f2539d-host\") pod \"node-ca-cx2v6\" (UID: \"879a1087-ad81-4931-a7fc-1d30c4f2539d\") " pod="openshift-image-registry/node-ca-cx2v6" Apr 17 15:17:17.017247 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.016661 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/859c9d28-c95a-461d-841e-f476f3fb6fb7-multus-daemon-config\") pod \"multus-j8tkm\" (UID: \"859c9d28-c95a-461d-841e-f476f3fb6fb7\") " pod="openshift-multus/multus-j8tkm" Apr 17 15:17:17.017247 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.016658 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/859c9d28-c95a-461d-841e-f476f3fb6fb7-host-run-k8s-cni-cncf-io\") pod \"multus-j8tkm\" (UID: \"859c9d28-c95a-461d-841e-f476f3fb6fb7\") " pod="openshift-multus/multus-j8tkm" Apr 17 15:17:17.017247 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.016697 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-host-run-netns\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-6m42k" Apr 17 15:17:17.017247 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.016718 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-host-run-ovn-kubernetes\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k" Apr 17 15:17:17.017247 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.016726 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/859c9d28-c95a-461d-841e-f476f3fb6fb7-host-run-k8s-cni-cncf-io\") pod \"multus-j8tkm\" (UID: \"859c9d28-c95a-461d-841e-f476f3fb6fb7\") " pod="openshift-multus/multus-j8tkm" Apr 17 15:17:17.017247 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.016733 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/131fc208-4c4f-4581-b543-a9f317f71657-agent-certs\") pod \"konnectivity-agent-ljhqf\" (UID: \"131fc208-4c4f-4581-b543-a9f317f71657\") " pod="kube-system/konnectivity-agent-ljhqf" Apr 17 15:17:17.017247 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.016731 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/879a1087-ad81-4931-a7fc-1d30c4f2539d-host\") pod \"node-ca-cx2v6\" (UID: \"879a1087-ad81-4931-a7fc-1d30c4f2539d\") " pod="openshift-image-registry/node-ca-cx2v6" Apr 17 15:17:17.017247 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.016766 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vjzl5\" (UniqueName: \"kubernetes.io/projected/879a1087-ad81-4931-a7fc-1d30c4f2539d-kube-api-access-vjzl5\") pod \"node-ca-cx2v6\" (UID: \"879a1087-ad81-4931-a7fc-1d30c4f2539d\") " 
pod="openshift-image-registry/node-ca-cx2v6" Apr 17 15:17:17.017247 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.016799 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/131fc208-4c4f-4581-b543-a9f317f71657-konnectivity-ca\") pod \"konnectivity-agent-ljhqf\" (UID: \"131fc208-4c4f-4581-b543-a9f317f71657\") " pod="kube-system/konnectivity-agent-ljhqf" Apr 17 15:17:17.017247 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.016866 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-ovnkube-config\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k" Apr 17 15:17:17.017797 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.016942 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-env-overrides\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k" Apr 17 15:17:17.017797 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.016981 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/14c11c01-24f2-4908-a3cb-5c90f9ec8d35-tmp-dir\") pod \"node-resolver-8hx9p\" (UID: \"14c11c01-24f2-4908-a3cb-5c90f9ec8d35\") " pod="openshift-dns/node-resolver-8hx9p" Apr 17 15:17:17.017797 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.016999 2577 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 15:17:17.017797 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.017012 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/32385a04-2774-4f48-af2e-36e3bb20d368-sys-fs\") pod \"aws-ebs-csi-driver-node-4x45m\" (UID: \"32385a04-2774-4f48-af2e-36e3bb20d368\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4x45m" Apr 17 15:17:17.017797 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.017026 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7aa4aa82-d5f1-423a-b9ac-13669e2b1804-etc-sysctl-conf\") pod \"tuned-9v7gz\" (UID: \"7aa4aa82-d5f1-423a-b9ac-13669e2b1804\") " pod="openshift-cluster-node-tuning-operator/tuned-9v7gz" Apr 17 15:17:17.017797 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.017067 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/879a1087-ad81-4931-a7fc-1d30c4f2539d-serviceca\") pod \"node-ca-cx2v6\" (UID: \"879a1087-ad81-4931-a7fc-1d30c4f2539d\") " pod="openshift-image-registry/node-ca-cx2v6" Apr 17 15:17:17.017797 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.017083 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/14d69252-8d9f-46ec-8e05-0b0a8f1b3b07-cnibin\") pod \"multus-additional-cni-plugins-7jdw8\" (UID: \"14d69252-8d9f-46ec-8e05-0b0a8f1b3b07\") " pod="openshift-multus/multus-additional-cni-plugins-7jdw8" Apr 17 15:17:17.017797 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.017099 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" 
(UniqueName: \"kubernetes.io/host-path/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-systemd-units\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k" Apr 17 15:17:17.017797 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.017130 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-ovn-node-metrics-cert\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k" Apr 17 15:17:17.017797 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.017149 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7aa4aa82-d5f1-423a-b9ac-13669e2b1804-etc-sysctl-d\") pod \"tuned-9v7gz\" (UID: \"7aa4aa82-d5f1-423a-b9ac-13669e2b1804\") " pod="openshift-cluster-node-tuning-operator/tuned-9v7gz" Apr 17 15:17:17.017797 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.017187 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7aa4aa82-d5f1-423a-b9ac-13669e2b1804-run\") pod \"tuned-9v7gz\" (UID: \"7aa4aa82-d5f1-423a-b9ac-13669e2b1804\") " pod="openshift-cluster-node-tuning-operator/tuned-9v7gz" Apr 17 15:17:17.017797 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.017219 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7aa4aa82-d5f1-423a-b9ac-13669e2b1804-sys\") pod \"tuned-9v7gz\" (UID: \"7aa4aa82-d5f1-423a-b9ac-13669e2b1804\") " pod="openshift-cluster-node-tuning-operator/tuned-9v7gz" Apr 17 15:17:17.017797 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.017241 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7aa4aa82-d5f1-423a-b9ac-13669e2b1804-etc-tuned\") pod \"tuned-9v7gz\" (UID: \"7aa4aa82-d5f1-423a-b9ac-13669e2b1804\") " pod="openshift-cluster-node-tuning-operator/tuned-9v7gz" Apr 17 15:17:17.017797 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.017297 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/859c9d28-c95a-461d-841e-f476f3fb6fb7-cni-binary-copy\") pod \"multus-j8tkm\" (UID: \"859c9d28-c95a-461d-841e-f476f3fb6fb7\") " pod="openshift-multus/multus-j8tkm" Apr 17 15:17:17.017797 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.017321 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/859c9d28-c95a-461d-841e-f476f3fb6fb7-hostroot\") pod \"multus-j8tkm\" (UID: \"859c9d28-c95a-461d-841e-f476f3fb6fb7\") " pod="openshift-multus/multus-j8tkm" Apr 17 15:17:17.017797 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.017337 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/859c9d28-c95a-461d-841e-f476f3fb6fb7-host-run-multus-certs\") pod \"multus-j8tkm\" (UID: \"859c9d28-c95a-461d-841e-f476f3fb6fb7\") " pod="openshift-multus/multus-j8tkm" Apr 17 15:17:17.017797 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.017353 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/14d69252-8d9f-46ec-8e05-0b0a8f1b3b07-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-7jdw8\" (UID: \"14d69252-8d9f-46ec-8e05-0b0a8f1b3b07\") " pod="openshift-multus/multus-additional-cni-plugins-7jdw8" Apr 17 15:17:17.018440 ip-10-0-131-29 kubenswrapper[2577]: I0417 
15:17:17.017371 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4dzt\" (UniqueName: \"kubernetes.io/projected/14d69252-8d9f-46ec-8e05-0b0a8f1b3b07-kube-api-access-v4dzt\") pod \"multus-additional-cni-plugins-7jdw8\" (UID: \"14d69252-8d9f-46ec-8e05-0b0a8f1b3b07\") " pod="openshift-multus/multus-additional-cni-plugins-7jdw8" Apr 17 15:17:17.018440 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.017395 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/859c9d28-c95a-461d-841e-f476f3fb6fb7-hostroot\") pod \"multus-j8tkm\" (UID: \"859c9d28-c95a-461d-841e-f476f3fb6fb7\") " pod="openshift-multus/multus-j8tkm" Apr 17 15:17:17.018440 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.017406 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/859c9d28-c95a-461d-841e-f476f3fb6fb7-host-run-multus-certs\") pod \"multus-j8tkm\" (UID: \"859c9d28-c95a-461d-841e-f476f3fb6fb7\") " pod="openshift-multus/multus-j8tkm" Apr 17 15:17:17.018440 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.017421 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-host-cni-bin\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k" Apr 17 15:17:17.018440 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.017442 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/879a1087-ad81-4931-a7fc-1d30c4f2539d-serviceca\") pod \"node-ca-cx2v6\" (UID: \"879a1087-ad81-4931-a7fc-1d30c4f2539d\") " pod="openshift-image-registry/node-ca-cx2v6" Apr 17 15:17:17.018440 ip-10-0-131-29 kubenswrapper[2577]: 
I0417 15:17:17.017477 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/859c9d28-c95a-461d-841e-f476f3fb6fb7-host-run-netns\") pod \"multus-j8tkm\" (UID: \"859c9d28-c95a-461d-841e-f476f3fb6fb7\") " pod="openshift-multus/multus-j8tkm"
Apr 17 15:17:17.018440 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.017508 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/14d69252-8d9f-46ec-8e05-0b0a8f1b3b07-os-release\") pod \"multus-additional-cni-plugins-7jdw8\" (UID: \"14d69252-8d9f-46ec-8e05-0b0a8f1b3b07\") " pod="openshift-multus/multus-additional-cni-plugins-7jdw8"
Apr 17 15:17:17.018440 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.017563 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/859c9d28-c95a-461d-841e-f476f3fb6fb7-host-run-netns\") pod \"multus-j8tkm\" (UID: \"859c9d28-c95a-461d-841e-f476f3fb6fb7\") " pod="openshift-multus/multus-j8tkm"
Apr 17 15:17:17.018440 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.017645 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/859c9d28-c95a-461d-841e-f476f3fb6fb7-cni-binary-copy\") pod \"multus-j8tkm\" (UID: \"859c9d28-c95a-461d-841e-f476f3fb6fb7\") " pod="openshift-multus/multus-j8tkm"
Apr 17 15:17:17.018440 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.017704 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-etc-openvswitch\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k"
Apr 17 15:17:17.018440 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.017730 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/32385a04-2774-4f48-af2e-36e3bb20d368-socket-dir\") pod \"aws-ebs-csi-driver-node-4x45m\" (UID: \"32385a04-2774-4f48-af2e-36e3bb20d368\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4x45m"
Apr 17 15:17:17.018440 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.017756 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/859c9d28-c95a-461d-841e-f476f3fb6fb7-system-cni-dir\") pod \"multus-j8tkm\" (UID: \"859c9d28-c95a-461d-841e-f476f3fb6fb7\") " pod="openshift-multus/multus-j8tkm"
Apr 17 15:17:17.018440 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.017777 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/859c9d28-c95a-461d-841e-f476f3fb6fb7-host-var-lib-cni-bin\") pod \"multus-j8tkm\" (UID: \"859c9d28-c95a-461d-841e-f476f3fb6fb7\") " pod="openshift-multus/multus-j8tkm"
Apr 17 15:17:17.018440 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.017797 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84pjt\" (UniqueName: \"kubernetes.io/projected/4445020e-d73c-4a2d-9f40-1c3fc286490e-kube-api-access-84pjt\") pod \"network-metrics-daemon-j7zl6\" (UID: \"4445020e-d73c-4a2d-9f40-1c3fc286490e\") " pod="openshift-multus/network-metrics-daemon-j7zl6"
Apr 17 15:17:17.018440 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.017831 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/859c9d28-c95a-461d-841e-f476f3fb6fb7-host-var-lib-cni-bin\") pod \"multus-j8tkm\" (UID: \"859c9d28-c95a-461d-841e-f476f3fb6fb7\") " pod="openshift-multus/multus-j8tkm"
Apr 17 15:17:17.018440 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.017818 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a55a9866-dc90-424d-aefb-be85c6ce02cb-iptables-alerter-script\") pod \"iptables-alerter-xc7q5\" (UID: \"a55a9866-dc90-424d-aefb-be85c6ce02cb\") " pod="openshift-network-operator/iptables-alerter-xc7q5"
Apr 17 15:17:17.018440 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.017866 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75ntc\" (UniqueName: \"kubernetes.io/projected/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-kube-api-access-75ntc\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k"
Apr 17 15:17:17.018883 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.017881 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stfbz\" (UniqueName: \"kubernetes.io/projected/7aa4aa82-d5f1-423a-b9ac-13669e2b1804-kube-api-access-stfbz\") pod \"tuned-9v7gz\" (UID: \"7aa4aa82-d5f1-423a-b9ac-13669e2b1804\") " pod="openshift-cluster-node-tuning-operator/tuned-9v7gz"
Apr 17 15:17:17.018883 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.017839 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/859c9d28-c95a-461d-841e-f476f3fb6fb7-system-cni-dir\") pod \"multus-j8tkm\" (UID: \"859c9d28-c95a-461d-841e-f476f3fb6fb7\") " pod="openshift-multus/multus-j8tkm"
Apr 17 15:17:17.018883 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.017901 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/859c9d28-c95a-461d-841e-f476f3fb6fb7-host-var-lib-kubelet\") pod \"multus-j8tkm\" (UID: \"859c9d28-c95a-461d-841e-f476f3fb6fb7\") " pod="openshift-multus/multus-j8tkm"
Apr 17 15:17:17.018883 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.017928 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/859c9d28-c95a-461d-841e-f476f3fb6fb7-host-var-lib-kubelet\") pod \"multus-j8tkm\" (UID: \"859c9d28-c95a-461d-841e-f476f3fb6fb7\") " pod="openshift-multus/multus-j8tkm"
Apr 17 15:17:17.018883 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.017954 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/859c9d28-c95a-461d-841e-f476f3fb6fb7-etc-kubernetes\") pod \"multus-j8tkm\" (UID: \"859c9d28-c95a-461d-841e-f476f3fb6fb7\") " pod="openshift-multus/multus-j8tkm"
Apr 17 15:17:17.018883 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.017984 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/14d69252-8d9f-46ec-8e05-0b0a8f1b3b07-cni-binary-copy\") pod \"multus-additional-cni-plugins-7jdw8\" (UID: \"14d69252-8d9f-46ec-8e05-0b0a8f1b3b07\") " pod="openshift-multus/multus-additional-cni-plugins-7jdw8"
Apr 17 15:17:17.018883 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.017999 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/859c9d28-c95a-461d-841e-f476f3fb6fb7-etc-kubernetes\") pod \"multus-j8tkm\" (UID: \"859c9d28-c95a-461d-841e-f476f3fb6fb7\") " pod="openshift-multus/multus-j8tkm"
Apr 17 15:17:17.018883 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.018057 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-host-slash\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k"
Apr 17 15:17:17.018883 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.018087 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppcpf\" (UniqueName: \"kubernetes.io/projected/14c11c01-24f2-4908-a3cb-5c90f9ec8d35-kube-api-access-ppcpf\") pod \"node-resolver-8hx9p\" (UID: \"14c11c01-24f2-4908-a3cb-5c90f9ec8d35\") " pod="openshift-dns/node-resolver-8hx9p"
Apr 17 15:17:17.018883 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.018356 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a55a9866-dc90-424d-aefb-be85c6ce02cb-iptables-alerter-script\") pod \"iptables-alerter-xc7q5\" (UID: \"a55a9866-dc90-424d-aefb-be85c6ce02cb\") " pod="openshift-network-operator/iptables-alerter-xc7q5"
Apr 17 15:17:17.019622 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.019608 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/131fc208-4c4f-4581-b543-a9f317f71657-agent-certs\") pod \"konnectivity-agent-ljhqf\" (UID: \"131fc208-4c4f-4581-b543-a9f317f71657\") " pod="kube-system/konnectivity-agent-ljhqf"
Apr 17 15:17:17.027918 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:17.027859 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 15:17:17.027918 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:17.027883 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 15:17:17.027918 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:17.027901 2577 projected.go:194] Error preparing data for projected volume kube-api-access-zhg8b for pod openshift-network-diagnostics/network-check-target-xz6xx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 15:17:17.028207 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:17.028006 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2f57efa0-9b15-4e70-9d38-74a517201d53-kube-api-access-zhg8b podName:2f57efa0-9b15-4e70-9d38-74a517201d53 nodeName:}" failed. No retries permitted until 2026-04-17 15:17:17.527988794 +0000 UTC m=+2.069653642 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-zhg8b" (UniqueName: "kubernetes.io/projected/2f57efa0-9b15-4e70-9d38-74a517201d53-kube-api-access-zhg8b") pod "network-check-target-xz6xx" (UID: "2f57efa0-9b15-4e70-9d38-74a517201d53") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 15:17:17.029571 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.029549 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwctl\" (UniqueName: \"kubernetes.io/projected/a55a9866-dc90-424d-aefb-be85c6ce02cb-kube-api-access-kwctl\") pod \"iptables-alerter-xc7q5\" (UID: \"a55a9866-dc90-424d-aefb-be85c6ce02cb\") " pod="openshift-network-operator/iptables-alerter-xc7q5"
Apr 17 15:17:17.030257 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.030239 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv897\" (UniqueName: \"kubernetes.io/projected/859c9d28-c95a-461d-841e-f476f3fb6fb7-kube-api-access-rv897\") pod \"multus-j8tkm\" (UID: \"859c9d28-c95a-461d-841e-f476f3fb6fb7\") " pod="openshift-multus/multus-j8tkm"
Apr 17 15:17:17.030257 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.030251 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-84pjt\" (UniqueName: \"kubernetes.io/projected/4445020e-d73c-4a2d-9f40-1c3fc286490e-kube-api-access-84pjt\") pod \"network-metrics-daemon-j7zl6\" (UID: \"4445020e-d73c-4a2d-9f40-1c3fc286490e\") " pod="openshift-multus/network-metrics-daemon-j7zl6"
Apr 17 15:17:17.030419 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.030403 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjzl5\" (UniqueName: \"kubernetes.io/projected/879a1087-ad81-4931-a7fc-1d30c4f2539d-kube-api-access-vjzl5\") pod \"node-ca-cx2v6\" (UID: \"879a1087-ad81-4931-a7fc-1d30c4f2539d\") " pod="openshift-image-registry/node-ca-cx2v6"
Apr 17 15:17:17.075869 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:17.075675 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15b0834749502abab65480e68ca33cd4.slice/crio-acdd557b8fbd87fd2664403b5f5c1cef23758692db36a084c5524aeb568c17ac WatchSource:0}: Error finding container acdd557b8fbd87fd2664403b5f5c1cef23758692db36a084c5524aeb568c17ac: Status 404 returned error can't find the container with id acdd557b8fbd87fd2664403b5f5c1cef23758692db36a084c5524aeb568c17ac
Apr 17 15:17:17.079583 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.079570 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 15:17:17.118827 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.118799 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/32385a04-2774-4f48-af2e-36e3bb20d368-device-dir\") pod \"aws-ebs-csi-driver-node-4x45m\" (UID: \"32385a04-2774-4f48-af2e-36e3bb20d368\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4x45m"
Apr 17 15:17:17.118827 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.118827 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7aa4aa82-d5f1-423a-b9ac-13669e2b1804-lib-modules\") pod \"tuned-9v7gz\" (UID: \"7aa4aa82-d5f1-423a-b9ac-13669e2b1804\") " pod="openshift-cluster-node-tuning-operator/tuned-9v7gz"
Apr 17 15:17:17.118971 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.118841 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/32385a04-2774-4f48-af2e-36e3bb20d368-etc-selinux\") pod \"aws-ebs-csi-driver-node-4x45m\" (UID: \"32385a04-2774-4f48-af2e-36e3bb20d368\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4x45m"
Apr 17 15:17:17.118971 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.118856 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7aa4aa82-d5f1-423a-b9ac-13669e2b1804-etc-kubernetes\") pod \"tuned-9v7gz\" (UID: \"7aa4aa82-d5f1-423a-b9ac-13669e2b1804\") " pod="openshift-cluster-node-tuning-operator/tuned-9v7gz"
Apr 17 15:17:17.118971 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.118870 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/14d69252-8d9f-46ec-8e05-0b0a8f1b3b07-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7jdw8\" (UID: \"14d69252-8d9f-46ec-8e05-0b0a8f1b3b07\") " pod="openshift-multus/multus-additional-cni-plugins-7jdw8"
Apr 17 15:17:17.118971 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.118911 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7aa4aa82-d5f1-423a-b9ac-13669e2b1804-etc-kubernetes\") pod \"tuned-9v7gz\" (UID: \"7aa4aa82-d5f1-423a-b9ac-13669e2b1804\") " pod="openshift-cluster-node-tuning-operator/tuned-9v7gz"
Apr 17 15:17:17.118971 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.118932 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/32385a04-2774-4f48-af2e-36e3bb20d368-device-dir\") pod \"aws-ebs-csi-driver-node-4x45m\" (UID: \"32385a04-2774-4f48-af2e-36e3bb20d368\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4x45m"
Apr 17 15:17:17.118971 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.118956 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7aa4aa82-d5f1-423a-b9ac-13669e2b1804-lib-modules\") pod \"tuned-9v7gz\" (UID: \"7aa4aa82-d5f1-423a-b9ac-13669e2b1804\") " pod="openshift-cluster-node-tuning-operator/tuned-9v7gz"
Apr 17 15:17:17.119175 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.118990 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-run-systemd\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k"
Apr 17 15:17:17.119175 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119000 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/32385a04-2774-4f48-af2e-36e3bb20d368-etc-selinux\") pod \"aws-ebs-csi-driver-node-4x45m\" (UID: \"32385a04-2774-4f48-af2e-36e3bb20d368\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4x45m"
Apr 17 15:17:17.119175 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119013 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/14d69252-8d9f-46ec-8e05-0b0a8f1b3b07-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7jdw8\" (UID: \"14d69252-8d9f-46ec-8e05-0b0a8f1b3b07\") " pod="openshift-multus/multus-additional-cni-plugins-7jdw8"
Apr 17 15:17:17.119175 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119023 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k"
Apr 17 15:17:17.119175 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119051 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-run-systemd\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k"
Apr 17 15:17:17.119175 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119068 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/32385a04-2774-4f48-af2e-36e3bb20d368-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4x45m\" (UID: \"32385a04-2774-4f48-af2e-36e3bb20d368\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4x45m"
Apr 17 15:17:17.119175 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119072 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k"
Apr 17 15:17:17.119175 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119094 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-frq8b\" (UniqueName: \"kubernetes.io/projected/32385a04-2774-4f48-af2e-36e3bb20d368-kube-api-access-frq8b\") pod \"aws-ebs-csi-driver-node-4x45m\" (UID: \"32385a04-2774-4f48-af2e-36e3bb20d368\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4x45m"
Apr 17 15:17:17.119175 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119113 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/32385a04-2774-4f48-af2e-36e3bb20d368-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4x45m\" (UID: \"32385a04-2774-4f48-af2e-36e3bb20d368\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4x45m"
Apr 17 15:17:17.119175 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119112 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7aa4aa82-d5f1-423a-b9ac-13669e2b1804-tmp\") pod \"tuned-9v7gz\" (UID: \"7aa4aa82-d5f1-423a-b9ac-13669e2b1804\") " pod="openshift-cluster-node-tuning-operator/tuned-9v7gz"
Apr 17 15:17:17.119175 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119144 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-host-cni-netd\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k"
Apr 17 15:17:17.119175 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119168 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7aa4aa82-d5f1-423a-b9ac-13669e2b1804-etc-sysconfig\") pod \"tuned-9v7gz\" (UID: \"7aa4aa82-d5f1-423a-b9ac-13669e2b1804\") " pod="openshift-cluster-node-tuning-operator/tuned-9v7gz"
Apr 17 15:17:17.119874 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119185 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7aa4aa82-d5f1-423a-b9ac-13669e2b1804-var-lib-kubelet\") pod \"tuned-9v7gz\" (UID: \"7aa4aa82-d5f1-423a-b9ac-13669e2b1804\") " pod="openshift-cluster-node-tuning-operator/tuned-9v7gz"
Apr 17 15:17:17.119874 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119199 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7aa4aa82-d5f1-423a-b9ac-13669e2b1804-host\") pod \"tuned-9v7gz\" (UID: \"7aa4aa82-d5f1-423a-b9ac-13669e2b1804\") " pod="openshift-cluster-node-tuning-operator/tuned-9v7gz"
Apr 17 15:17:17.119874 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119212 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-host-kubelet\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k"
Apr 17 15:17:17.119874 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119225 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7aa4aa82-d5f1-423a-b9ac-13669e2b1804-etc-modprobe-d\") pod \"tuned-9v7gz\" (UID: \"7aa4aa82-d5f1-423a-b9ac-13669e2b1804\") " pod="openshift-cluster-node-tuning-operator/tuned-9v7gz"
Apr 17 15:17:17.119874 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119248 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/14d69252-8d9f-46ec-8e05-0b0a8f1b3b07-system-cni-dir\") pod \"multus-additional-cni-plugins-7jdw8\" (UID: \"14d69252-8d9f-46ec-8e05-0b0a8f1b3b07\") " pod="openshift-multus/multus-additional-cni-plugins-7jdw8"
Apr 17 15:17:17.119874 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119265 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-ovnkube-script-lib\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k"
Apr 17 15:17:17.119874 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119268 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7aa4aa82-d5f1-423a-b9ac-13669e2b1804-var-lib-kubelet\") pod \"tuned-9v7gz\" (UID: \"7aa4aa82-d5f1-423a-b9ac-13669e2b1804\") " pod="openshift-cluster-node-tuning-operator/tuned-9v7gz"
Apr 17 15:17:17.119874 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119279 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-host-kubelet\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k"
Apr 17 15:17:17.119874 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119287 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7aa4aa82-d5f1-423a-b9ac-13669e2b1804-etc-systemd\") pod \"tuned-9v7gz\" (UID: \"7aa4aa82-d5f1-423a-b9ac-13669e2b1804\") " pod="openshift-cluster-node-tuning-operator/tuned-9v7gz"
Apr 17 15:17:17.119874 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119310 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/14d69252-8d9f-46ec-8e05-0b0a8f1b3b07-system-cni-dir\") pod \"multus-additional-cni-plugins-7jdw8\" (UID: \"14d69252-8d9f-46ec-8e05-0b0a8f1b3b07\") " pod="openshift-multus/multus-additional-cni-plugins-7jdw8"
Apr 17 15:17:17.119874 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119309 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7aa4aa82-d5f1-423a-b9ac-13669e2b1804-host\") pod \"tuned-9v7gz\" (UID: \"7aa4aa82-d5f1-423a-b9ac-13669e2b1804\") " pod="openshift-cluster-node-tuning-operator/tuned-9v7gz"
Apr 17 15:17:17.119874 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119333 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7aa4aa82-d5f1-423a-b9ac-13669e2b1804-etc-systemd\") pod \"tuned-9v7gz\" (UID: \"7aa4aa82-d5f1-423a-b9ac-13669e2b1804\") " pod="openshift-cluster-node-tuning-operator/tuned-9v7gz"
Apr 17 15:17:17.119874 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119335 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7aa4aa82-d5f1-423a-b9ac-13669e2b1804-etc-modprobe-d\") pod \"tuned-9v7gz\" (UID: \"7aa4aa82-d5f1-423a-b9ac-13669e2b1804\") " pod="openshift-cluster-node-tuning-operator/tuned-9v7gz"
Apr 17 15:17:17.119874 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119416 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-host-run-netns\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k"
Apr 17 15:17:17.119874 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119439 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-host-run-ovn-kubernetes\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k"
Apr 17 15:17:17.119874 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119457 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-ovnkube-config\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k"
Apr 17 15:17:17.119874 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119471 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-env-overrides\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k"
Apr 17 15:17:17.120639 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119464 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7aa4aa82-d5f1-423a-b9ac-13669e2b1804-etc-sysconfig\") pod \"tuned-9v7gz\" (UID: \"7aa4aa82-d5f1-423a-b9ac-13669e2b1804\") " pod="openshift-cluster-node-tuning-operator/tuned-9v7gz"
Apr 17 15:17:17.120639 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119439 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-host-cni-netd\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k"
Apr 17 15:17:17.120639 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119499 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/14c11c01-24f2-4908-a3cb-5c90f9ec8d35-tmp-dir\") pod \"node-resolver-8hx9p\" (UID: \"14c11c01-24f2-4908-a3cb-5c90f9ec8d35\") " pod="openshift-dns/node-resolver-8hx9p"
Apr 17 15:17:17.120639 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119522 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/32385a04-2774-4f48-af2e-36e3bb20d368-sys-fs\") pod \"aws-ebs-csi-driver-node-4x45m\" (UID: \"32385a04-2774-4f48-af2e-36e3bb20d368\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4x45m"
Apr 17 15:17:17.120639 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119530 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-host-run-netns\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k"
Apr 17 15:17:17.120639 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119546 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7aa4aa82-d5f1-423a-b9ac-13669e2b1804-etc-sysctl-conf\") pod \"tuned-9v7gz\" (UID: \"7aa4aa82-d5f1-423a-b9ac-13669e2b1804\") " pod="openshift-cluster-node-tuning-operator/tuned-9v7gz"
Apr 17 15:17:17.120639 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119571 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/14d69252-8d9f-46ec-8e05-0b0a8f1b3b07-cnibin\") pod \"multus-additional-cni-plugins-7jdw8\" (UID: \"14d69252-8d9f-46ec-8e05-0b0a8f1b3b07\") " pod="openshift-multus/multus-additional-cni-plugins-7jdw8"
Apr 17 15:17:17.120639 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119597 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/32385a04-2774-4f48-af2e-36e3bb20d368-sys-fs\") pod \"aws-ebs-csi-driver-node-4x45m\" (UID: \"32385a04-2774-4f48-af2e-36e3bb20d368\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4x45m"
Apr 17 15:17:17.120639 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119546 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-host-run-ovn-kubernetes\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k"
Apr 17 15:17:17.120639 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119613 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-systemd-units\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k"
Apr 17 15:17:17.120639 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119641 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-ovn-node-metrics-cert\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k"
Apr 17 15:17:17.120639 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119646 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/14d69252-8d9f-46ec-8e05-0b0a8f1b3b07-cnibin\") pod \"multus-additional-cni-plugins-7jdw8\" (UID: \"14d69252-8d9f-46ec-8e05-0b0a8f1b3b07\") " pod="openshift-multus/multus-additional-cni-plugins-7jdw8"
Apr 17 15:17:17.120639 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119666 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7aa4aa82-d5f1-423a-b9ac-13669e2b1804-etc-sysctl-d\") pod \"tuned-9v7gz\" (UID: \"7aa4aa82-d5f1-423a-b9ac-13669e2b1804\") " pod="openshift-cluster-node-tuning-operator/tuned-9v7gz"
Apr 17 15:17:17.120639 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119680 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-systemd-units\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k"
Apr 17 15:17:17.120639 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119689 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7aa4aa82-d5f1-423a-b9ac-13669e2b1804-run\") pod \"tuned-9v7gz\" (UID: \"7aa4aa82-d5f1-423a-b9ac-13669e2b1804\") " pod="openshift-cluster-node-tuning-operator/tuned-9v7gz"
Apr 17 15:17:17.120639 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119776 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7aa4aa82-d5f1-423a-b9ac-13669e2b1804-etc-sysctl-d\") pod \"tuned-9v7gz\" (UID: \"7aa4aa82-d5f1-423a-b9ac-13669e2b1804\") " pod="openshift-cluster-node-tuning-operator/tuned-9v7gz"
Apr 17 15:17:17.120639 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119804 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-ovnkube-script-lib\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k"
Apr 17 15:17:17.120639 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119817 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7aa4aa82-d5f1-423a-b9ac-13669e2b1804-run\") pod \"tuned-9v7gz\" (UID: \"7aa4aa82-d5f1-423a-b9ac-13669e2b1804\") " pod="openshift-cluster-node-tuning-operator/tuned-9v7gz"
Apr 17 15:17:17.121503 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119842 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-env-overrides\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k"
Apr 17 15:17:17.121503 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119844 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7aa4aa82-d5f1-423a-b9ac-13669e2b1804-sys\") pod \"tuned-9v7gz\" (UID: \"7aa4aa82-d5f1-423a-b9ac-13669e2b1804\") " pod="openshift-cluster-node-tuning-operator/tuned-9v7gz"
Apr 17 15:17:17.121503 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119876 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7aa4aa82-d5f1-423a-b9ac-13669e2b1804-sys\") pod \"tuned-9v7gz\" (UID: \"7aa4aa82-d5f1-423a-b9ac-13669e2b1804\") " pod="openshift-cluster-node-tuning-operator/tuned-9v7gz"
Apr 17 15:17:17.121503 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119878 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7aa4aa82-d5f1-423a-b9ac-13669e2b1804-etc-tuned\") pod \"tuned-9v7gz\" (UID: \"7aa4aa82-d5f1-423a-b9ac-13669e2b1804\") " pod="openshift-cluster-node-tuning-operator/tuned-9v7gz"
Apr 17 15:17:17.121503 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119909 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/14d69252-8d9f-46ec-8e05-0b0a8f1b3b07-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-7jdw8\" (UID: \"14d69252-8d9f-46ec-8e05-0b0a8f1b3b07\") " pod="openshift-multus/multus-additional-cni-plugins-7jdw8"
Apr 17 15:17:17.121503 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119928 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v4dzt\" (UniqueName: \"kubernetes.io/projected/14d69252-8d9f-46ec-8e05-0b0a8f1b3b07-kube-api-access-v4dzt\") pod \"multus-additional-cni-plugins-7jdw8\" (UID: \"14d69252-8d9f-46ec-8e05-0b0a8f1b3b07\") " pod="openshift-multus/multus-additional-cni-plugins-7jdw8"
Apr 17 15:17:17.121503 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119942 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-host-cni-bin\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k"
Apr 17 15:17:17.121503 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119957 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/14d69252-8d9f-46ec-8e05-0b0a8f1b3b07-os-release\") pod \"multus-additional-cni-plugins-7jdw8\" (UID: \"14d69252-8d9f-46ec-8e05-0b0a8f1b3b07\") " pod="openshift-multus/multus-additional-cni-plugins-7jdw8"
Apr 17 15:17:17.121503 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119972 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-etc-openvswitch\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k"
Apr 17 15:17:17.121503 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119987 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/32385a04-2774-4f48-af2e-36e3bb20d368-socket-dir\") pod \"aws-ebs-csi-driver-node-4x45m\" (UID: \"32385a04-2774-4f48-af2e-36e3bb20d368\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4x45m"
Apr 17 15:17:17.121503 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.120003 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-75ntc\" (UniqueName: \"kubernetes.io/projected/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-kube-api-access-75ntc\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k"
Apr 17 15:17:17.121503 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.120024 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-stfbz\" (UniqueName: \"kubernetes.io/projected/7aa4aa82-d5f1-423a-b9ac-13669e2b1804-kube-api-access-stfbz\") pod \"tuned-9v7gz\" (UID: \"7aa4aa82-d5f1-423a-b9ac-13669e2b1804\") " pod="openshift-cluster-node-tuning-operator/tuned-9v7gz"
Apr 17 15:17:17.121503 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.120068 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/14d69252-8d9f-46ec-8e05-0b0a8f1b3b07-cni-binary-copy\") pod \"multus-additional-cni-plugins-7jdw8\" (UID: \"14d69252-8d9f-46ec-8e05-0b0a8f1b3b07\") " pod="openshift-multus/multus-additional-cni-plugins-7jdw8"
Apr 17 15:17:17.121503 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.120238 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-etc-openvswitch\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k"
Apr 17 15:17:17.121503 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.120284 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-host-cni-bin\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k"
Apr 17
15:17:17.121503 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.120338 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/14c11c01-24f2-4908-a3cb-5c90f9ec8d35-tmp-dir\") pod \"node-resolver-8hx9p\" (UID: \"14c11c01-24f2-4908-a3cb-5c90f9ec8d35\") " pod="openshift-dns/node-resolver-8hx9p" Apr 17 15:17:17.121503 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.119956 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7aa4aa82-d5f1-423a-b9ac-13669e2b1804-etc-sysctl-conf\") pod \"tuned-9v7gz\" (UID: \"7aa4aa82-d5f1-423a-b9ac-13669e2b1804\") " pod="openshift-cluster-node-tuning-operator/tuned-9v7gz" Apr 17 15:17:17.122007 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.120378 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/14d69252-8d9f-46ec-8e05-0b0a8f1b3b07-os-release\") pod \"multus-additional-cni-plugins-7jdw8\" (UID: \"14d69252-8d9f-46ec-8e05-0b0a8f1b3b07\") " pod="openshift-multus/multus-additional-cni-plugins-7jdw8" Apr 17 15:17:17.122007 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.120474 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/32385a04-2774-4f48-af2e-36e3bb20d368-socket-dir\") pod \"aws-ebs-csi-driver-node-4x45m\" (UID: \"32385a04-2774-4f48-af2e-36e3bb20d368\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4x45m" Apr 17 15:17:17.122007 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.120559 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/14d69252-8d9f-46ec-8e05-0b0a8f1b3b07-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-7jdw8\" (UID: \"14d69252-8d9f-46ec-8e05-0b0a8f1b3b07\") " 
pod="openshift-multus/multus-additional-cni-plugins-7jdw8" Apr 17 15:17:17.122007 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.120584 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/14d69252-8d9f-46ec-8e05-0b0a8f1b3b07-cni-binary-copy\") pod \"multus-additional-cni-plugins-7jdw8\" (UID: \"14d69252-8d9f-46ec-8e05-0b0a8f1b3b07\") " pod="openshift-multus/multus-additional-cni-plugins-7jdw8" Apr 17 15:17:17.122007 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.120595 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-host-slash\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k" Apr 17 15:17:17.122007 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.120620 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ppcpf\" (UniqueName: \"kubernetes.io/projected/14c11c01-24f2-4908-a3cb-5c90f9ec8d35-kube-api-access-ppcpf\") pod \"node-resolver-8hx9p\" (UID: \"14c11c01-24f2-4908-a3cb-5c90f9ec8d35\") " pod="openshift-dns/node-resolver-8hx9p" Apr 17 15:17:17.122007 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.120679 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-host-slash\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k" Apr 17 15:17:17.122007 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.120724 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/14d69252-8d9f-46ec-8e05-0b0a8f1b3b07-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7jdw8\" 
(UID: \"14d69252-8d9f-46ec-8e05-0b0a8f1b3b07\") " pod="openshift-multus/multus-additional-cni-plugins-7jdw8" Apr 17 15:17:17.122007 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.120754 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-var-lib-openvswitch\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k" Apr 17 15:17:17.122007 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.120789 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-node-log\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k" Apr 17 15:17:17.122007 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.120813 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-log-socket\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k" Apr 17 15:17:17.122007 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.120825 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-var-lib-openvswitch\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k" Apr 17 15:17:17.122007 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.120843 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/32385a04-2774-4f48-af2e-36e3bb20d368-registration-dir\") pod 
\"aws-ebs-csi-driver-node-4x45m\" (UID: \"32385a04-2774-4f48-af2e-36e3bb20d368\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4x45m" Apr 17 15:17:17.122007 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.120881 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-log-socket\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k" Apr 17 15:17:17.122007 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.120881 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-run-openvswitch\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k" Apr 17 15:17:17.122007 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.120917 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-run-ovn\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k" Apr 17 15:17:17.122007 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.120918 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-run-openvswitch\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k" Apr 17 15:17:17.122469 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.120941 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/14c11c01-24f2-4908-a3cb-5c90f9ec8d35-hosts-file\") pod 
\"node-resolver-8hx9p\" (UID: \"14c11c01-24f2-4908-a3cb-5c90f9ec8d35\") " pod="openshift-dns/node-resolver-8hx9p" Apr 17 15:17:17.122469 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.120981 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/32385a04-2774-4f48-af2e-36e3bb20d368-registration-dir\") pod \"aws-ebs-csi-driver-node-4x45m\" (UID: \"32385a04-2774-4f48-af2e-36e3bb20d368\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4x45m" Apr 17 15:17:17.122469 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.120990 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-run-ovn\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k" Apr 17 15:17:17.122469 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.121000 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/14c11c01-24f2-4908-a3cb-5c90f9ec8d35-hosts-file\") pod \"node-resolver-8hx9p\" (UID: \"14c11c01-24f2-4908-a3cb-5c90f9ec8d35\") " pod="openshift-dns/node-resolver-8hx9p" Apr 17 15:17:17.122469 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.121055 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-node-log\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k" Apr 17 15:17:17.122469 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.121166 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-ovnkube-config\") pod \"ovnkube-node-6m42k\" (UID: 
\"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k" Apr 17 15:17:17.122469 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.121184 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/14d69252-8d9f-46ec-8e05-0b0a8f1b3b07-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7jdw8\" (UID: \"14d69252-8d9f-46ec-8e05-0b0a8f1b3b07\") " pod="openshift-multus/multus-additional-cni-plugins-7jdw8" Apr 17 15:17:17.122469 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.121684 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7aa4aa82-d5f1-423a-b9ac-13669e2b1804-tmp\") pod \"tuned-9v7gz\" (UID: \"7aa4aa82-d5f1-423a-b9ac-13669e2b1804\") " pod="openshift-cluster-node-tuning-operator/tuned-9v7gz" Apr 17 15:17:17.122469 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.122235 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7aa4aa82-d5f1-423a-b9ac-13669e2b1804-etc-tuned\") pod \"tuned-9v7gz\" (UID: \"7aa4aa82-d5f1-423a-b9ac-13669e2b1804\") " pod="openshift-cluster-node-tuning-operator/tuned-9v7gz" Apr 17 15:17:17.122469 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.122360 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-ovn-node-metrics-cert\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k" Apr 17 15:17:17.138235 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.138172 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-frq8b\" (UniqueName: \"kubernetes.io/projected/32385a04-2774-4f48-af2e-36e3bb20d368-kube-api-access-frq8b\") pod 
\"aws-ebs-csi-driver-node-4x45m\" (UID: \"32385a04-2774-4f48-af2e-36e3bb20d368\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4x45m" Apr 17 15:17:17.138354 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.138339 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4dzt\" (UniqueName: \"kubernetes.io/projected/14d69252-8d9f-46ec-8e05-0b0a8f1b3b07-kube-api-access-v4dzt\") pod \"multus-additional-cni-plugins-7jdw8\" (UID: \"14d69252-8d9f-46ec-8e05-0b0a8f1b3b07\") " pod="openshift-multus/multus-additional-cni-plugins-7jdw8" Apr 17 15:17:17.138396 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.138371 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppcpf\" (UniqueName: \"kubernetes.io/projected/14c11c01-24f2-4908-a3cb-5c90f9ec8d35-kube-api-access-ppcpf\") pod \"node-resolver-8hx9p\" (UID: \"14c11c01-24f2-4908-a3cb-5c90f9ec8d35\") " pod="openshift-dns/node-resolver-8hx9p" Apr 17 15:17:17.139100 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.139085 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-75ntc\" (UniqueName: \"kubernetes.io/projected/41a2e9d0-bfbe-47d5-9ccd-610cb5204675-kube-api-access-75ntc\") pod \"ovnkube-node-6m42k\" (UID: \"41a2e9d0-bfbe-47d5-9ccd-610cb5204675\") " pod="openshift-ovn-kubernetes/ovnkube-node-6m42k" Apr 17 15:17:17.140179 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.140166 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-stfbz\" (UniqueName: \"kubernetes.io/projected/7aa4aa82-d5f1-423a-b9ac-13669e2b1804-kube-api-access-stfbz\") pod \"tuned-9v7gz\" (UID: \"7aa4aa82-d5f1-423a-b9ac-13669e2b1804\") " pod="openshift-cluster-node-tuning-operator/tuned-9v7gz" Apr 17 15:17:17.187744 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:17.187727 2577 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e1f1a3ec57b3b66a90c747576fdf8e1.slice/crio-1fe9f67d48bda19f1f078619c677822f9b1450e4e51d2af590a8681413045dd8 WatchSource:0}: Error finding container 1fe9f67d48bda19f1f078619c677822f9b1450e4e51d2af590a8681413045dd8: Status 404 returned error can't find the container with id 1fe9f67d48bda19f1f078619c677822f9b1450e4e51d2af590a8681413045dd8 Apr 17 15:17:17.227075 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.227028 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-j8tkm" Apr 17 15:17:17.232132 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:17.232113 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod859c9d28_c95a_461d_841e_f476f3fb6fb7.slice/crio-5ebc43a37428bad9f3aede28af43219453bbfa33801a6e5e05e9ad48a1843874 WatchSource:0}: Error finding container 5ebc43a37428bad9f3aede28af43219453bbfa33801a6e5e05e9ad48a1843874: Status 404 returned error can't find the container with id 5ebc43a37428bad9f3aede28af43219453bbfa33801a6e5e05e9ad48a1843874 Apr 17 15:17:17.244845 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.244831 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-ljhqf" Apr 17 15:17:17.250399 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:17.250381 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod131fc208_4c4f_4581_b543_a9f317f71657.slice/crio-fa9c82b73a0ffb81817c868b38304421bf5ab53f928b39d299ad2f0633c826fc WatchSource:0}: Error finding container fa9c82b73a0ffb81817c868b38304421bf5ab53f928b39d299ad2f0633c826fc: Status 404 returned error can't find the container with id fa9c82b73a0ffb81817c868b38304421bf5ab53f928b39d299ad2f0633c826fc Apr 17 15:17:17.272046 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.271993 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-cx2v6" Apr 17 15:17:17.276952 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:17.276934 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod879a1087_ad81_4931_a7fc_1d30c4f2539d.slice/crio-ddcd8e923a1096800f650f3051d2f6e389ab8bbae3af44c7963e67a93b8e94a3 WatchSource:0}: Error finding container ddcd8e923a1096800f650f3051d2f6e389ab8bbae3af44c7963e67a93b8e94a3: Status 404 returned error can't find the container with id ddcd8e923a1096800f650f3051d2f6e389ab8bbae3af44c7963e67a93b8e94a3 Apr 17 15:17:17.286347 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.286335 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-xc7q5" Apr 17 15:17:17.292515 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:17.292495 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda55a9866_dc90_424d_aefb_be85c6ce02cb.slice/crio-d2728450524701c7236cbb3e88b86db4b20dc01ba6bfbc775166c3f5fe941018 WatchSource:0}: Error finding container d2728450524701c7236cbb3e88b86db4b20dc01ba6bfbc775166c3f5fe941018: Status 404 returned error can't find the container with id d2728450524701c7236cbb3e88b86db4b20dc01ba6bfbc775166c3f5fe941018 Apr 17 15:17:17.299593 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.299578 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6m42k" Apr 17 15:17:17.304213 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:17.304195 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41a2e9d0_bfbe_47d5_9ccd_610cb5204675.slice/crio-03ed5b03336cb276429b6ed0f12502ab7e76b4da173e89fa411e63254072b955 WatchSource:0}: Error finding container 03ed5b03336cb276429b6ed0f12502ab7e76b4da173e89fa411e63254072b955: Status 404 returned error can't find the container with id 03ed5b03336cb276429b6ed0f12502ab7e76b4da173e89fa411e63254072b955 Apr 17 15:17:17.313262 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.313247 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4x45m" Apr 17 15:17:17.319109 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:17.319092 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32385a04_2774_4f48_af2e_36e3bb20d368.slice/crio-29a68eb8cedd34de9ce9b28cb1b06a19a41bf760d88107bfc8d08bb640a26998 WatchSource:0}: Error finding container 29a68eb8cedd34de9ce9b28cb1b06a19a41bf760d88107bfc8d08bb640a26998: Status 404 returned error can't find the container with id 29a68eb8cedd34de9ce9b28cb1b06a19a41bf760d88107bfc8d08bb640a26998 Apr 17 15:17:17.336410 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.336393 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-9v7gz" Apr 17 15:17:17.341688 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:17.341667 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7aa4aa82_d5f1_423a_b9ac_13669e2b1804.slice/crio-00791f957625e8a67260c2ca50a43ed540b9d80b108bb4f9c211d29568c7a6fc WatchSource:0}: Error finding container 00791f957625e8a67260c2ca50a43ed540b9d80b108bb4f9c211d29568c7a6fc: Status 404 returned error can't find the container with id 00791f957625e8a67260c2ca50a43ed540b9d80b108bb4f9c211d29568c7a6fc Apr 17 15:17:17.359616 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.359598 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-8hx9p" Apr 17 15:17:17.364086 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.364073 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7jdw8" Apr 17 15:17:17.364600 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:17.364574 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14c11c01_24f2_4908_a3cb_5c90f9ec8d35.slice/crio-c8f09750c18a88add0d71ed2bb1768e4fbbc2c944e136f40330fccba4037f1b0 WatchSource:0}: Error finding container c8f09750c18a88add0d71ed2bb1768e4fbbc2c944e136f40330fccba4037f1b0: Status 404 returned error can't find the container with id c8f09750c18a88add0d71ed2bb1768e4fbbc2c944e136f40330fccba4037f1b0 Apr 17 15:17:17.369264 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:17.369245 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14d69252_8d9f_46ec_8e05_0b0a8f1b3b07.slice/crio-f5ece876d998ce7decb68bc8650363371b6613400111f83e51f68c583dd593e1 WatchSource:0}: Error finding container f5ece876d998ce7decb68bc8650363371b6613400111f83e51f68c583dd593e1: Status 404 returned error can't find the container with id f5ece876d998ce7decb68bc8650363371b6613400111f83e51f68c583dd593e1 Apr 17 15:17:17.524555 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.524510 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4445020e-d73c-4a2d-9f40-1c3fc286490e-metrics-certs\") pod \"network-metrics-daemon-j7zl6\" (UID: \"4445020e-d73c-4a2d-9f40-1c3fc286490e\") " pod="openshift-multus/network-metrics-daemon-j7zl6" Apr 17 15:17:17.524648 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:17.524608 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 15:17:17.524704 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:17.524655 2577 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/4445020e-d73c-4a2d-9f40-1c3fc286490e-metrics-certs podName:4445020e-d73c-4a2d-9f40-1c3fc286490e nodeName:}" failed. No retries permitted until 2026-04-17 15:17:18.524638875 +0000 UTC m=+3.066303720 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4445020e-d73c-4a2d-9f40-1c3fc286490e-metrics-certs") pod "network-metrics-daemon-j7zl6" (UID: "4445020e-d73c-4a2d-9f40-1c3fc286490e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 15:17:17.625490 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.625460 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zhg8b\" (UniqueName: \"kubernetes.io/projected/2f57efa0-9b15-4e70-9d38-74a517201d53-kube-api-access-zhg8b\") pod \"network-check-target-xz6xx\" (UID: \"2f57efa0-9b15-4e70-9d38-74a517201d53\") " pod="openshift-network-diagnostics/network-check-target-xz6xx" Apr 17 15:17:17.625607 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:17.625595 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 15:17:17.625676 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:17.625614 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 15:17:17.625676 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:17.625626 2577 projected.go:194] Error preparing data for projected volume kube-api-access-zhg8b for pod openshift-network-diagnostics/network-check-target-xz6xx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 15:17:17.625771 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:17.625688 2577 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2f57efa0-9b15-4e70-9d38-74a517201d53-kube-api-access-zhg8b podName:2f57efa0-9b15-4e70-9d38-74a517201d53 nodeName:}" failed. No retries permitted until 2026-04-17 15:17:18.625666176 +0000 UTC m=+3.167331028 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-zhg8b" (UniqueName: "kubernetes.io/projected/2f57efa0-9b15-4e70-9d38-74a517201d53-kube-api-access-zhg8b") pod "network-check-target-xz6xx" (UID: "2f57efa0-9b15-4e70-9d38-74a517201d53") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 15:17:17.816679 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.816571 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 15:17:17.947885 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.947843 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 15:12:16 +0000 UTC" deadline="2027-10-02 10:02:59.239012916 +0000 UTC" Apr 17 15:17:17.948063 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:17.947898 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12786h45m41.29112007s" Apr 17 15:17:18.060298 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:18.060272 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j7zl6" Apr 17 15:17:18.060447 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:18.060404 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j7zl6" podUID="4445020e-d73c-4a2d-9f40-1c3fc286490e"
Apr 17 15:17:18.104471 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:18.104380 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-9v7gz" event={"ID":"7aa4aa82-d5f1-423a-b9ac-13669e2b1804","Type":"ContainerStarted","Data":"00791f957625e8a67260c2ca50a43ed540b9d80b108bb4f9c211d29568c7a6fc"}
Apr 17 15:17:18.132258 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:18.132227 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4x45m" event={"ID":"32385a04-2774-4f48-af2e-36e3bb20d368","Type":"ContainerStarted","Data":"29a68eb8cedd34de9ce9b28cb1b06a19a41bf760d88107bfc8d08bb640a26998"}
Apr 17 15:17:18.160892 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:18.160821 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-xc7q5" event={"ID":"a55a9866-dc90-424d-aefb-be85c6ce02cb","Type":"ContainerStarted","Data":"d2728450524701c7236cbb3e88b86db4b20dc01ba6bfbc775166c3f5fe941018"}
Apr 17 15:17:18.168187 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:18.168148 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-cx2v6" event={"ID":"879a1087-ad81-4931-a7fc-1d30c4f2539d","Type":"ContainerStarted","Data":"ddcd8e923a1096800f650f3051d2f6e389ab8bbae3af44c7963e67a93b8e94a3"}
Apr 17 15:17:18.174162 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:18.174019 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-ljhqf" event={"ID":"131fc208-4c4f-4581-b543-a9f317f71657","Type":"ContainerStarted","Data":"fa9c82b73a0ffb81817c868b38304421bf5ab53f928b39d299ad2f0633c826fc"}
Apr 17 15:17:18.176865 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:18.176835 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-29.ec2.internal" event={"ID":"15b0834749502abab65480e68ca33cd4","Type":"ContainerStarted","Data":"acdd557b8fbd87fd2664403b5f5c1cef23758692db36a084c5524aeb568c17ac"}
Apr 17 15:17:18.181132 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:18.181085 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7jdw8" event={"ID":"14d69252-8d9f-46ec-8e05-0b0a8f1b3b07","Type":"ContainerStarted","Data":"f5ece876d998ce7decb68bc8650363371b6613400111f83e51f68c583dd593e1"}
Apr 17 15:17:18.196372 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:18.196347 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8hx9p" event={"ID":"14c11c01-24f2-4908-a3cb-5c90f9ec8d35","Type":"ContainerStarted","Data":"c8f09750c18a88add0d71ed2bb1768e4fbbc2c944e136f40330fccba4037f1b0"}
Apr 17 15:17:18.214899 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:18.214866 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6m42k" event={"ID":"41a2e9d0-bfbe-47d5-9ccd-610cb5204675","Type":"ContainerStarted","Data":"03ed5b03336cb276429b6ed0f12502ab7e76b4da173e89fa411e63254072b955"}
Apr 17 15:17:18.226358 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:18.226294 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-j8tkm" event={"ID":"859c9d28-c95a-461d-841e-f476f3fb6fb7","Type":"ContainerStarted","Data":"5ebc43a37428bad9f3aede28af43219453bbfa33801a6e5e05e9ad48a1843874"}
Apr 17 15:17:18.228202 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:18.228180 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 15:17:18.251104 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:18.250933 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-29.ec2.internal" event={"ID":"1e1f1a3ec57b3b66a90c747576fdf8e1","Type":"ContainerStarted","Data":"1fe9f67d48bda19f1f078619c677822f9b1450e4e51d2af590a8681413045dd8"}
Apr 17 15:17:18.430755 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:18.430676 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 15:17:18.536145 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:18.536114 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4445020e-d73c-4a2d-9f40-1c3fc286490e-metrics-certs\") pod \"network-metrics-daemon-j7zl6\" (UID: \"4445020e-d73c-4a2d-9f40-1c3fc286490e\") " pod="openshift-multus/network-metrics-daemon-j7zl6"
Apr 17 15:17:18.536328 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:18.536283 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 15:17:18.536394 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:18.536368 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4445020e-d73c-4a2d-9f40-1c3fc286490e-metrics-certs podName:4445020e-d73c-4a2d-9f40-1c3fc286490e nodeName:}" failed. No retries permitted until 2026-04-17 15:17:20.536351132 +0000 UTC m=+5.078015982 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4445020e-d73c-4a2d-9f40-1c3fc286490e-metrics-certs") pod "network-metrics-daemon-j7zl6" (UID: "4445020e-d73c-4a2d-9f40-1c3fc286490e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 15:17:18.637222 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:18.637186 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zhg8b\" (UniqueName: \"kubernetes.io/projected/2f57efa0-9b15-4e70-9d38-74a517201d53-kube-api-access-zhg8b\") pod \"network-check-target-xz6xx\" (UID: \"2f57efa0-9b15-4e70-9d38-74a517201d53\") " pod="openshift-network-diagnostics/network-check-target-xz6xx"
Apr 17 15:17:18.637367 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:18.637318 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 15:17:18.637367 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:18.637339 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 15:17:18.637367 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:18.637348 2577 projected.go:194] Error preparing data for projected volume kube-api-access-zhg8b for pod openshift-network-diagnostics/network-check-target-xz6xx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 15:17:18.637473 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:18.637414 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2f57efa0-9b15-4e70-9d38-74a517201d53-kube-api-access-zhg8b podName:2f57efa0-9b15-4e70-9d38-74a517201d53 nodeName:}" failed. No retries permitted until 2026-04-17 15:17:20.637400999 +0000 UTC m=+5.179065843 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-zhg8b" (UniqueName: "kubernetes.io/projected/2f57efa0-9b15-4e70-9d38-74a517201d53-kube-api-access-zhg8b") pod "network-check-target-xz6xx" (UID: "2f57efa0-9b15-4e70-9d38-74a517201d53") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 15:17:18.949057 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:18.949003 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 15:12:16 +0000 UTC" deadline="2028-01-16 11:27:14.645426688 +0000 UTC"
Apr 17 15:17:18.949057 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:18.949055 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15332h9m55.696375011s"
Apr 17 15:17:19.057829 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:19.057798 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xz6xx"
Apr 17 15:17:19.058000 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:19.057922 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xz6xx" podUID="2f57efa0-9b15-4e70-9d38-74a517201d53"
Apr 17 15:17:20.058214 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:20.058177 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j7zl6"
Apr 17 15:17:20.058639 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:20.058317 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j7zl6" podUID="4445020e-d73c-4a2d-9f40-1c3fc286490e"
Apr 17 15:17:20.553729 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:20.553617 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4445020e-d73c-4a2d-9f40-1c3fc286490e-metrics-certs\") pod \"network-metrics-daemon-j7zl6\" (UID: \"4445020e-d73c-4a2d-9f40-1c3fc286490e\") " pod="openshift-multus/network-metrics-daemon-j7zl6"
Apr 17 15:17:20.553900 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:20.553793 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 15:17:20.553900 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:20.553862 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4445020e-d73c-4a2d-9f40-1c3fc286490e-metrics-certs podName:4445020e-d73c-4a2d-9f40-1c3fc286490e nodeName:}" failed. No retries permitted until 2026-04-17 15:17:24.553839374 +0000 UTC m=+9.095504225 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4445020e-d73c-4a2d-9f40-1c3fc286490e-metrics-certs") pod "network-metrics-daemon-j7zl6" (UID: "4445020e-d73c-4a2d-9f40-1c3fc286490e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 15:17:20.654858 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:20.654281 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zhg8b\" (UniqueName: \"kubernetes.io/projected/2f57efa0-9b15-4e70-9d38-74a517201d53-kube-api-access-zhg8b\") pod \"network-check-target-xz6xx\" (UID: \"2f57efa0-9b15-4e70-9d38-74a517201d53\") " pod="openshift-network-diagnostics/network-check-target-xz6xx"
Apr 17 15:17:20.654858 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:20.654444 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 15:17:20.654858 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:20.654466 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 15:17:20.654858 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:20.654481 2577 projected.go:194] Error preparing data for projected volume kube-api-access-zhg8b for pod openshift-network-diagnostics/network-check-target-xz6xx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 15:17:20.654858 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:20.654532 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2f57efa0-9b15-4e70-9d38-74a517201d53-kube-api-access-zhg8b podName:2f57efa0-9b15-4e70-9d38-74a517201d53 nodeName:}" failed. No retries permitted until 2026-04-17 15:17:24.654518698 +0000 UTC m=+9.196183542 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-zhg8b" (UniqueName: "kubernetes.io/projected/2f57efa0-9b15-4e70-9d38-74a517201d53-kube-api-access-zhg8b") pod "network-check-target-xz6xx" (UID: "2f57efa0-9b15-4e70-9d38-74a517201d53") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 15:17:21.057477 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:21.057008 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xz6xx"
Apr 17 15:17:21.057477 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:21.057145 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xz6xx" podUID="2f57efa0-9b15-4e70-9d38-74a517201d53"
Apr 17 15:17:22.057647 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:22.057602 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j7zl6"
Apr 17 15:17:22.058117 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:22.057746 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j7zl6" podUID="4445020e-d73c-4a2d-9f40-1c3fc286490e"
Apr 17 15:17:23.056970 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:23.056942 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xz6xx"
Apr 17 15:17:23.057165 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:23.057077 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xz6xx" podUID="2f57efa0-9b15-4e70-9d38-74a517201d53"
Apr 17 15:17:24.058620 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:24.058533 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j7zl6"
Apr 17 15:17:24.059069 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:24.058687 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j7zl6" podUID="4445020e-d73c-4a2d-9f40-1c3fc286490e"
Apr 17 15:17:24.343537 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:24.343460 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-blrs8"]
Apr 17 15:17:24.347930 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:24.347731 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-blrs8"
Apr 17 15:17:24.347930 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:24.347797 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-blrs8" podUID="7db05466-6c79-496c-9e75-143b8a1a69d1"
Apr 17 15:17:24.382310 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:24.382149 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7db05466-6c79-496c-9e75-143b8a1a69d1-kubelet-config\") pod \"global-pull-secret-syncer-blrs8\" (UID: \"7db05466-6c79-496c-9e75-143b8a1a69d1\") " pod="kube-system/global-pull-secret-syncer-blrs8"
Apr 17 15:17:24.382310 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:24.382196 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7db05466-6c79-496c-9e75-143b8a1a69d1-original-pull-secret\") pod \"global-pull-secret-syncer-blrs8\" (UID: \"7db05466-6c79-496c-9e75-143b8a1a69d1\") " pod="kube-system/global-pull-secret-syncer-blrs8"
Apr 17 15:17:24.382310 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:24.382249 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7db05466-6c79-496c-9e75-143b8a1a69d1-dbus\") pod \"global-pull-secret-syncer-blrs8\" (UID: \"7db05466-6c79-496c-9e75-143b8a1a69d1\") " pod="kube-system/global-pull-secret-syncer-blrs8"
Apr 17 15:17:24.483413 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:24.483386 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7db05466-6c79-496c-9e75-143b8a1a69d1-kubelet-config\") pod \"global-pull-secret-syncer-blrs8\" (UID: \"7db05466-6c79-496c-9e75-143b8a1a69d1\") " pod="kube-system/global-pull-secret-syncer-blrs8"
Apr 17 15:17:24.483575 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:24.483425 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7db05466-6c79-496c-9e75-143b8a1a69d1-original-pull-secret\") pod \"global-pull-secret-syncer-blrs8\" (UID: \"7db05466-6c79-496c-9e75-143b8a1a69d1\") " pod="kube-system/global-pull-secret-syncer-blrs8"
Apr 17 15:17:24.483575 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:24.483480 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7db05466-6c79-496c-9e75-143b8a1a69d1-dbus\") pod \"global-pull-secret-syncer-blrs8\" (UID: \"7db05466-6c79-496c-9e75-143b8a1a69d1\") " pod="kube-system/global-pull-secret-syncer-blrs8"
Apr 17 15:17:24.483575 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:24.483517 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7db05466-6c79-496c-9e75-143b8a1a69d1-kubelet-config\") pod \"global-pull-secret-syncer-blrs8\" (UID: \"7db05466-6c79-496c-9e75-143b8a1a69d1\") " pod="kube-system/global-pull-secret-syncer-blrs8"
Apr 17 15:17:24.483705 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:24.483623 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 15:17:24.483705 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:24.483629 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7db05466-6c79-496c-9e75-143b8a1a69d1-dbus\") pod \"global-pull-secret-syncer-blrs8\" (UID: \"7db05466-6c79-496c-9e75-143b8a1a69d1\") " pod="kube-system/global-pull-secret-syncer-blrs8"
Apr 17 15:17:24.483705 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:24.483680 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7db05466-6c79-496c-9e75-143b8a1a69d1-original-pull-secret podName:7db05466-6c79-496c-9e75-143b8a1a69d1 nodeName:}" failed. No retries permitted until 2026-04-17 15:17:24.983662796 +0000 UTC m=+9.525327642 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7db05466-6c79-496c-9e75-143b8a1a69d1-original-pull-secret") pod "global-pull-secret-syncer-blrs8" (UID: "7db05466-6c79-496c-9e75-143b8a1a69d1") : object "kube-system"/"original-pull-secret" not registered
Apr 17 15:17:24.584235 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:24.584199 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4445020e-d73c-4a2d-9f40-1c3fc286490e-metrics-certs\") pod \"network-metrics-daemon-j7zl6\" (UID: \"4445020e-d73c-4a2d-9f40-1c3fc286490e\") " pod="openshift-multus/network-metrics-daemon-j7zl6"
Apr 17 15:17:24.584425 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:24.584392 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 15:17:24.584547 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:24.584456 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4445020e-d73c-4a2d-9f40-1c3fc286490e-metrics-certs podName:4445020e-d73c-4a2d-9f40-1c3fc286490e nodeName:}" failed. No retries permitted until 2026-04-17 15:17:32.584436931 +0000 UTC m=+17.126101790 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4445020e-d73c-4a2d-9f40-1c3fc286490e-metrics-certs") pod "network-metrics-daemon-j7zl6" (UID: "4445020e-d73c-4a2d-9f40-1c3fc286490e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 15:17:24.685064 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:24.684971 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zhg8b\" (UniqueName: \"kubernetes.io/projected/2f57efa0-9b15-4e70-9d38-74a517201d53-kube-api-access-zhg8b\") pod \"network-check-target-xz6xx\" (UID: \"2f57efa0-9b15-4e70-9d38-74a517201d53\") " pod="openshift-network-diagnostics/network-check-target-xz6xx"
Apr 17 15:17:24.685198 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:24.685163 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 15:17:24.685198 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:24.685184 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 15:17:24.685198 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:24.685195 2577 projected.go:194] Error preparing data for projected volume kube-api-access-zhg8b for pod openshift-network-diagnostics/network-check-target-xz6xx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 15:17:24.685490 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:24.685245 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2f57efa0-9b15-4e70-9d38-74a517201d53-kube-api-access-zhg8b podName:2f57efa0-9b15-4e70-9d38-74a517201d53 nodeName:}" failed. No retries permitted until 2026-04-17 15:17:32.685228224 +0000 UTC m=+17.226893075 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-zhg8b" (UniqueName: "kubernetes.io/projected/2f57efa0-9b15-4e70-9d38-74a517201d53-kube-api-access-zhg8b") pod "network-check-target-xz6xx" (UID: "2f57efa0-9b15-4e70-9d38-74a517201d53") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 15:17:24.987389 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:24.987313 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7db05466-6c79-496c-9e75-143b8a1a69d1-original-pull-secret\") pod \"global-pull-secret-syncer-blrs8\" (UID: \"7db05466-6c79-496c-9e75-143b8a1a69d1\") " pod="kube-system/global-pull-secret-syncer-blrs8"
Apr 17 15:17:24.987529 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:24.987506 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 15:17:24.987585 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:24.987561 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7db05466-6c79-496c-9e75-143b8a1a69d1-original-pull-secret podName:7db05466-6c79-496c-9e75-143b8a1a69d1 nodeName:}" failed. No retries permitted until 2026-04-17 15:17:25.987545086 +0000 UTC m=+10.529209936 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7db05466-6c79-496c-9e75-143b8a1a69d1-original-pull-secret") pod "global-pull-secret-syncer-blrs8" (UID: "7db05466-6c79-496c-9e75-143b8a1a69d1") : object "kube-system"/"original-pull-secret" not registered
Apr 17 15:17:25.057118 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:25.056958 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xz6xx"
Apr 17 15:17:25.057118 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:25.057087 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xz6xx" podUID="2f57efa0-9b15-4e70-9d38-74a517201d53"
Apr 17 15:17:25.994138 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:25.994102 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7db05466-6c79-496c-9e75-143b8a1a69d1-original-pull-secret\") pod \"global-pull-secret-syncer-blrs8\" (UID: \"7db05466-6c79-496c-9e75-143b8a1a69d1\") " pod="kube-system/global-pull-secret-syncer-blrs8"
Apr 17 15:17:25.994578 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:25.994222 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 15:17:25.994578 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:25.994281 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7db05466-6c79-496c-9e75-143b8a1a69d1-original-pull-secret podName:7db05466-6c79-496c-9e75-143b8a1a69d1 nodeName:}" failed. No retries permitted until 2026-04-17 15:17:27.994263859 +0000 UTC m=+12.535928705 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7db05466-6c79-496c-9e75-143b8a1a69d1-original-pull-secret") pod "global-pull-secret-syncer-blrs8" (UID: "7db05466-6c79-496c-9e75-143b8a1a69d1") : object "kube-system"/"original-pull-secret" not registered
Apr 17 15:17:26.058759 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:26.058726 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j7zl6"
Apr 17 15:17:26.058916 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:26.058892 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j7zl6" podUID="4445020e-d73c-4a2d-9f40-1c3fc286490e"
Apr 17 15:17:26.059217 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:26.059200 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-blrs8"
Apr 17 15:17:26.059325 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:26.059296 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-blrs8" podUID="7db05466-6c79-496c-9e75-143b8a1a69d1"
Apr 17 15:17:27.057656 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:27.057609 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xz6xx"
Apr 17 15:17:27.058135 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:27.057746 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xz6xx" podUID="2f57efa0-9b15-4e70-9d38-74a517201d53"
Apr 17 15:17:28.007624 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:28.007590 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7db05466-6c79-496c-9e75-143b8a1a69d1-original-pull-secret\") pod \"global-pull-secret-syncer-blrs8\" (UID: \"7db05466-6c79-496c-9e75-143b8a1a69d1\") " pod="kube-system/global-pull-secret-syncer-blrs8"
Apr 17 15:17:28.007785 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:28.007689 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 15:17:28.007785 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:28.007743 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7db05466-6c79-496c-9e75-143b8a1a69d1-original-pull-secret podName:7db05466-6c79-496c-9e75-143b8a1a69d1 nodeName:}" failed. No retries permitted until 2026-04-17 15:17:32.007726898 +0000 UTC m=+16.549391742 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7db05466-6c79-496c-9e75-143b8a1a69d1-original-pull-secret") pod "global-pull-secret-syncer-blrs8" (UID: "7db05466-6c79-496c-9e75-143b8a1a69d1") : object "kube-system"/"original-pull-secret" not registered
Apr 17 15:17:28.057432 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:28.057405 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-blrs8"
Apr 17 15:17:28.057534 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:28.057498 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-blrs8" podUID="7db05466-6c79-496c-9e75-143b8a1a69d1"
Apr 17 15:17:28.057617 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:28.057596 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j7zl6"
Apr 17 15:17:28.057756 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:28.057733 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j7zl6" podUID="4445020e-d73c-4a2d-9f40-1c3fc286490e"
Apr 17 15:17:29.057859 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:29.057819 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xz6xx"
Apr 17 15:17:29.058283 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:29.057927 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xz6xx" podUID="2f57efa0-9b15-4e70-9d38-74a517201d53"
Apr 17 15:17:30.059777 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:30.059746 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-blrs8"
Apr 17 15:17:30.059777 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:30.059772 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j7zl6"
Apr 17 15:17:30.060226 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:30.059853 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-blrs8" podUID="7db05466-6c79-496c-9e75-143b8a1a69d1"
Apr 17 15:17:30.060226 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:30.059975 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j7zl6" podUID="4445020e-d73c-4a2d-9f40-1c3fc286490e"
Apr 17 15:17:31.056925 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:31.056853 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xz6xx"
Apr 17 15:17:31.057076 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:31.056987 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xz6xx" podUID="2f57efa0-9b15-4e70-9d38-74a517201d53"
Apr 17 15:17:32.038482 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:32.038443 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7db05466-6c79-496c-9e75-143b8a1a69d1-original-pull-secret\") pod \"global-pull-secret-syncer-blrs8\" (UID: \"7db05466-6c79-496c-9e75-143b8a1a69d1\") " pod="kube-system/global-pull-secret-syncer-blrs8"
Apr 17 15:17:32.038870 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:32.038606 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 15:17:32.038870 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:32.038676 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7db05466-6c79-496c-9e75-143b8a1a69d1-original-pull-secret podName:7db05466-6c79-496c-9e75-143b8a1a69d1 nodeName:}" failed. No retries permitted until 2026-04-17 15:17:40.038660669 +0000 UTC m=+24.580325514 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7db05466-6c79-496c-9e75-143b8a1a69d1-original-pull-secret") pod "global-pull-secret-syncer-blrs8" (UID: "7db05466-6c79-496c-9e75-143b8a1a69d1") : object "kube-system"/"original-pull-secret" not registered
Apr 17 15:17:32.057052 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:32.057015 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-blrs8"
Apr 17 15:17:32.057167 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:32.057059 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j7zl6"
Apr 17 15:17:32.057210 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:32.057171 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-blrs8" podUID="7db05466-6c79-496c-9e75-143b8a1a69d1"
Apr 17 15:17:32.057330 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:32.057305 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j7zl6" podUID="4445020e-d73c-4a2d-9f40-1c3fc286490e"
Apr 17 15:17:32.642643 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:32.642603 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4445020e-d73c-4a2d-9f40-1c3fc286490e-metrics-certs\") pod \"network-metrics-daemon-j7zl6\" (UID: \"4445020e-d73c-4a2d-9f40-1c3fc286490e\") " pod="openshift-multus/network-metrics-daemon-j7zl6"
Apr 17 15:17:32.642863 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:32.642737 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 15:17:32.642863 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:32.642812 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4445020e-d73c-4a2d-9f40-1c3fc286490e-metrics-certs podName:4445020e-d73c-4a2d-9f40-1c3fc286490e nodeName:}" failed. No retries permitted until 2026-04-17 15:17:48.642795313 +0000 UTC m=+33.184460161 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4445020e-d73c-4a2d-9f40-1c3fc286490e-metrics-certs") pod "network-metrics-daemon-j7zl6" (UID: "4445020e-d73c-4a2d-9f40-1c3fc286490e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 15:17:32.743826 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:32.743783 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zhg8b\" (UniqueName: \"kubernetes.io/projected/2f57efa0-9b15-4e70-9d38-74a517201d53-kube-api-access-zhg8b\") pod \"network-check-target-xz6xx\" (UID: \"2f57efa0-9b15-4e70-9d38-74a517201d53\") " pod="openshift-network-diagnostics/network-check-target-xz6xx" Apr 17 15:17:32.743996 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:32.743952 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 15:17:32.743996 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:32.743975 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 15:17:32.743996 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:32.743986 2577 projected.go:194] Error preparing data for projected volume kube-api-access-zhg8b for pod openshift-network-diagnostics/network-check-target-xz6xx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 15:17:32.744134 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:32.744065 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2f57efa0-9b15-4e70-9d38-74a517201d53-kube-api-access-zhg8b podName:2f57efa0-9b15-4e70-9d38-74a517201d53 nodeName:}" failed. 
No retries permitted until 2026-04-17 15:17:48.744030832 +0000 UTC m=+33.285695685 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-zhg8b" (UniqueName: "kubernetes.io/projected/2f57efa0-9b15-4e70-9d38-74a517201d53-kube-api-access-zhg8b") pod "network-check-target-xz6xx" (UID: "2f57efa0-9b15-4e70-9d38-74a517201d53") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 15:17:33.057908 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:33.057825 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xz6xx" Apr 17 15:17:33.058367 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:33.057941 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xz6xx" podUID="2f57efa0-9b15-4e70-9d38-74a517201d53" Apr 17 15:17:34.059786 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:34.059762 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j7zl6" Apr 17 15:17:34.060192 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:34.059762 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-blrs8" Apr 17 15:17:34.060192 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:34.059862 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j7zl6" podUID="4445020e-d73c-4a2d-9f40-1c3fc286490e" Apr 17 15:17:34.060192 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:34.059923 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-blrs8" podUID="7db05466-6c79-496c-9e75-143b8a1a69d1" Apr 17 15:17:35.057463 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:35.057433 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xz6xx" Apr 17 15:17:35.057748 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:35.057549 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xz6xx" podUID="2f57efa0-9b15-4e70-9d38-74a517201d53" Apr 17 15:17:36.060642 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:36.060115 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j7zl6" Apr 17 15:17:36.060642 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:36.060337 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j7zl6" podUID="4445020e-d73c-4a2d-9f40-1c3fc286490e" Apr 17 15:17:36.060642 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:36.060135 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-blrs8" Apr 17 15:17:36.060642 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:36.060618 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-blrs8" podUID="7db05466-6c79-496c-9e75-143b8a1a69d1" Apr 17 15:17:36.286800 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:36.286430 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-9v7gz" event={"ID":"7aa4aa82-d5f1-423a-b9ac-13669e2b1804","Type":"ContainerStarted","Data":"a0aa69ed39a0641b90a9ff7524450edcce5d117feadbd22d9cfd30bd7908bdfc"} Apr 17 15:17:36.293615 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:36.293593 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6m42k_41a2e9d0-bfbe-47d5-9ccd-610cb5204675/ovn-acl-logging/0.log" Apr 17 15:17:36.293938 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:36.293910 2577 generic.go:358] "Generic (PLEG): container finished" podID="41a2e9d0-bfbe-47d5-9ccd-610cb5204675" containerID="0f64bacdc9f1216b81be7aa46b3b445c14b71aecb7e3e01b0f6958fc85dba768" exitCode=1 Apr 17 15:17:36.294003 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:36.293967 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6m42k" event={"ID":"41a2e9d0-bfbe-47d5-9ccd-610cb5204675","Type":"ContainerStarted","Data":"79045f6ce4c31f218ca049d7e536cf53fb913badc74ca1bcb50e0f9fc0001599"} Apr 17 
15:17:36.294003 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:36.293986 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6m42k" event={"ID":"41a2e9d0-bfbe-47d5-9ccd-610cb5204675","Type":"ContainerStarted","Data":"e780f91b484958d16c1ea9d6e204cfec1410aa36c1c2ea0fcddebb30c1320b9f"} Apr 17 15:17:36.294003 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:36.293995 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6m42k" event={"ID":"41a2e9d0-bfbe-47d5-9ccd-610cb5204675","Type":"ContainerStarted","Data":"507f7d9f11064228de57cbcd750b0c1c4a5b5481d19f4418c750f1b7b2814eb5"} Apr 17 15:17:36.294003 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:36.294003 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6m42k" event={"ID":"41a2e9d0-bfbe-47d5-9ccd-610cb5204675","Type":"ContainerDied","Data":"0f64bacdc9f1216b81be7aa46b3b445c14b71aecb7e3e01b0f6958fc85dba768"} Apr 17 15:17:36.294165 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:36.294012 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6m42k" event={"ID":"41a2e9d0-bfbe-47d5-9ccd-610cb5204675","Type":"ContainerStarted","Data":"88f6caaef678d40305be596a865ee2239993e06cb050726f9eec1724311d099f"} Apr 17 15:17:36.300772 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:36.300220 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-j8tkm" event={"ID":"859c9d28-c95a-461d-841e-f476f3fb6fb7","Type":"ContainerStarted","Data":"210fdacac99ba160d51ddd1a2367b8b9b05333963f4c43d377b5c666a2e068ca"} Apr 17 15:17:36.303863 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:36.303071 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-9v7gz" podStartSLOduration=1.968778687 podStartE2EDuration="20.303055545s" podCreationTimestamp="2026-04-17 15:17:16 +0000 UTC" 
firstStartedPulling="2026-04-17 15:17:17.342954769 +0000 UTC m=+1.884619613" lastFinishedPulling="2026-04-17 15:17:35.677231624 +0000 UTC m=+20.218896471" observedRunningTime="2026-04-17 15:17:36.302436605 +0000 UTC m=+20.844101505" watchObservedRunningTime="2026-04-17 15:17:36.303055545 +0000 UTC m=+20.844720412" Apr 17 15:17:36.305690 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:36.305668 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-29.ec2.internal" event={"ID":"1e1f1a3ec57b3b66a90c747576fdf8e1","Type":"ContainerStarted","Data":"027b42c78129850e590a52943cd324b4e41868f849898d9b4e66eb94d919acd9"} Apr 17 15:17:36.319049 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:36.318981 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-j8tkm" podStartSLOduration=1.6376702829999998 podStartE2EDuration="20.318962514s" podCreationTimestamp="2026-04-17 15:17:16 +0000 UTC" firstStartedPulling="2026-04-17 15:17:17.23420354 +0000 UTC m=+1.775868385" lastFinishedPulling="2026-04-17 15:17:35.915495758 +0000 UTC m=+20.457160616" observedRunningTime="2026-04-17 15:17:36.318238222 +0000 UTC m=+20.859903091" watchObservedRunningTime="2026-04-17 15:17:36.318962514 +0000 UTC m=+20.860627437" Apr 17 15:17:36.330243 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:36.330202 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-29.ec2.internal" podStartSLOduration=20.330189767 podStartE2EDuration="20.330189767s" podCreationTimestamp="2026-04-17 15:17:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 15:17:36.330174654 +0000 UTC m=+20.871839535" watchObservedRunningTime="2026-04-17 15:17:36.330189767 +0000 UTC m=+20.871854630" Apr 17 15:17:37.057439 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:37.057405 2577 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xz6xx" Apr 17 15:17:37.057593 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:37.057526 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xz6xx" podUID="2f57efa0-9b15-4e70-9d38-74a517201d53" Apr 17 15:17:37.308545 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:37.308511 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4x45m" event={"ID":"32385a04-2774-4f48-af2e-36e3bb20d368","Type":"ContainerStarted","Data":"4234213b918ba712abebccfd8f6b6eafa6d6a3c15fa6bfa7bcc1fc4fee0340ca"} Apr 17 15:17:37.309748 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:37.309727 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-xc7q5" event={"ID":"a55a9866-dc90-424d-aefb-be85c6ce02cb","Type":"ContainerStarted","Data":"ada84ac6bde9323f1615870b665ecd9314294360cc6c82db8c85b2586faf990d"} Apr 17 15:17:37.311073 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:37.311050 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-cx2v6" event={"ID":"879a1087-ad81-4931-a7fc-1d30c4f2539d","Type":"ContainerStarted","Data":"4c6fbf5b5dd74e7e17cb76ad165a99ff0cde1e7e516d86c33341602df19394b6"} Apr 17 15:17:37.312313 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:37.312288 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-ljhqf" event={"ID":"131fc208-4c4f-4581-b543-a9f317f71657","Type":"ContainerStarted","Data":"22f07c6926f7a67d3bd9e75d117438818bfe4d73d7dfd2ecb258793cda60c9be"} Apr 17 
15:17:37.313599 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:37.313573 2577 generic.go:358] "Generic (PLEG): container finished" podID="15b0834749502abab65480e68ca33cd4" containerID="0b9ce3a18bcf6ad8b7534700cd31de4718a54a84852e4d343f0f36d7672584c8" exitCode=0 Apr 17 15:17:37.313704 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:37.313613 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-29.ec2.internal" event={"ID":"15b0834749502abab65480e68ca33cd4","Type":"ContainerDied","Data":"0b9ce3a18bcf6ad8b7534700cd31de4718a54a84852e4d343f0f36d7672584c8"} Apr 17 15:17:37.315019 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:37.314999 2577 generic.go:358] "Generic (PLEG): container finished" podID="14d69252-8d9f-46ec-8e05-0b0a8f1b3b07" containerID="8fccc8daa2f58701b8c276e7bf4229a241482df2269457befd9956676a5af9b8" exitCode=0 Apr 17 15:17:37.315110 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:37.315083 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7jdw8" event={"ID":"14d69252-8d9f-46ec-8e05-0b0a8f1b3b07","Type":"ContainerDied","Data":"8fccc8daa2f58701b8c276e7bf4229a241482df2269457befd9956676a5af9b8"} Apr 17 15:17:37.316363 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:37.316345 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8hx9p" event={"ID":"14c11c01-24f2-4908-a3cb-5c90f9ec8d35","Type":"ContainerStarted","Data":"ad8ce5771cfbfdb4fa84c508a7c48856981bdff536aa665693706326a8306dac"} Apr 17 15:17:37.318961 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:37.318945 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6m42k_41a2e9d0-bfbe-47d5-9ccd-610cb5204675/ovn-acl-logging/0.log" Apr 17 15:17:37.319284 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:37.319265 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-6m42k" event={"ID":"41a2e9d0-bfbe-47d5-9ccd-610cb5204675","Type":"ContainerStarted","Data":"d1e9c6c576b0e76b558b306331d62ffea10826019d731165072a6f47a990da84"} Apr 17 15:17:37.323855 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:37.323819 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-xc7q5" podStartSLOduration=2.940385514 podStartE2EDuration="21.323807699s" podCreationTimestamp="2026-04-17 15:17:16 +0000 UTC" firstStartedPulling="2026-04-17 15:17:17.293806233 +0000 UTC m=+1.835471077" lastFinishedPulling="2026-04-17 15:17:35.677228417 +0000 UTC m=+20.218893262" observedRunningTime="2026-04-17 15:17:37.323382015 +0000 UTC m=+21.865046881" watchObservedRunningTime="2026-04-17 15:17:37.323807699 +0000 UTC m=+21.865472566" Apr 17 15:17:37.353292 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:37.353249 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-cx2v6" podStartSLOduration=2.888115699 podStartE2EDuration="21.353236383s" podCreationTimestamp="2026-04-17 15:17:16 +0000 UTC" firstStartedPulling="2026-04-17 15:17:17.278301661 +0000 UTC m=+1.819966505" lastFinishedPulling="2026-04-17 15:17:35.74342233 +0000 UTC m=+20.285087189" observedRunningTime="2026-04-17 15:17:37.352999635 +0000 UTC m=+21.894664502" watchObservedRunningTime="2026-04-17 15:17:37.353236383 +0000 UTC m=+21.894901247" Apr 17 15:17:37.368404 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:37.368351 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-ljhqf" podStartSLOduration=2.942863423 podStartE2EDuration="21.36833501s" podCreationTimestamp="2026-04-17 15:17:16 +0000 UTC" firstStartedPulling="2026-04-17 15:17:17.251759056 +0000 UTC m=+1.793423904" lastFinishedPulling="2026-04-17 15:17:35.677230645 +0000 UTC m=+20.218895491" observedRunningTime="2026-04-17 
15:17:37.367556683 +0000 UTC m=+21.909221550" watchObservedRunningTime="2026-04-17 15:17:37.36833501 +0000 UTC m=+21.909999878" Apr 17 15:17:37.404715 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:37.404675 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-8hx9p" podStartSLOduration=2.996652649 podStartE2EDuration="21.404659145s" podCreationTimestamp="2026-04-17 15:17:16 +0000 UTC" firstStartedPulling="2026-04-17 15:17:17.366511028 +0000 UTC m=+1.908175873" lastFinishedPulling="2026-04-17 15:17:35.774517518 +0000 UTC m=+20.316182369" observedRunningTime="2026-04-17 15:17:37.381155473 +0000 UTC m=+21.922820339" watchObservedRunningTime="2026-04-17 15:17:37.404659145 +0000 UTC m=+21.946324011" Apr 17 15:17:37.452002 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:37.451854 2577 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 15:17:37.976800 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:37.976697 2577 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T15:17:37.451998701Z","UUID":"60a7d6e2-d82e-4607-a490-360d1130b093","Handler":null,"Name":"","Endpoint":""} Apr 17 15:17:37.978869 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:37.978843 2577 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 15:17:37.978869 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:37.978876 2577 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 15:17:38.057174 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:38.057142 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-blrs8" Apr 17 15:17:38.057335 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:38.057182 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j7zl6" Apr 17 15:17:38.057335 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:38.057261 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-blrs8" podUID="7db05466-6c79-496c-9e75-143b8a1a69d1" Apr 17 15:17:38.057454 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:38.057413 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j7zl6" podUID="4445020e-d73c-4a2d-9f40-1c3fc286490e" Apr 17 15:17:38.326127 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:38.326073 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4x45m" event={"ID":"32385a04-2774-4f48-af2e-36e3bb20d368","Type":"ContainerStarted","Data":"0e67134dd20f6a3cfe2ea62acf06d98170ad00d706d9758b12812efc9c74ea97"} Apr 17 15:17:38.328028 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:38.327975 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-29.ec2.internal" event={"ID":"15b0834749502abab65480e68ca33cd4","Type":"ContainerStarted","Data":"dbbe189f90bafb8ffe88147f8606d3cb6a5afb0abedc2815656cffc11446faf8"} Apr 17 15:17:38.341943 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:38.341892 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-29.ec2.internal" podStartSLOduration=22.341874079 podStartE2EDuration="22.341874079s" podCreationTimestamp="2026-04-17 15:17:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 15:17:38.341225801 +0000 UTC m=+22.882890668" watchObservedRunningTime="2026-04-17 15:17:38.341874079 +0000 UTC m=+22.883538948" Apr 17 15:17:39.057835 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:39.057804 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xz6xx" Apr 17 15:17:39.058030 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:39.057924 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xz6xx" podUID="2f57efa0-9b15-4e70-9d38-74a517201d53" Apr 17 15:17:39.333169 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:39.333142 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6m42k_41a2e9d0-bfbe-47d5-9ccd-610cb5204675/ovn-acl-logging/0.log" Apr 17 15:17:39.333665 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:39.333486 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6m42k" event={"ID":"41a2e9d0-bfbe-47d5-9ccd-610cb5204675","Type":"ContainerStarted","Data":"248fdbf621d7b42d45492ac5891aab740cd82c89d491b74e936f5d6229a30376"} Apr 17 15:17:39.335421 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:39.335392 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4x45m" event={"ID":"32385a04-2774-4f48-af2e-36e3bb20d368","Type":"ContainerStarted","Data":"5d5ab57bbdb8afcc6bfac98bb63d96a2705a0cef7759b6bd9f7aa247ff8be84c"} Apr 17 15:17:39.350540 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:39.350497 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4x45m" podStartSLOduration=2.274617143 podStartE2EDuration="23.350487329s" podCreationTimestamp="2026-04-17 15:17:16 +0000 UTC" firstStartedPulling="2026-04-17 15:17:17.320670443 +0000 UTC m=+1.862335293" lastFinishedPulling="2026-04-17 15:17:38.396540619 +0000 UTC 
m=+22.938205479" observedRunningTime="2026-04-17 15:17:39.35035642 +0000 UTC m=+23.892021287" watchObservedRunningTime="2026-04-17 15:17:39.350487329 +0000 UTC m=+23.892152215" Apr 17 15:17:40.057132 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:40.057096 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j7zl6" Apr 17 15:17:40.057317 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:40.057162 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-blrs8" Apr 17 15:17:40.057317 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:40.057276 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-blrs8" podUID="7db05466-6c79-496c-9e75-143b8a1a69d1" Apr 17 15:17:40.057473 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:40.057427 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j7zl6" podUID="4445020e-d73c-4a2d-9f40-1c3fc286490e" Apr 17 15:17:40.105111 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:40.105083 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7db05466-6c79-496c-9e75-143b8a1a69d1-original-pull-secret\") pod \"global-pull-secret-syncer-blrs8\" (UID: \"7db05466-6c79-496c-9e75-143b8a1a69d1\") " pod="kube-system/global-pull-secret-syncer-blrs8" Apr 17 15:17:40.105258 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:40.105223 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 15:17:40.105300 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:40.105285 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7db05466-6c79-496c-9e75-143b8a1a69d1-original-pull-secret podName:7db05466-6c79-496c-9e75-143b8a1a69d1 nodeName:}" failed. No retries permitted until 2026-04-17 15:17:56.105266504 +0000 UTC m=+40.646931350 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7db05466-6c79-496c-9e75-143b8a1a69d1-original-pull-secret") pod "global-pull-secret-syncer-blrs8" (UID: "7db05466-6c79-496c-9e75-143b8a1a69d1") : object "kube-system"/"original-pull-secret" not registered Apr 17 15:17:41.057825 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:41.057609 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xz6xx" Apr 17 15:17:41.058611 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:41.057925 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xz6xx" podUID="2f57efa0-9b15-4e70-9d38-74a517201d53" Apr 17 15:17:41.343128 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:41.343094 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6m42k_41a2e9d0-bfbe-47d5-9ccd-610cb5204675/ovn-acl-logging/0.log" Apr 17 15:17:41.343484 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:41.343458 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6m42k" event={"ID":"41a2e9d0-bfbe-47d5-9ccd-610cb5204675","Type":"ContainerStarted","Data":"13bd051116e57cca2b84099c7b904dfbfed4f26d20fe32fe2aeeacac1a12a944"} Apr 17 15:17:41.343821 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:41.343802 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-6m42k" Apr 17 15:17:41.343899 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:41.343832 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-6m42k" Apr 17 15:17:41.344153 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:41.344109 2577 scope.go:117] "RemoveContainer" containerID="0f64bacdc9f1216b81be7aa46b3b445c14b71aecb7e3e01b0f6958fc85dba768" Apr 17 15:17:41.361000 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:41.360971 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6m42k" Apr 17 
15:17:41.578822 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:41.578772 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-ljhqf" Apr 17 15:17:41.579418 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:41.579389 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-ljhqf" Apr 17 15:17:42.057342 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:42.057311 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-blrs8" Apr 17 15:17:42.057511 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:42.057480 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-blrs8" podUID="7db05466-6c79-496c-9e75-143b8a1a69d1" Apr 17 15:17:42.057893 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:42.057875 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j7zl6" Apr 17 15:17:42.058231 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:42.057993 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j7zl6" podUID="4445020e-d73c-4a2d-9f40-1c3fc286490e" Apr 17 15:17:42.345748 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:42.345722 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-ljhqf" Apr 17 15:17:42.346446 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:42.346427 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-ljhqf" Apr 17 15:17:42.541065 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:42.540889 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-j7zl6"] Apr 17 15:17:42.541180 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:42.541144 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j7zl6" Apr 17 15:17:42.541245 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:42.541229 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j7zl6" podUID="4445020e-d73c-4a2d-9f40-1c3fc286490e" Apr 17 15:17:42.541481 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:42.541460 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-blrs8"] Apr 17 15:17:42.541561 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:42.541547 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-blrs8" Apr 17 15:17:42.541672 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:42.541655 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-blrs8" podUID="7db05466-6c79-496c-9e75-143b8a1a69d1" Apr 17 15:17:42.556281 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:42.556258 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-xz6xx"] Apr 17 15:17:42.556390 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:42.556362 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xz6xx" Apr 17 15:17:42.556461 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:42.556439 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xz6xx" podUID="2f57efa0-9b15-4e70-9d38-74a517201d53" Apr 17 15:17:43.348835 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:43.348804 2577 generic.go:358] "Generic (PLEG): container finished" podID="14d69252-8d9f-46ec-8e05-0b0a8f1b3b07" containerID="43a5a51da7f720bea66cc7b21e3c2650f83de0c4698126f9b2b258428c9b3c8a" exitCode=0 Apr 17 15:17:43.349293 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:43.348876 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7jdw8" event={"ID":"14d69252-8d9f-46ec-8e05-0b0a8f1b3b07","Type":"ContainerDied","Data":"43a5a51da7f720bea66cc7b21e3c2650f83de0c4698126f9b2b258428c9b3c8a"} Apr 17 15:17:43.352077 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:43.352062 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6m42k_41a2e9d0-bfbe-47d5-9ccd-610cb5204675/ovn-acl-logging/0.log" Apr 17 15:17:43.352450 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:43.352426 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6m42k" event={"ID":"41a2e9d0-bfbe-47d5-9ccd-610cb5204675","Type":"ContainerStarted","Data":"64be561422d130ebdffccbf1e6066caff4086a6230eead505d65bede34644b18"} Apr 17 15:17:43.352640 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:43.352625 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-6m42k" Apr 17 15:17:43.367313 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:43.367283 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6m42k" Apr 17 15:17:43.395762 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:43.395726 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-6m42k" podStartSLOduration=8.919232482 podStartE2EDuration="27.395715242s" 
podCreationTimestamp="2026-04-17 15:17:16 +0000 UTC" firstStartedPulling="2026-04-17 15:17:17.305547919 +0000 UTC m=+1.847212764" lastFinishedPulling="2026-04-17 15:17:35.782030678 +0000 UTC m=+20.323695524" observedRunningTime="2026-04-17 15:17:43.395320077 +0000 UTC m=+27.936984944" watchObservedRunningTime="2026-04-17 15:17:43.395715242 +0000 UTC m=+27.937380108" Apr 17 15:17:44.057420 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:44.057386 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-blrs8" Apr 17 15:17:44.057420 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:44.057413 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xz6xx" Apr 17 15:17:44.057662 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:44.057419 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j7zl6" Apr 17 15:17:44.057662 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:44.057515 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-blrs8" podUID="7db05466-6c79-496c-9e75-143b8a1a69d1" Apr 17 15:17:44.057662 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:44.057585 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xz6xx" podUID="2f57efa0-9b15-4e70-9d38-74a517201d53" Apr 17 15:17:44.057774 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:44.057701 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j7zl6" podUID="4445020e-d73c-4a2d-9f40-1c3fc286490e" Apr 17 15:17:45.360171 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:45.360136 2577 generic.go:358] "Generic (PLEG): container finished" podID="14d69252-8d9f-46ec-8e05-0b0a8f1b3b07" containerID="aa0b694508c1f2f9b3df5440217a010323a04f618370507df1fba50b573ae4dc" exitCode=0 Apr 17 15:17:45.360854 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:45.360222 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7jdw8" event={"ID":"14d69252-8d9f-46ec-8e05-0b0a8f1b3b07","Type":"ContainerDied","Data":"aa0b694508c1f2f9b3df5440217a010323a04f618370507df1fba50b573ae4dc"} Apr 17 15:17:45.371301 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:45.371262 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-6m42k" podUID="41a2e9d0-bfbe-47d5-9ccd-610cb5204675" containerName="ovnkube-controller" probeResult="failure" output="" Apr 17 15:17:46.060774 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:46.060753 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j7zl6" Apr 17 15:17:46.060929 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:46.060753 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xz6xx" Apr 17 15:17:46.060929 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:46.060846 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j7zl6" podUID="4445020e-d73c-4a2d-9f40-1c3fc286490e" Apr 17 15:17:46.060929 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:46.060906 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xz6xx" podUID="2f57efa0-9b15-4e70-9d38-74a517201d53" Apr 17 15:17:46.060929 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:46.060753 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-blrs8" Apr 17 15:17:46.061075 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:46.060998 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-blrs8" podUID="7db05466-6c79-496c-9e75-143b8a1a69d1" Apr 17 15:17:46.364502 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:46.364475 2577 generic.go:358] "Generic (PLEG): container finished" podID="14d69252-8d9f-46ec-8e05-0b0a8f1b3b07" containerID="f370a4a30a827d57e25b7c093f181e39f45ed5e26bc65ecb65cd9137c2de78ea" exitCode=0 Apr 17 15:17:46.364954 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:46.364528 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7jdw8" event={"ID":"14d69252-8d9f-46ec-8e05-0b0a8f1b3b07","Type":"ContainerDied","Data":"f370a4a30a827d57e25b7c093f181e39f45ed5e26bc65ecb65cd9137c2de78ea"} Apr 17 15:17:48.060484 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:48.060452 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xz6xx" Apr 17 15:17:48.060985 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:48.060574 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j7zl6" Apr 17 15:17:48.060985 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:48.060584 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xz6xx" podUID="2f57efa0-9b15-4e70-9d38-74a517201d53" Apr 17 15:17:48.060985 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:48.060596 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-blrs8" Apr 17 15:17:48.060985 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:48.060677 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j7zl6" podUID="4445020e-d73c-4a2d-9f40-1c3fc286490e" Apr 17 15:17:48.060985 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:48.060737 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-blrs8" podUID="7db05466-6c79-496c-9e75-143b8a1a69d1" Apr 17 15:17:48.675992 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:48.675953 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4445020e-d73c-4a2d-9f40-1c3fc286490e-metrics-certs\") pod \"network-metrics-daemon-j7zl6\" (UID: \"4445020e-d73c-4a2d-9f40-1c3fc286490e\") " pod="openshift-multus/network-metrics-daemon-j7zl6" Apr 17 15:17:48.676169 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:48.676123 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 15:17:48.676228 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:48.676190 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4445020e-d73c-4a2d-9f40-1c3fc286490e-metrics-certs podName:4445020e-d73c-4a2d-9f40-1c3fc286490e nodeName:}" failed. 
No retries permitted until 2026-04-17 15:18:20.676174081 +0000 UTC m=+65.217838926 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4445020e-d73c-4a2d-9f40-1c3fc286490e-metrics-certs") pod "network-metrics-daemon-j7zl6" (UID: "4445020e-d73c-4a2d-9f40-1c3fc286490e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 15:17:48.776914 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:48.776875 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zhg8b\" (UniqueName: \"kubernetes.io/projected/2f57efa0-9b15-4e70-9d38-74a517201d53-kube-api-access-zhg8b\") pod \"network-check-target-xz6xx\" (UID: \"2f57efa0-9b15-4e70-9d38-74a517201d53\") " pod="openshift-network-diagnostics/network-check-target-xz6xx" Apr 17 15:17:48.777112 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:48.777009 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 15:17:48.777112 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:48.777027 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 15:17:48.777112 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:48.777061 2577 projected.go:194] Error preparing data for projected volume kube-api-access-zhg8b for pod openshift-network-diagnostics/network-check-target-xz6xx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 15:17:48.777112 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:48.777114 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2f57efa0-9b15-4e70-9d38-74a517201d53-kube-api-access-zhg8b 
podName:2f57efa0-9b15-4e70-9d38-74a517201d53 nodeName:}" failed. No retries permitted until 2026-04-17 15:18:20.777100527 +0000 UTC m=+65.318765372 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-zhg8b" (UniqueName: "kubernetes.io/projected/2f57efa0-9b15-4e70-9d38-74a517201d53-kube-api-access-zhg8b") pod "network-check-target-xz6xx" (UID: "2f57efa0-9b15-4e70-9d38-74a517201d53") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 15:17:48.810189 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:48.810130 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-29.ec2.internal" event="NodeReady" Apr 17 15:17:48.810313 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:48.810244 2577 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 15:17:48.843822 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:48.843799 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-997f69ccf-rnb69"] Apr 17 15:17:48.845822 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:48.845804 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-997f69ccf-rnb69" Apr 17 15:17:48.848021 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:48.848002 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 15:17:48.848139 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:48.848021 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 15:17:48.848200 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:48.848148 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 15:17:48.848273 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:48.848254 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-5jck6\"" Apr 17 15:17:48.854156 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:48.854113 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 15:17:48.854378 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:48.854358 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-dl2zj"] Apr 17 15:17:48.855919 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:48.855899 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-2vpk9"] Apr 17 15:17:48.856227 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:48.856188 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dl2zj" Apr 17 15:17:48.857790 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:48.857773 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-2vpk9" Apr 17 15:17:48.858392 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:48.858372 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 15:17:48.858471 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:48.858406 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 15:17:48.858655 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:48.858637 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 15:17:48.858730 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:48.858685 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-wwl79\"" Apr 17 15:17:48.859628 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:48.859600 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 15:17:48.859821 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:48.859806 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-p8skx\"" Apr 17 15:17:48.860241 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:48.860074 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 15:17:48.860904 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:48.860886 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-997f69ccf-rnb69"] Apr 17 15:17:48.867156 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:48.866499 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-2vpk9"] Apr 17 15:17:48.873392 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:48.873371 2577 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dl2zj"] Apr 17 15:17:48.978533 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:48.978508 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-registry-tls\") pod \"image-registry-997f69ccf-rnb69\" (UID: \"74f902bb-0a1c-46ed-bb9a-32e7a254c7b6\") " pod="openshift-image-registry/image-registry-997f69ccf-rnb69" Apr 17 15:17:48.978665 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:48.978545 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-registry-certificates\") pod \"image-registry-997f69ccf-rnb69\" (UID: \"74f902bb-0a1c-46ed-bb9a-32e7a254c7b6\") " pod="openshift-image-registry/image-registry-997f69ccf-rnb69" Apr 17 15:17:48.978665 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:48.978575 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-trusted-ca\") pod \"image-registry-997f69ccf-rnb69\" (UID: \"74f902bb-0a1c-46ed-bb9a-32e7a254c7b6\") " pod="openshift-image-registry/image-registry-997f69ccf-rnb69" Apr 17 15:17:48.978665 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:48.978597 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4f1d2ee5-9f9b-4086-afee-0e043df76f02-tmp-dir\") pod \"dns-default-2vpk9\" (UID: \"4f1d2ee5-9f9b-4086-afee-0e043df76f02\") " pod="openshift-dns/dns-default-2vpk9" Apr 17 15:17:48.978665 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:48.978650 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4f1d2ee5-9f9b-4086-afee-0e043df76f02-metrics-tls\") pod \"dns-default-2vpk9\" (UID: \"4f1d2ee5-9f9b-4086-afee-0e043df76f02\") " pod="openshift-dns/dns-default-2vpk9" Apr 17 15:17:48.978837 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:48.978694 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-image-registry-private-configuration\") pod \"image-registry-997f69ccf-rnb69\" (UID: \"74f902bb-0a1c-46ed-bb9a-32e7a254c7b6\") " pod="openshift-image-registry/image-registry-997f69ccf-rnb69" Apr 17 15:17:48.978837 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:48.978730 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbbrw\" (UniqueName: \"kubernetes.io/projected/4f1d2ee5-9f9b-4086-afee-0e043df76f02-kube-api-access-xbbrw\") pod \"dns-default-2vpk9\" (UID: \"4f1d2ee5-9f9b-4086-afee-0e043df76f02\") " pod="openshift-dns/dns-default-2vpk9" Apr 17 15:17:48.978837 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:48.978774 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-ca-trust-extracted\") pod \"image-registry-997f69ccf-rnb69\" (UID: \"74f902bb-0a1c-46ed-bb9a-32e7a254c7b6\") " pod="openshift-image-registry/image-registry-997f69ccf-rnb69" Apr 17 15:17:48.978986 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:48.978840 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5d6jc\" (UniqueName: \"kubernetes.io/projected/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-kube-api-access-5d6jc\") pod \"image-registry-997f69ccf-rnb69\" (UID: \"74f902bb-0a1c-46ed-bb9a-32e7a254c7b6\") " 
pod="openshift-image-registry/image-registry-997f69ccf-rnb69" Apr 17 15:17:48.978986 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:48.978874 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4f1d2ee5-9f9b-4086-afee-0e043df76f02-config-volume\") pod \"dns-default-2vpk9\" (UID: \"4f1d2ee5-9f9b-4086-afee-0e043df76f02\") " pod="openshift-dns/dns-default-2vpk9" Apr 17 15:17:48.978986 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:48.978914 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-installation-pull-secrets\") pod \"image-registry-997f69ccf-rnb69\" (UID: \"74f902bb-0a1c-46ed-bb9a-32e7a254c7b6\") " pod="openshift-image-registry/image-registry-997f69ccf-rnb69" Apr 17 15:17:48.978986 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:48.978958 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05352b16-4fb2-4f4e-894f-d69b17f92924-cert\") pod \"ingress-canary-dl2zj\" (UID: \"05352b16-4fb2-4f4e-894f-d69b17f92924\") " pod="openshift-ingress-canary/ingress-canary-dl2zj" Apr 17 15:17:48.978986 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:48.978982 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-bound-sa-token\") pod \"image-registry-997f69ccf-rnb69\" (UID: \"74f902bb-0a1c-46ed-bb9a-32e7a254c7b6\") " pod="openshift-image-registry/image-registry-997f69ccf-rnb69" Apr 17 15:17:48.979242 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:48.979071 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4wgx\" (UniqueName: 
\"kubernetes.io/projected/05352b16-4fb2-4f4e-894f-d69b17f92924-kube-api-access-m4wgx\") pod \"ingress-canary-dl2zj\" (UID: \"05352b16-4fb2-4f4e-894f-d69b17f92924\") " pod="openshift-ingress-canary/ingress-canary-dl2zj"
Apr 17 15:17:49.080185 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:49.080160 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-installation-pull-secrets\") pod \"image-registry-997f69ccf-rnb69\" (UID: \"74f902bb-0a1c-46ed-bb9a-32e7a254c7b6\") " pod="openshift-image-registry/image-registry-997f69ccf-rnb69"
Apr 17 15:17:49.080559 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:49.080213 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05352b16-4fb2-4f4e-894f-d69b17f92924-cert\") pod \"ingress-canary-dl2zj\" (UID: \"05352b16-4fb2-4f4e-894f-d69b17f92924\") " pod="openshift-ingress-canary/ingress-canary-dl2zj"
Apr 17 15:17:49.080559 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:49.080239 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-bound-sa-token\") pod \"image-registry-997f69ccf-rnb69\" (UID: \"74f902bb-0a1c-46ed-bb9a-32e7a254c7b6\") " pod="openshift-image-registry/image-registry-997f69ccf-rnb69"
Apr 17 15:17:49.080559 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:49.080293 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m4wgx\" (UniqueName: \"kubernetes.io/projected/05352b16-4fb2-4f4e-894f-d69b17f92924-kube-api-access-m4wgx\") pod \"ingress-canary-dl2zj\" (UID: \"05352b16-4fb2-4f4e-894f-d69b17f92924\") " pod="openshift-ingress-canary/ingress-canary-dl2zj"
Apr 17 15:17:49.080559 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:49.080317 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-registry-tls\") pod \"image-registry-997f69ccf-rnb69\" (UID: \"74f902bb-0a1c-46ed-bb9a-32e7a254c7b6\") " pod="openshift-image-registry/image-registry-997f69ccf-rnb69"
Apr 17 15:17:49.080559 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:49.080342 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-registry-certificates\") pod \"image-registry-997f69ccf-rnb69\" (UID: \"74f902bb-0a1c-46ed-bb9a-32e7a254c7b6\") " pod="openshift-image-registry/image-registry-997f69ccf-rnb69"
Apr 17 15:17:49.080559 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:49.080370 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-trusted-ca\") pod \"image-registry-997f69ccf-rnb69\" (UID: \"74f902bb-0a1c-46ed-bb9a-32e7a254c7b6\") " pod="openshift-image-registry/image-registry-997f69ccf-rnb69"
Apr 17 15:17:49.080559 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:49.080388 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 15:17:49.080559 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:49.080415 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 15:17:49.080559 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:49.080433 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-997f69ccf-rnb69: secret "image-registry-tls" not found
Apr 17 15:17:49.080559 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:49.080448 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05352b16-4fb2-4f4e-894f-d69b17f92924-cert podName:05352b16-4fb2-4f4e-894f-d69b17f92924 nodeName:}" failed. No retries permitted until 2026-04-17 15:17:49.58042936 +0000 UTC m=+34.122094204 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/05352b16-4fb2-4f4e-894f-d69b17f92924-cert") pod "ingress-canary-dl2zj" (UID: "05352b16-4fb2-4f4e-894f-d69b17f92924") : secret "canary-serving-cert" not found
Apr 17 15:17:49.080559 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:49.080489 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-registry-tls podName:74f902bb-0a1c-46ed-bb9a-32e7a254c7b6 nodeName:}" failed. No retries permitted until 2026-04-17 15:17:49.580471712 +0000 UTC m=+34.122136563 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-registry-tls") pod "image-registry-997f69ccf-rnb69" (UID: "74f902bb-0a1c-46ed-bb9a-32e7a254c7b6") : secret "image-registry-tls" not found
Apr 17 15:17:49.080559 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:49.080393 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4f1d2ee5-9f9b-4086-afee-0e043df76f02-tmp-dir\") pod \"dns-default-2vpk9\" (UID: \"4f1d2ee5-9f9b-4086-afee-0e043df76f02\") " pod="openshift-dns/dns-default-2vpk9"
Apr 17 15:17:49.080559 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:49.080558 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4f1d2ee5-9f9b-4086-afee-0e043df76f02-metrics-tls\") pod \"dns-default-2vpk9\" (UID: \"4f1d2ee5-9f9b-4086-afee-0e043df76f02\") " pod="openshift-dns/dns-default-2vpk9"
Apr 17 15:17:49.081166 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:49.080593 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-image-registry-private-configuration\") pod \"image-registry-997f69ccf-rnb69\" (UID: \"74f902bb-0a1c-46ed-bb9a-32e7a254c7b6\") " pod="openshift-image-registry/image-registry-997f69ccf-rnb69"
Apr 17 15:17:49.081166 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:49.080622 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xbbrw\" (UniqueName: \"kubernetes.io/projected/4f1d2ee5-9f9b-4086-afee-0e043df76f02-kube-api-access-xbbrw\") pod \"dns-default-2vpk9\" (UID: \"4f1d2ee5-9f9b-4086-afee-0e043df76f02\") " pod="openshift-dns/dns-default-2vpk9"
Apr 17 15:17:49.081166 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:49.080655 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-ca-trust-extracted\") pod \"image-registry-997f69ccf-rnb69\" (UID: \"74f902bb-0a1c-46ed-bb9a-32e7a254c7b6\") " pod="openshift-image-registry/image-registry-997f69ccf-rnb69"
Apr 17 15:17:49.081166 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:49.080686 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4f1d2ee5-9f9b-4086-afee-0e043df76f02-tmp-dir\") pod \"dns-default-2vpk9\" (UID: \"4f1d2ee5-9f9b-4086-afee-0e043df76f02\") " pod="openshift-dns/dns-default-2vpk9"
Apr 17 15:17:49.081166 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:49.080698 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5d6jc\" (UniqueName: \"kubernetes.io/projected/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-kube-api-access-5d6jc\") pod \"image-registry-997f69ccf-rnb69\" (UID: \"74f902bb-0a1c-46ed-bb9a-32e7a254c7b6\") " pod="openshift-image-registry/image-registry-997f69ccf-rnb69"
Apr 17 15:17:49.081166 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:49.080735 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4f1d2ee5-9f9b-4086-afee-0e043df76f02-config-volume\") pod \"dns-default-2vpk9\" (UID: \"4f1d2ee5-9f9b-4086-afee-0e043df76f02\") " pod="openshift-dns/dns-default-2vpk9"
Apr 17 15:17:49.081166 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:49.081056 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 15:17:49.081166 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:49.081067 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-registry-certificates\") pod \"image-registry-997f69ccf-rnb69\" (UID: \"74f902bb-0a1c-46ed-bb9a-32e7a254c7b6\") " pod="openshift-image-registry/image-registry-997f69ccf-rnb69"
Apr 17 15:17:49.081166 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:49.081108 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f1d2ee5-9f9b-4086-afee-0e043df76f02-metrics-tls podName:4f1d2ee5-9f9b-4086-afee-0e043df76f02 nodeName:}" failed. No retries permitted until 2026-04-17 15:17:49.581091563 +0000 UTC m=+34.122756410 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4f1d2ee5-9f9b-4086-afee-0e043df76f02-metrics-tls") pod "dns-default-2vpk9" (UID: "4f1d2ee5-9f9b-4086-afee-0e043df76f02") : secret "dns-default-metrics-tls" not found
Apr 17 15:17:49.081523 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:49.081233 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4f1d2ee5-9f9b-4086-afee-0e043df76f02-config-volume\") pod \"dns-default-2vpk9\" (UID: \"4f1d2ee5-9f9b-4086-afee-0e043df76f02\") " pod="openshift-dns/dns-default-2vpk9"
Apr 17 15:17:49.081523 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:49.081282 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-ca-trust-extracted\") pod \"image-registry-997f69ccf-rnb69\" (UID: \"74f902bb-0a1c-46ed-bb9a-32e7a254c7b6\") " pod="openshift-image-registry/image-registry-997f69ccf-rnb69"
Apr 17 15:17:49.082123 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:49.082074 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-trusted-ca\") pod \"image-registry-997f69ccf-rnb69\" (UID: \"74f902bb-0a1c-46ed-bb9a-32e7a254c7b6\") " pod="openshift-image-registry/image-registry-997f69ccf-rnb69"
Apr 17 15:17:49.084787 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:49.084766 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-installation-pull-secrets\") pod \"image-registry-997f69ccf-rnb69\" (UID: \"74f902bb-0a1c-46ed-bb9a-32e7a254c7b6\") " pod="openshift-image-registry/image-registry-997f69ccf-rnb69"
Apr 17 15:17:49.084877 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:49.084772 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-image-registry-private-configuration\") pod \"image-registry-997f69ccf-rnb69\" (UID: \"74f902bb-0a1c-46ed-bb9a-32e7a254c7b6\") " pod="openshift-image-registry/image-registry-997f69ccf-rnb69"
Apr 17 15:17:49.090441 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:49.090228 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5d6jc\" (UniqueName: \"kubernetes.io/projected/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-kube-api-access-5d6jc\") pod \"image-registry-997f69ccf-rnb69\" (UID: \"74f902bb-0a1c-46ed-bb9a-32e7a254c7b6\") " pod="openshift-image-registry/image-registry-997f69ccf-rnb69"
Apr 17 15:17:49.090595 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:49.090478 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4wgx\" (UniqueName: \"kubernetes.io/projected/05352b16-4fb2-4f4e-894f-d69b17f92924-kube-api-access-m4wgx\") pod \"ingress-canary-dl2zj\" (UID: \"05352b16-4fb2-4f4e-894f-d69b17f92924\") " pod="openshift-ingress-canary/ingress-canary-dl2zj"
Apr 17 15:17:49.090595 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:49.090521 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbbrw\" (UniqueName: \"kubernetes.io/projected/4f1d2ee5-9f9b-4086-afee-0e043df76f02-kube-api-access-xbbrw\") pod \"dns-default-2vpk9\" (UID: \"4f1d2ee5-9f9b-4086-afee-0e043df76f02\") " pod="openshift-dns/dns-default-2vpk9"
Apr 17 15:17:49.091923 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:49.091903 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-bound-sa-token\") pod \"image-registry-997f69ccf-rnb69\" (UID: \"74f902bb-0a1c-46ed-bb9a-32e7a254c7b6\") " pod="openshift-image-registry/image-registry-997f69ccf-rnb69"
Apr 17 15:17:49.584863 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:49.584826 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05352b16-4fb2-4f4e-894f-d69b17f92924-cert\") pod \"ingress-canary-dl2zj\" (UID: \"05352b16-4fb2-4f4e-894f-d69b17f92924\") " pod="openshift-ingress-canary/ingress-canary-dl2zj"
Apr 17 15:17:49.585072 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:49.584920 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-registry-tls\") pod \"image-registry-997f69ccf-rnb69\" (UID: \"74f902bb-0a1c-46ed-bb9a-32e7a254c7b6\") " pod="openshift-image-registry/image-registry-997f69ccf-rnb69"
Apr 17 15:17:49.585072 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:49.584954 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4f1d2ee5-9f9b-4086-afee-0e043df76f02-metrics-tls\") pod \"dns-default-2vpk9\" (UID: \"4f1d2ee5-9f9b-4086-afee-0e043df76f02\") " pod="openshift-dns/dns-default-2vpk9"
Apr 17 15:17:49.585072 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:49.584998 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 15:17:49.585394 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:49.585084 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 15:17:49.585394 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:49.585089 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 15:17:49.585394 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:49.585107 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-997f69ccf-rnb69: secret "image-registry-tls" not found
Apr 17 15:17:49.585394 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:49.585094 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05352b16-4fb2-4f4e-894f-d69b17f92924-cert podName:05352b16-4fb2-4f4e-894f-d69b17f92924 nodeName:}" failed. No retries permitted until 2026-04-17 15:17:50.585076844 +0000 UTC m=+35.126741688 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/05352b16-4fb2-4f4e-894f-d69b17f92924-cert") pod "ingress-canary-dl2zj" (UID: "05352b16-4fb2-4f4e-894f-d69b17f92924") : secret "canary-serving-cert" not found
Apr 17 15:17:49.585394 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:49.585155 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f1d2ee5-9f9b-4086-afee-0e043df76f02-metrics-tls podName:4f1d2ee5-9f9b-4086-afee-0e043df76f02 nodeName:}" failed. No retries permitted until 2026-04-17 15:17:50.585139876 +0000 UTC m=+35.126804728 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4f1d2ee5-9f9b-4086-afee-0e043df76f02-metrics-tls") pod "dns-default-2vpk9" (UID: "4f1d2ee5-9f9b-4086-afee-0e043df76f02") : secret "dns-default-metrics-tls" not found
Apr 17 15:17:49.585394 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:49.585175 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-registry-tls podName:74f902bb-0a1c-46ed-bb9a-32e7a254c7b6 nodeName:}" failed. No retries permitted until 2026-04-17 15:17:50.58516615 +0000 UTC m=+35.126831000 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-registry-tls") pod "image-registry-997f69ccf-rnb69" (UID: "74f902bb-0a1c-46ed-bb9a-32e7a254c7b6") : secret "image-registry-tls" not found
Apr 17 15:17:50.061680 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:50.061587 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-blrs8"
Apr 17 15:17:50.061847 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:50.061728 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xz6xx"
Apr 17 15:17:50.061847 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:50.061821 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j7zl6"
Apr 17 15:17:50.064246 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:50.064228 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 15:17:50.065196 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:50.065176 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 15:17:50.065302 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:50.065250 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-sx2cz\""
Apr 17 15:17:50.065302 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:50.065275 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-9m884\""
Apr 17 15:17:50.065421 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:50.065176 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 17 15:17:50.065547 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:50.065531 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 15:17:50.593200 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:50.593166 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4f1d2ee5-9f9b-4086-afee-0e043df76f02-metrics-tls\") pod \"dns-default-2vpk9\" (UID: \"4f1d2ee5-9f9b-4086-afee-0e043df76f02\") " pod="openshift-dns/dns-default-2vpk9"
Apr 17 15:17:50.593627 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:50.593251 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05352b16-4fb2-4f4e-894f-d69b17f92924-cert\") pod \"ingress-canary-dl2zj\" (UID: \"05352b16-4fb2-4f4e-894f-d69b17f92924\") " pod="openshift-ingress-canary/ingress-canary-dl2zj"
Apr 17 15:17:50.593627 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:50.593302 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-registry-tls\") pod \"image-registry-997f69ccf-rnb69\" (UID: \"74f902bb-0a1c-46ed-bb9a-32e7a254c7b6\") " pod="openshift-image-registry/image-registry-997f69ccf-rnb69"
Apr 17 15:17:50.593627 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:50.593327 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 15:17:50.593627 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:50.593406 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f1d2ee5-9f9b-4086-afee-0e043df76f02-metrics-tls podName:4f1d2ee5-9f9b-4086-afee-0e043df76f02 nodeName:}" failed. No retries permitted until 2026-04-17 15:17:52.593389488 +0000 UTC m=+37.135054336 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4f1d2ee5-9f9b-4086-afee-0e043df76f02-metrics-tls") pod "dns-default-2vpk9" (UID: "4f1d2ee5-9f9b-4086-afee-0e043df76f02") : secret "dns-default-metrics-tls" not found
Apr 17 15:17:50.593627 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:50.593411 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 15:17:50.593627 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:50.593411 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 15:17:50.593627 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:50.593464 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-997f69ccf-rnb69: secret "image-registry-tls" not found
Apr 17 15:17:50.593627 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:50.593472 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05352b16-4fb2-4f4e-894f-d69b17f92924-cert podName:05352b16-4fb2-4f4e-894f-d69b17f92924 nodeName:}" failed. No retries permitted until 2026-04-17 15:17:52.59345381 +0000 UTC m=+37.135118656 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/05352b16-4fb2-4f4e-894f-d69b17f92924-cert") pod "ingress-canary-dl2zj" (UID: "05352b16-4fb2-4f4e-894f-d69b17f92924") : secret "canary-serving-cert" not found
Apr 17 15:17:50.593627 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:50.593507 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-registry-tls podName:74f902bb-0a1c-46ed-bb9a-32e7a254c7b6 nodeName:}" failed. No retries permitted until 2026-04-17 15:17:52.593491418 +0000 UTC m=+37.135156265 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-registry-tls") pod "image-registry-997f69ccf-rnb69" (UID: "74f902bb-0a1c-46ed-bb9a-32e7a254c7b6") : secret "image-registry-tls" not found
Apr 17 15:17:52.378126 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:52.378025 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7jdw8" event={"ID":"14d69252-8d9f-46ec-8e05-0b0a8f1b3b07","Type":"ContainerStarted","Data":"28fca5593ca27b1197cd91893f7685f366c81cc2dcfc0537c6c3d3bc17a1633c"}
Apr 17 15:17:52.609356 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:52.609321 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05352b16-4fb2-4f4e-894f-d69b17f92924-cert\") pod \"ingress-canary-dl2zj\" (UID: \"05352b16-4fb2-4f4e-894f-d69b17f92924\") " pod="openshift-ingress-canary/ingress-canary-dl2zj"
Apr 17 15:17:52.609456 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:52.609411 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-registry-tls\") pod \"image-registry-997f69ccf-rnb69\" (UID: \"74f902bb-0a1c-46ed-bb9a-32e7a254c7b6\") " pod="openshift-image-registry/image-registry-997f69ccf-rnb69"
Apr 17 15:17:52.609456 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:52.609440 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 15:17:52.609456 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:52.609452 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4f1d2ee5-9f9b-4086-afee-0e043df76f02-metrics-tls\") pod \"dns-default-2vpk9\" (UID: \"4f1d2ee5-9f9b-4086-afee-0e043df76f02\") " pod="openshift-dns/dns-default-2vpk9"
Apr 17 15:17:52.609569 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:52.609509 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05352b16-4fb2-4f4e-894f-d69b17f92924-cert podName:05352b16-4fb2-4f4e-894f-d69b17f92924 nodeName:}" failed. No retries permitted until 2026-04-17 15:17:56.609490172 +0000 UTC m=+41.151155032 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/05352b16-4fb2-4f4e-894f-d69b17f92924-cert") pod "ingress-canary-dl2zj" (UID: "05352b16-4fb2-4f4e-894f-d69b17f92924") : secret "canary-serving-cert" not found
Apr 17 15:17:52.609569 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:52.609531 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 15:17:52.609569 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:52.609545 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-997f69ccf-rnb69: secret "image-registry-tls" not found
Apr 17 15:17:52.609721 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:52.609589 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-registry-tls podName:74f902bb-0a1c-46ed-bb9a-32e7a254c7b6 nodeName:}" failed. No retries permitted until 2026-04-17 15:17:56.609573274 +0000 UTC m=+41.151238133 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-registry-tls") pod "image-registry-997f69ccf-rnb69" (UID: "74f902bb-0a1c-46ed-bb9a-32e7a254c7b6") : secret "image-registry-tls" not found
Apr 17 15:17:52.609721 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:52.609612 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 15:17:52.609721 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:52.609680 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f1d2ee5-9f9b-4086-afee-0e043df76f02-metrics-tls podName:4f1d2ee5-9f9b-4086-afee-0e043df76f02 nodeName:}" failed. No retries permitted until 2026-04-17 15:17:56.609664901 +0000 UTC m=+41.151329759 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4f1d2ee5-9f9b-4086-afee-0e043df76f02-metrics-tls") pod "dns-default-2vpk9" (UID: "4f1d2ee5-9f9b-4086-afee-0e043df76f02") : secret "dns-default-metrics-tls" not found
Apr 17 15:17:53.382905 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:53.382720 2577 generic.go:358] "Generic (PLEG): container finished" podID="14d69252-8d9f-46ec-8e05-0b0a8f1b3b07" containerID="28fca5593ca27b1197cd91893f7685f366c81cc2dcfc0537c6c3d3bc17a1633c" exitCode=0
Apr 17 15:17:53.383316 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:53.382804 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7jdw8" event={"ID":"14d69252-8d9f-46ec-8e05-0b0a8f1b3b07","Type":"ContainerDied","Data":"28fca5593ca27b1197cd91893f7685f366c81cc2dcfc0537c6c3d3bc17a1633c"}
Apr 17 15:17:54.387831 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:54.387791 2577 generic.go:358] "Generic (PLEG): container finished" podID="14d69252-8d9f-46ec-8e05-0b0a8f1b3b07" containerID="3427a5e2df9a9d7a0fce867f34e828a56b90a22aac1e98047814ef62c82cde35" exitCode=0
Apr 17 15:17:54.388290 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:54.387838 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7jdw8" event={"ID":"14d69252-8d9f-46ec-8e05-0b0a8f1b3b07","Type":"ContainerDied","Data":"3427a5e2df9a9d7a0fce867f34e828a56b90a22aac1e98047814ef62c82cde35"}
Apr 17 15:17:55.393352 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:55.393315 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7jdw8" event={"ID":"14d69252-8d9f-46ec-8e05-0b0a8f1b3b07","Type":"ContainerStarted","Data":"baaa0d1097d004306fd7efd8dc6fb7aa0a902bc2774174625b85c8960c5b5ddd"}
Apr 17 15:17:55.413488 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:55.413433 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-7jdw8" podStartSLOduration=4.599240713 podStartE2EDuration="39.413414236s" podCreationTimestamp="2026-04-17 15:17:16 +0000 UTC" firstStartedPulling="2026-04-17 15:17:17.370661295 +0000 UTC m=+1.912326141" lastFinishedPulling="2026-04-17 15:17:52.184834804 +0000 UTC m=+36.726499664" observedRunningTime="2026-04-17 15:17:55.412723024 +0000 UTC m=+39.954387911" watchObservedRunningTime="2026-04-17 15:17:55.413414236 +0000 UTC m=+39.955079104"
Apr 17 15:17:56.133071 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:56.133028 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7db05466-6c79-496c-9e75-143b8a1a69d1-original-pull-secret\") pod \"global-pull-secret-syncer-blrs8\" (UID: \"7db05466-6c79-496c-9e75-143b8a1a69d1\") " pod="kube-system/global-pull-secret-syncer-blrs8"
Apr 17 15:17:56.136484 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:56.136462 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7db05466-6c79-496c-9e75-143b8a1a69d1-original-pull-secret\") pod \"global-pull-secret-syncer-blrs8\" (UID: \"7db05466-6c79-496c-9e75-143b8a1a69d1\") " pod="kube-system/global-pull-secret-syncer-blrs8"
Apr 17 15:17:56.374527 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:56.374491 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-blrs8"
Apr 17 15:17:56.528261 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:56.528232 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-blrs8"]
Apr 17 15:17:56.531291 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:17:56.531259 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7db05466_6c79_496c_9e75_143b8a1a69d1.slice/crio-eb00b3a050cf5e0089bf01b9d939ea4deb922027b6a67557e1d59abe686e871d WatchSource:0}: Error finding container eb00b3a050cf5e0089bf01b9d939ea4deb922027b6a67557e1d59abe686e871d: Status 404 returned error can't find the container with id eb00b3a050cf5e0089bf01b9d939ea4deb922027b6a67557e1d59abe686e871d
Apr 17 15:17:56.636583 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:56.636559 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05352b16-4fb2-4f4e-894f-d69b17f92924-cert\") pod \"ingress-canary-dl2zj\" (UID: \"05352b16-4fb2-4f4e-894f-d69b17f92924\") " pod="openshift-ingress-canary/ingress-canary-dl2zj"
Apr 17 15:17:56.636697 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:56.636624 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-registry-tls\") pod \"image-registry-997f69ccf-rnb69\" (UID: \"74f902bb-0a1c-46ed-bb9a-32e7a254c7b6\") " pod="openshift-image-registry/image-registry-997f69ccf-rnb69"
Apr 17 15:17:56.636765 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:56.636688 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 15:17:56.636765 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:56.636718 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 15:17:56.636765 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:56.636732 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-997f69ccf-rnb69: secret "image-registry-tls" not found
Apr 17 15:17:56.636765 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:56.636751 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05352b16-4fb2-4f4e-894f-d69b17f92924-cert podName:05352b16-4fb2-4f4e-894f-d69b17f92924 nodeName:}" failed. No retries permitted until 2026-04-17 15:18:04.6367358 +0000 UTC m=+49.178400645 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/05352b16-4fb2-4f4e-894f-d69b17f92924-cert") pod "ingress-canary-dl2zj" (UID: "05352b16-4fb2-4f4e-894f-d69b17f92924") : secret "canary-serving-cert" not found
Apr 17 15:17:56.636933 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:56.636790 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-registry-tls podName:74f902bb-0a1c-46ed-bb9a-32e7a254c7b6 nodeName:}" failed. No retries permitted until 2026-04-17 15:18:04.63677325 +0000 UTC m=+49.178438096 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-registry-tls") pod "image-registry-997f69ccf-rnb69" (UID: "74f902bb-0a1c-46ed-bb9a-32e7a254c7b6") : secret "image-registry-tls" not found
Apr 17 15:17:56.636933 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:56.636785 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4f1d2ee5-9f9b-4086-afee-0e043df76f02-metrics-tls\") pod \"dns-default-2vpk9\" (UID: \"4f1d2ee5-9f9b-4086-afee-0e043df76f02\") " pod="openshift-dns/dns-default-2vpk9"
Apr 17 15:17:56.636933 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:56.636904 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 15:17:56.637079 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:17:56.636947 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f1d2ee5-9f9b-4086-afee-0e043df76f02-metrics-tls podName:4f1d2ee5-9f9b-4086-afee-0e043df76f02 nodeName:}" failed. No retries permitted until 2026-04-17 15:18:04.636936865 +0000 UTC m=+49.178601710 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4f1d2ee5-9f9b-4086-afee-0e043df76f02-metrics-tls") pod "dns-default-2vpk9" (UID: "4f1d2ee5-9f9b-4086-afee-0e043df76f02") : secret "dns-default-metrics-tls" not found
Apr 17 15:17:57.397704 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:17:57.397665 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-blrs8" event={"ID":"7db05466-6c79-496c-9e75-143b8a1a69d1","Type":"ContainerStarted","Data":"eb00b3a050cf5e0089bf01b9d939ea4deb922027b6a67557e1d59abe686e871d"}
Apr 17 15:18:01.407107 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:18:01.407075 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-blrs8" event={"ID":"7db05466-6c79-496c-9e75-143b8a1a69d1","Type":"ContainerStarted","Data":"29b3ac9687e34e4e16f3b5bb0c22d86d8a25e4396ff3020f9f24e4793bcd3b75"}
Apr 17 15:18:01.421573 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:18:01.421531 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-blrs8" podStartSLOduration=33.254867888 podStartE2EDuration="37.421517285s" podCreationTimestamp="2026-04-17 15:17:24 +0000 UTC" firstStartedPulling="2026-04-17 15:17:56.532965169 +0000 UTC m=+41.074630022" lastFinishedPulling="2026-04-17 15:18:00.699614574 +0000 UTC m=+45.241279419" observedRunningTime="2026-04-17 15:18:01.420935469 +0000 UTC m=+45.962600332" watchObservedRunningTime="2026-04-17 15:18:01.421517285 +0000 UTC m=+45.963182152"
Apr 17 15:18:04.687457 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:18:04.687417 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05352b16-4fb2-4f4e-894f-d69b17f92924-cert\") pod \"ingress-canary-dl2zj\" (UID: \"05352b16-4fb2-4f4e-894f-d69b17f92924\") " pod="openshift-ingress-canary/ingress-canary-dl2zj"
Apr 17 15:18:04.687847 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:18:04.687481 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-registry-tls\") pod \"image-registry-997f69ccf-rnb69\" (UID: \"74f902bb-0a1c-46ed-bb9a-32e7a254c7b6\") " pod="openshift-image-registry/image-registry-997f69ccf-rnb69"
Apr 17 15:18:04.687847 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:18:04.687507 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4f1d2ee5-9f9b-4086-afee-0e043df76f02-metrics-tls\") pod \"dns-default-2vpk9\" (UID: \"4f1d2ee5-9f9b-4086-afee-0e043df76f02\") " pod="openshift-dns/dns-default-2vpk9"
Apr 17 15:18:04.687847 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:18:04.687572 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 15:18:04.687847 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:18:04.687592 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 15:18:04.687847 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:18:04.687638 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f1d2ee5-9f9b-4086-afee-0e043df76f02-metrics-tls podName:4f1d2ee5-9f9b-4086-afee-0e043df76f02 nodeName:}" failed. No retries permitted until 2026-04-17 15:18:20.687624897 +0000 UTC m=+65.229289742 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4f1d2ee5-9f9b-4086-afee-0e043df76f02-metrics-tls") pod "dns-default-2vpk9" (UID: "4f1d2ee5-9f9b-4086-afee-0e043df76f02") : secret "dns-default-metrics-tls" not found
Apr 17 15:18:04.687847 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:18:04.687653 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05352b16-4fb2-4f4e-894f-d69b17f92924-cert podName:05352b16-4fb2-4f4e-894f-d69b17f92924 nodeName:}" failed. No retries permitted until 2026-04-17 15:18:20.68764633 +0000 UTC m=+65.229311176 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/05352b16-4fb2-4f4e-894f-d69b17f92924-cert") pod "ingress-canary-dl2zj" (UID: "05352b16-4fb2-4f4e-894f-d69b17f92924") : secret "canary-serving-cert" not found
Apr 17 15:18:04.687847 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:18:04.687654 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 15:18:04.687847 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:18:04.687670 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-997f69ccf-rnb69: secret "image-registry-tls" not found
Apr 17 15:18:04.687847 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:18:04.687712 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-registry-tls podName:74f902bb-0a1c-46ed-bb9a-32e7a254c7b6 nodeName:}" failed. No retries permitted until 2026-04-17 15:18:20.687699861 +0000 UTC m=+65.229364707 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-registry-tls") pod "image-registry-997f69ccf-rnb69" (UID: "74f902bb-0a1c-46ed-bb9a-32e7a254c7b6") : secret "image-registry-tls" not found Apr 17 15:18:15.173921 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:18:15.173885 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-676685669b-lpqft"] Apr 17 15:18:15.178381 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:18:15.178357 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-676685669b-lpqft" Apr 17 15:18:15.181572 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:18:15.181546 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 17 15:18:15.181572 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:18:15.181561 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 17 15:18:15.181723 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:18:15.181546 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 17 15:18:15.181723 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:18:15.181561 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 17 15:18:15.186844 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:18:15.186819 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-676685669b-lpqft"] Apr 17 15:18:15.259660 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:18:15.259634 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3cb75588-7310-422d-9696-a7cd0cf3e446-tmp\") pod \"klusterlet-addon-workmgr-676685669b-lpqft\" (UID: \"3cb75588-7310-422d-9696-a7cd0cf3e446\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-676685669b-lpqft" Apr 17 15:18:15.259784 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:18:15.259669 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/3cb75588-7310-422d-9696-a7cd0cf3e446-klusterlet-config\") pod \"klusterlet-addon-workmgr-676685669b-lpqft\" (UID: \"3cb75588-7310-422d-9696-a7cd0cf3e446\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-676685669b-lpqft" Apr 17 15:18:15.259784 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:18:15.259711 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnxzh\" (UniqueName: \"kubernetes.io/projected/3cb75588-7310-422d-9696-a7cd0cf3e446-kube-api-access-wnxzh\") pod \"klusterlet-addon-workmgr-676685669b-lpqft\" (UID: \"3cb75588-7310-422d-9696-a7cd0cf3e446\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-676685669b-lpqft" Apr 17 15:18:15.360564 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:18:15.360532 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3cb75588-7310-422d-9696-a7cd0cf3e446-tmp\") pod \"klusterlet-addon-workmgr-676685669b-lpqft\" (UID: \"3cb75588-7310-422d-9696-a7cd0cf3e446\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-676685669b-lpqft" Apr 17 15:18:15.360564 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:18:15.360567 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: 
\"kubernetes.io/secret/3cb75588-7310-422d-9696-a7cd0cf3e446-klusterlet-config\") pod \"klusterlet-addon-workmgr-676685669b-lpqft\" (UID: \"3cb75588-7310-422d-9696-a7cd0cf3e446\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-676685669b-lpqft" Apr 17 15:18:15.360756 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:18:15.360609 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wnxzh\" (UniqueName: \"kubernetes.io/projected/3cb75588-7310-422d-9696-a7cd0cf3e446-kube-api-access-wnxzh\") pod \"klusterlet-addon-workmgr-676685669b-lpqft\" (UID: \"3cb75588-7310-422d-9696-a7cd0cf3e446\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-676685669b-lpqft" Apr 17 15:18:15.360950 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:18:15.360919 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3cb75588-7310-422d-9696-a7cd0cf3e446-tmp\") pod \"klusterlet-addon-workmgr-676685669b-lpqft\" (UID: \"3cb75588-7310-422d-9696-a7cd0cf3e446\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-676685669b-lpqft" Apr 17 15:18:15.363342 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:18:15.363310 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/3cb75588-7310-422d-9696-a7cd0cf3e446-klusterlet-config\") pod \"klusterlet-addon-workmgr-676685669b-lpqft\" (UID: \"3cb75588-7310-422d-9696-a7cd0cf3e446\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-676685669b-lpqft" Apr 17 15:18:15.368477 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:18:15.368452 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnxzh\" (UniqueName: \"kubernetes.io/projected/3cb75588-7310-422d-9696-a7cd0cf3e446-kube-api-access-wnxzh\") pod \"klusterlet-addon-workmgr-676685669b-lpqft\" (UID: \"3cb75588-7310-422d-9696-a7cd0cf3e446\") 
" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-676685669b-lpqft" Apr 17 15:18:15.371183 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:18:15.371161 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6m42k" Apr 17 15:18:15.488904 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:18:15.488848 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-676685669b-lpqft" Apr 17 15:18:15.600867 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:18:15.600836 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-676685669b-lpqft"] Apr 17 15:18:15.604392 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:18:15.604362 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3cb75588_7310_422d_9696_a7cd0cf3e446.slice/crio-099d4318bc0a59d2dd8f35668e8ffc7b03e0bcbe937eec28685ec62f0e81dbd3 WatchSource:0}: Error finding container 099d4318bc0a59d2dd8f35668e8ffc7b03e0bcbe937eec28685ec62f0e81dbd3: Status 404 returned error can't find the container with id 099d4318bc0a59d2dd8f35668e8ffc7b03e0bcbe937eec28685ec62f0e81dbd3 Apr 17 15:18:16.438495 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:18:16.438459 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-676685669b-lpqft" event={"ID":"3cb75588-7310-422d-9696-a7cd0cf3e446","Type":"ContainerStarted","Data":"099d4318bc0a59d2dd8f35668e8ffc7b03e0bcbe937eec28685ec62f0e81dbd3"} Apr 17 15:18:20.698181 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:18:20.698143 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05352b16-4fb2-4f4e-894f-d69b17f92924-cert\") pod \"ingress-canary-dl2zj\" (UID: 
\"05352b16-4fb2-4f4e-894f-d69b17f92924\") " pod="openshift-ingress-canary/ingress-canary-dl2zj" Apr 17 15:18:20.698696 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:18:20.698196 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4445020e-d73c-4a2d-9f40-1c3fc286490e-metrics-certs\") pod \"network-metrics-daemon-j7zl6\" (UID: \"4445020e-d73c-4a2d-9f40-1c3fc286490e\") " pod="openshift-multus/network-metrics-daemon-j7zl6" Apr 17 15:18:20.698696 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:18:20.698313 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 15:18:20.698696 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:18:20.698341 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-registry-tls\") pod \"image-registry-997f69ccf-rnb69\" (UID: \"74f902bb-0a1c-46ed-bb9a-32e7a254c7b6\") " pod="openshift-image-registry/image-registry-997f69ccf-rnb69" Apr 17 15:18:20.698696 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:18:20.698398 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05352b16-4fb2-4f4e-894f-d69b17f92924-cert podName:05352b16-4fb2-4f4e-894f-d69b17f92924 nodeName:}" failed. No retries permitted until 2026-04-17 15:18:52.698375981 +0000 UTC m=+97.240040833 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/05352b16-4fb2-4f4e-894f-d69b17f92924-cert") pod "ingress-canary-dl2zj" (UID: "05352b16-4fb2-4f4e-894f-d69b17f92924") : secret "canary-serving-cert" not found Apr 17 15:18:20.698696 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:18:20.698423 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4f1d2ee5-9f9b-4086-afee-0e043df76f02-metrics-tls\") pod \"dns-default-2vpk9\" (UID: \"4f1d2ee5-9f9b-4086-afee-0e043df76f02\") " pod="openshift-dns/dns-default-2vpk9" Apr 17 15:18:20.698696 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:18:20.698471 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 15:18:20.698696 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:18:20.698489 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-997f69ccf-rnb69: secret "image-registry-tls" not found Apr 17 15:18:20.698696 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:18:20.698523 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 15:18:20.698696 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:18:20.698547 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-registry-tls podName:74f902bb-0a1c-46ed-bb9a-32e7a254c7b6 nodeName:}" failed. No retries permitted until 2026-04-17 15:18:52.698530661 +0000 UTC m=+97.240195506 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-registry-tls") pod "image-registry-997f69ccf-rnb69" (UID: "74f902bb-0a1c-46ed-bb9a-32e7a254c7b6") : secret "image-registry-tls" not found Apr 17 15:18:20.698696 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:18:20.698564 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f1d2ee5-9f9b-4086-afee-0e043df76f02-metrics-tls podName:4f1d2ee5-9f9b-4086-afee-0e043df76f02 nodeName:}" failed. No retries permitted until 2026-04-17 15:18:52.698555288 +0000 UTC m=+97.240220136 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4f1d2ee5-9f9b-4086-afee-0e043df76f02-metrics-tls") pod "dns-default-2vpk9" (UID: "4f1d2ee5-9f9b-4086-afee-0e043df76f02") : secret "dns-default-metrics-tls" not found Apr 17 15:18:20.703549 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:18:20.703528 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 15:18:20.708664 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:18:20.708645 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 15:18:20.708784 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:18:20.708707 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4445020e-d73c-4a2d-9f40-1c3fc286490e-metrics-certs podName:4445020e-d73c-4a2d-9f40-1c3fc286490e nodeName:}" failed. No retries permitted until 2026-04-17 15:19:24.70869217 +0000 UTC m=+129.250357015 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4445020e-d73c-4a2d-9f40-1c3fc286490e-metrics-certs") pod "network-metrics-daemon-j7zl6" (UID: "4445020e-d73c-4a2d-9f40-1c3fc286490e") : secret "metrics-daemon-secret" not found Apr 17 15:18:20.798947 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:18:20.798915 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zhg8b\" (UniqueName: \"kubernetes.io/projected/2f57efa0-9b15-4e70-9d38-74a517201d53-kube-api-access-zhg8b\") pod \"network-check-target-xz6xx\" (UID: \"2f57efa0-9b15-4e70-9d38-74a517201d53\") " pod="openshift-network-diagnostics/network-check-target-xz6xx" Apr 17 15:18:20.801211 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:18:20.801191 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 15:18:20.812087 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:18:20.812068 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 15:18:20.823877 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:18:20.823852 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhg8b\" (UniqueName: \"kubernetes.io/projected/2f57efa0-9b15-4e70-9d38-74a517201d53-kube-api-access-zhg8b\") pod \"network-check-target-xz6xx\" (UID: \"2f57efa0-9b15-4e70-9d38-74a517201d53\") " pod="openshift-network-diagnostics/network-check-target-xz6xx" Apr 17 15:18:20.982970 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:18:20.982889 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-sx2cz\"" Apr 17 15:18:20.990901 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:18:20.990882 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xz6xx" Apr 17 15:18:21.582218 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:18:21.582195 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-xz6xx"] Apr 17 15:18:21.590893 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:18:21.590869 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f57efa0_9b15_4e70_9d38_74a517201d53.slice/crio-fdb1daba6bd2a506917e401af35925e40e6f977e241595f6958a84396070f2c4 WatchSource:0}: Error finding container fdb1daba6bd2a506917e401af35925e40e6f977e241595f6958a84396070f2c4: Status 404 returned error can't find the container with id fdb1daba6bd2a506917e401af35925e40e6f977e241595f6958a84396070f2c4 Apr 17 15:18:22.452171 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:18:22.452131 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xz6xx" event={"ID":"2f57efa0-9b15-4e70-9d38-74a517201d53","Type":"ContainerStarted","Data":"fdb1daba6bd2a506917e401af35925e40e6f977e241595f6958a84396070f2c4"} Apr 17 15:18:22.453520 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:18:22.453487 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-676685669b-lpqft" event={"ID":"3cb75588-7310-422d-9696-a7cd0cf3e446","Type":"ContainerStarted","Data":"2d79b3dc8ae0f2c9dfa2a9b4321f0e4e1c52bc8f55ac69e8428ed265a944133d"} Apr 17 15:18:22.453736 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:18:22.453718 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-676685669b-lpqft" Apr 17 15:18:22.455611 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:18:22.455589 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-676685669b-lpqft" Apr 17 15:18:22.469013 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:18:22.468967 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-676685669b-lpqft" podStartSLOduration=1.5371719050000001 podStartE2EDuration="7.468951901s" podCreationTimestamp="2026-04-17 15:18:15 +0000 UTC" firstStartedPulling="2026-04-17 15:18:15.606163319 +0000 UTC m=+60.147828168" lastFinishedPulling="2026-04-17 15:18:21.537943317 +0000 UTC m=+66.079608164" observedRunningTime="2026-04-17 15:18:22.467022923 +0000 UTC m=+67.008687789" watchObservedRunningTime="2026-04-17 15:18:22.468951901 +0000 UTC m=+67.010616767" Apr 17 15:18:24.459526 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:18:24.459455 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xz6xx" event={"ID":"2f57efa0-9b15-4e70-9d38-74a517201d53","Type":"ContainerStarted","Data":"be70827a414024cb1fda7c959037f6fd2ff3e2e37534baf24b1f095905c3289b"} Apr 17 15:18:24.459810 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:18:24.459624 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-xz6xx" Apr 17 15:18:24.473789 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:18:24.473739 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-xz6xx" podStartSLOduration=65.886606819 podStartE2EDuration="1m8.473727043s" podCreationTimestamp="2026-04-17 15:17:16 +0000 UTC" firstStartedPulling="2026-04-17 15:18:21.592753717 +0000 UTC m=+66.134418562" lastFinishedPulling="2026-04-17 15:18:24.179873931 +0000 UTC m=+68.721538786" observedRunningTime="2026-04-17 15:18:24.472383389 +0000 UTC m=+69.014048294" watchObservedRunningTime="2026-04-17 15:18:24.473727043 +0000 UTC m=+69.015391910" Apr 
17 15:18:52.715199 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:18:52.715166 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05352b16-4fb2-4f4e-894f-d69b17f92924-cert\") pod \"ingress-canary-dl2zj\" (UID: \"05352b16-4fb2-4f4e-894f-d69b17f92924\") " pod="openshift-ingress-canary/ingress-canary-dl2zj" Apr 17 15:18:52.715623 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:18:52.715219 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-registry-tls\") pod \"image-registry-997f69ccf-rnb69\" (UID: \"74f902bb-0a1c-46ed-bb9a-32e7a254c7b6\") " pod="openshift-image-registry/image-registry-997f69ccf-rnb69" Apr 17 15:18:52.715623 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:18:52.715252 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4f1d2ee5-9f9b-4086-afee-0e043df76f02-metrics-tls\") pod \"dns-default-2vpk9\" (UID: \"4f1d2ee5-9f9b-4086-afee-0e043df76f02\") " pod="openshift-dns/dns-default-2vpk9" Apr 17 15:18:52.715623 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:18:52.715342 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 15:18:52.715623 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:18:52.715360 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 15:18:52.715623 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:18:52.715389 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 15:18:52.715623 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:18:52.715409 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod 
openshift-image-registry/image-registry-997f69ccf-rnb69: secret "image-registry-tls" not found Apr 17 15:18:52.715623 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:18:52.715425 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f1d2ee5-9f9b-4086-afee-0e043df76f02-metrics-tls podName:4f1d2ee5-9f9b-4086-afee-0e043df76f02 nodeName:}" failed. No retries permitted until 2026-04-17 15:19:56.715408862 +0000 UTC m=+161.257073709 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4f1d2ee5-9f9b-4086-afee-0e043df76f02-metrics-tls") pod "dns-default-2vpk9" (UID: "4f1d2ee5-9f9b-4086-afee-0e043df76f02") : secret "dns-default-metrics-tls" not found Apr 17 15:18:52.715623 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:18:52.715441 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05352b16-4fb2-4f4e-894f-d69b17f92924-cert podName:05352b16-4fb2-4f4e-894f-d69b17f92924 nodeName:}" failed. No retries permitted until 2026-04-17 15:19:56.715435595 +0000 UTC m=+161.257100440 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/05352b16-4fb2-4f4e-894f-d69b17f92924-cert") pod "ingress-canary-dl2zj" (UID: "05352b16-4fb2-4f4e-894f-d69b17f92924") : secret "canary-serving-cert" not found Apr 17 15:18:52.715623 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:18:52.715452 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-registry-tls podName:74f902bb-0a1c-46ed-bb9a-32e7a254c7b6 nodeName:}" failed. No retries permitted until 2026-04-17 15:19:56.715446386 +0000 UTC m=+161.257111230 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-registry-tls") pod "image-registry-997f69ccf-rnb69" (UID: "74f902bb-0a1c-46ed-bb9a-32e7a254c7b6") : secret "image-registry-tls" not found Apr 17 15:18:55.465091 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:18:55.465060 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xz6xx" Apr 17 15:19:14.368020 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:14.367989 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-8hx9p_14c11c01-24f2-4908-a3cb-5c90f9ec8d35/dns-node-resolver/0.log" Apr 17 15:19:15.567400 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:15.567373 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-cx2v6_879a1087-ad81-4931-a7fc-1d30c4f2539d/node-ca/0.log" Apr 17 15:19:17.164947 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:17.164902 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-brnqs"] Apr 17 15:19:17.167814 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:17.167786 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-brnqs" Apr 17 15:19:17.168881 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:17.168850 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-fdxgz"] Apr 17 15:19:17.170019 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:17.169998 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 17 15:19:17.170187 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:17.170164 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 17 15:19:17.170527 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:17.170487 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 17 15:19:17.170646 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:17.170553 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 17 15:19:17.170646 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:17.170620 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-vkdjl\"" Apr 17 15:19:17.171918 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:17.171897 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-v6xtb"] Apr 17 15:19:17.172095 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:17.172076 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-fdxgz"
Apr 17 15:19:17.174145 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:17.174121 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-6lwwd\""
Apr 17 15:19:17.174273 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:17.174147 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 17 15:19:17.174643 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:17.174609 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 17 15:19:17.174788 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:17.174769 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-v6xtb"
Apr 17 15:19:17.174847 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:17.174785 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 17 15:19:17.175224 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:17.175207 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 17 15:19:17.176946 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:17.176925 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 17 15:19:17.178337 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:17.178309 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 17 15:19:17.178446 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:17.178388 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-zpsxd\""
Apr 17 15:19:17.178641 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:17.178611 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 17 15:19:17.179598 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:17.179574 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-brnqs"]
Apr 17 15:19:17.181887 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:17.181862 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 17 15:19:17.182390 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:17.182368 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-v6xtb"]
Apr 17 15:19:17.191308 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:17.191279 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-fdxgz"]
Apr 17 15:19:17.270957 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:17.270916 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85072088-8af2-4219-80f7-6a18460c13cf-config\") pod \"console-operator-9d4b6777b-fdxgz\" (UID: \"85072088-8af2-4219-80f7-6a18460c13cf\") " pod="openshift-console-operator/console-operator-9d4b6777b-fdxgz"
Apr 17 15:19:17.270957 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:17.270944 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b6f9506-66fe-4d0b-b26e-7d16fbe56762-serving-cert\") pod \"service-ca-operator-d6fc45fc5-brnqs\" (UID: \"1b6f9506-66fe-4d0b-b26e-7d16fbe56762\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-brnqs"
Apr 17 15:19:17.271131 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:17.270971 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkv4j\" (UniqueName: \"kubernetes.io/projected/7d8cf2c1-33fa-4d78-a7fa-ea6e23d19c25-kube-api-access-dkv4j\") pod \"cluster-samples-operator-6dc5bdb6b4-v6xtb\" (UID: \"7d8cf2c1-33fa-4d78-a7fa-ea6e23d19c25\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-v6xtb"
Apr 17 15:19:17.271131 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:17.270987 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b6f9506-66fe-4d0b-b26e-7d16fbe56762-config\") pod \"service-ca-operator-d6fc45fc5-brnqs\" (UID: \"1b6f9506-66fe-4d0b-b26e-7d16fbe56762\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-brnqs"
Apr 17 15:19:17.271131 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:17.271001 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7g99q\" (UniqueName: \"kubernetes.io/projected/1b6f9506-66fe-4d0b-b26e-7d16fbe56762-kube-api-access-7g99q\") pod \"service-ca-operator-d6fc45fc5-brnqs\" (UID: \"1b6f9506-66fe-4d0b-b26e-7d16fbe56762\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-brnqs"
Apr 17 15:19:17.271131 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:17.271024 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxbtn\" (UniqueName: \"kubernetes.io/projected/85072088-8af2-4219-80f7-6a18460c13cf-kube-api-access-gxbtn\") pod \"console-operator-9d4b6777b-fdxgz\" (UID: \"85072088-8af2-4219-80f7-6a18460c13cf\") " pod="openshift-console-operator/console-operator-9d4b6777b-fdxgz"
Apr 17 15:19:17.271131 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:17.271073 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85072088-8af2-4219-80f7-6a18460c13cf-trusted-ca\") pod \"console-operator-9d4b6777b-fdxgz\" (UID: \"85072088-8af2-4219-80f7-6a18460c13cf\") " pod="openshift-console-operator/console-operator-9d4b6777b-fdxgz"
Apr 17 15:19:17.271131 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:17.271117 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85072088-8af2-4219-80f7-6a18460c13cf-serving-cert\") pod \"console-operator-9d4b6777b-fdxgz\" (UID: \"85072088-8af2-4219-80f7-6a18460c13cf\") " pod="openshift-console-operator/console-operator-9d4b6777b-fdxgz"
Apr 17 15:19:17.271339 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:17.271164 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7d8cf2c1-33fa-4d78-a7fa-ea6e23d19c25-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-v6xtb\" (UID: \"7d8cf2c1-33fa-4d78-a7fa-ea6e23d19c25\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-v6xtb"
Apr 17 15:19:17.371920 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:17.371891 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7d8cf2c1-33fa-4d78-a7fa-ea6e23d19c25-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-v6xtb\" (UID: \"7d8cf2c1-33fa-4d78-a7fa-ea6e23d19c25\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-v6xtb"
Apr 17 15:19:17.372013 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:17.371954 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85072088-8af2-4219-80f7-6a18460c13cf-config\") pod \"console-operator-9d4b6777b-fdxgz\" (UID: \"85072088-8af2-4219-80f7-6a18460c13cf\") " pod="openshift-console-operator/console-operator-9d4b6777b-fdxgz"
Apr 17 15:19:17.372013 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:17.371982 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b6f9506-66fe-4d0b-b26e-7d16fbe56762-serving-cert\") pod \"service-ca-operator-d6fc45fc5-brnqs\" (UID: \"1b6f9506-66fe-4d0b-b26e-7d16fbe56762\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-brnqs"
Apr 17 15:19:17.372013 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:17.372011 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dkv4j\" (UniqueName: \"kubernetes.io/projected/7d8cf2c1-33fa-4d78-a7fa-ea6e23d19c25-kube-api-access-dkv4j\") pod \"cluster-samples-operator-6dc5bdb6b4-v6xtb\" (UID: \"7d8cf2c1-33fa-4d78-a7fa-ea6e23d19c25\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-v6xtb"
Apr 17 15:19:17.372176 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:17.372030 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b6f9506-66fe-4d0b-b26e-7d16fbe56762-config\") pod \"service-ca-operator-d6fc45fc5-brnqs\" (UID: \"1b6f9506-66fe-4d0b-b26e-7d16fbe56762\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-brnqs"
Apr 17 15:19:17.372176 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:19:17.372059 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 17 15:19:17.372176 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:17.372082 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7g99q\" (UniqueName: \"kubernetes.io/projected/1b6f9506-66fe-4d0b-b26e-7d16fbe56762-kube-api-access-7g99q\") pod \"service-ca-operator-d6fc45fc5-brnqs\" (UID: \"1b6f9506-66fe-4d0b-b26e-7d16fbe56762\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-brnqs"
Apr 17 15:19:17.372176 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:17.372117 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gxbtn\" (UniqueName: \"kubernetes.io/projected/85072088-8af2-4219-80f7-6a18460c13cf-kube-api-access-gxbtn\") pod \"console-operator-9d4b6777b-fdxgz\" (UID: \"85072088-8af2-4219-80f7-6a18460c13cf\") " pod="openshift-console-operator/console-operator-9d4b6777b-fdxgz"
Apr 17 15:19:17.372176 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:17.372145 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85072088-8af2-4219-80f7-6a18460c13cf-trusted-ca\") pod \"console-operator-9d4b6777b-fdxgz\" (UID: \"85072088-8af2-4219-80f7-6a18460c13cf\") " pod="openshift-console-operator/console-operator-9d4b6777b-fdxgz"
Apr 17 15:19:17.372176 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:19:17.372156 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d8cf2c1-33fa-4d78-a7fa-ea6e23d19c25-samples-operator-tls podName:7d8cf2c1-33fa-4d78-a7fa-ea6e23d19c25 nodeName:}" failed. No retries permitted until 2026-04-17 15:19:17.872140323 +0000 UTC m=+122.413805167 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/7d8cf2c1-33fa-4d78-a7fa-ea6e23d19c25-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-v6xtb" (UID: "7d8cf2c1-33fa-4d78-a7fa-ea6e23d19c25") : secret "samples-operator-tls" not found
Apr 17 15:19:17.372463 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:17.372197 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85072088-8af2-4219-80f7-6a18460c13cf-serving-cert\") pod \"console-operator-9d4b6777b-fdxgz\" (UID: \"85072088-8af2-4219-80f7-6a18460c13cf\") " pod="openshift-console-operator/console-operator-9d4b6777b-fdxgz"
Apr 17 15:19:17.372834 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:17.372806 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b6f9506-66fe-4d0b-b26e-7d16fbe56762-config\") pod \"service-ca-operator-d6fc45fc5-brnqs\" (UID: \"1b6f9506-66fe-4d0b-b26e-7d16fbe56762\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-brnqs"
Apr 17 15:19:17.372927 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:17.372899 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85072088-8af2-4219-80f7-6a18460c13cf-config\") pod \"console-operator-9d4b6777b-fdxgz\" (UID: \"85072088-8af2-4219-80f7-6a18460c13cf\") " pod="openshift-console-operator/console-operator-9d4b6777b-fdxgz"
Apr 17 15:19:17.373013 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:17.372990 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85072088-8af2-4219-80f7-6a18460c13cf-trusted-ca\") pod \"console-operator-9d4b6777b-fdxgz\" (UID: \"85072088-8af2-4219-80f7-6a18460c13cf\") " pod="openshift-console-operator/console-operator-9d4b6777b-fdxgz"
Apr 17 15:19:17.374720 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:17.374691 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b6f9506-66fe-4d0b-b26e-7d16fbe56762-serving-cert\") pod \"service-ca-operator-d6fc45fc5-brnqs\" (UID: \"1b6f9506-66fe-4d0b-b26e-7d16fbe56762\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-brnqs"
Apr 17 15:19:17.374936 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:17.374917 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85072088-8af2-4219-80f7-6a18460c13cf-serving-cert\") pod \"console-operator-9d4b6777b-fdxgz\" (UID: \"85072088-8af2-4219-80f7-6a18460c13cf\") " pod="openshift-console-operator/console-operator-9d4b6777b-fdxgz"
Apr 17 15:19:17.382449 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:17.382425 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7g99q\" (UniqueName: \"kubernetes.io/projected/1b6f9506-66fe-4d0b-b26e-7d16fbe56762-kube-api-access-7g99q\") pod \"service-ca-operator-d6fc45fc5-brnqs\" (UID: \"1b6f9506-66fe-4d0b-b26e-7d16fbe56762\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-brnqs"
Apr 17 15:19:17.382739 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:17.382718 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxbtn\" (UniqueName: \"kubernetes.io/projected/85072088-8af2-4219-80f7-6a18460c13cf-kube-api-access-gxbtn\") pod \"console-operator-9d4b6777b-fdxgz\" (UID: \"85072088-8af2-4219-80f7-6a18460c13cf\") " pod="openshift-console-operator/console-operator-9d4b6777b-fdxgz"
Apr 17 15:19:17.382935 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:17.382909 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkv4j\" (UniqueName: \"kubernetes.io/projected/7d8cf2c1-33fa-4d78-a7fa-ea6e23d19c25-kube-api-access-dkv4j\") pod \"cluster-samples-operator-6dc5bdb6b4-v6xtb\" (UID: \"7d8cf2c1-33fa-4d78-a7fa-ea6e23d19c25\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-v6xtb"
Apr 17 15:19:17.482471 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:17.482410 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-brnqs"
Apr 17 15:19:17.490430 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:17.490392 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-fdxgz"
Apr 17 15:19:17.611090 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:17.610920 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-brnqs"]
Apr 17 15:19:17.613448 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:19:17.613421 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b6f9506_66fe_4d0b_b26e_7d16fbe56762.slice/crio-859642c48f9e90d15d464febaf86351d8786f706cde8173823693bd903951b68 WatchSource:0}: Error finding container 859642c48f9e90d15d464febaf86351d8786f706cde8173823693bd903951b68: Status 404 returned error can't find the container with id 859642c48f9e90d15d464febaf86351d8786f706cde8173823693bd903951b68
Apr 17 15:19:17.630507 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:17.630485 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-fdxgz"]
Apr 17 15:19:17.633138 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:19:17.633121 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85072088_8af2_4219_80f7_6a18460c13cf.slice/crio-615e8cf6554bb6f534cf630a9a9b3092ac1aef351d44fbd48605e33ccfc58a56 WatchSource:0}: Error finding container 615e8cf6554bb6f534cf630a9a9b3092ac1aef351d44fbd48605e33ccfc58a56: Status 404 returned error can't find the container with id 615e8cf6554bb6f534cf630a9a9b3092ac1aef351d44fbd48605e33ccfc58a56
Apr 17 15:19:17.877460 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:17.877427 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7d8cf2c1-33fa-4d78-a7fa-ea6e23d19c25-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-v6xtb\" (UID: \"7d8cf2c1-33fa-4d78-a7fa-ea6e23d19c25\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-v6xtb"
Apr 17 15:19:17.877611 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:19:17.877590 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 17 15:19:17.877667 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:19:17.877656 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d8cf2c1-33fa-4d78-a7fa-ea6e23d19c25-samples-operator-tls podName:7d8cf2c1-33fa-4d78-a7fa-ea6e23d19c25 nodeName:}" failed. No retries permitted until 2026-04-17 15:19:18.877639968 +0000 UTC m=+123.419304818 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/7d8cf2c1-33fa-4d78-a7fa-ea6e23d19c25-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-v6xtb" (UID: "7d8cf2c1-33fa-4d78-a7fa-ea6e23d19c25") : secret "samples-operator-tls" not found
Apr 17 15:19:18.562382 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:18.562348 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-brnqs" event={"ID":"1b6f9506-66fe-4d0b-b26e-7d16fbe56762","Type":"ContainerStarted","Data":"859642c48f9e90d15d464febaf86351d8786f706cde8173823693bd903951b68"}
Apr 17 15:19:18.563577 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:18.563545 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-fdxgz" event={"ID":"85072088-8af2-4219-80f7-6a18460c13cf","Type":"ContainerStarted","Data":"615e8cf6554bb6f534cf630a9a9b3092ac1aef351d44fbd48605e33ccfc58a56"}
Apr 17 15:19:18.885026 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:18.884988 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7d8cf2c1-33fa-4d78-a7fa-ea6e23d19c25-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-v6xtb\" (UID: \"7d8cf2c1-33fa-4d78-a7fa-ea6e23d19c25\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-v6xtb"
Apr 17 15:19:18.885234 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:19:18.885213 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 17 15:19:18.885302 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:19:18.885290 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d8cf2c1-33fa-4d78-a7fa-ea6e23d19c25-samples-operator-tls podName:7d8cf2c1-33fa-4d78-a7fa-ea6e23d19c25 nodeName:}" failed. No retries permitted until 2026-04-17 15:19:20.885272053 +0000 UTC m=+125.426936902 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/7d8cf2c1-33fa-4d78-a7fa-ea6e23d19c25-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-v6xtb" (UID: "7d8cf2c1-33fa-4d78-a7fa-ea6e23d19c25") : secret "samples-operator-tls" not found
Apr 17 15:19:20.569853 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:20.569792 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-brnqs" event={"ID":"1b6f9506-66fe-4d0b-b26e-7d16fbe56762","Type":"ContainerStarted","Data":"f31cc636c03b26601a2ce3bfa005bf51b579fe7339406896517428eaae2b3e9c"}
Apr 17 15:19:20.571247 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:20.571228 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fdxgz_85072088-8af2-4219-80f7-6a18460c13cf/console-operator/0.log"
Apr 17 15:19:20.571345 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:20.571264 2577 generic.go:358] "Generic (PLEG): container finished" podID="85072088-8af2-4219-80f7-6a18460c13cf" containerID="cff006c66127f50baddaf4a1ce41eb17dec0440c82622e81e70c2f2135c452e3" exitCode=255
Apr 17 15:19:20.571345 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:20.571318 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-fdxgz" event={"ID":"85072088-8af2-4219-80f7-6a18460c13cf","Type":"ContainerDied","Data":"cff006c66127f50baddaf4a1ce41eb17dec0440c82622e81e70c2f2135c452e3"}
Apr 17 15:19:20.571502 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:20.571489 2577 scope.go:117] "RemoveContainer" containerID="cff006c66127f50baddaf4a1ce41eb17dec0440c82622e81e70c2f2135c452e3"
Apr 17 15:19:20.582819 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:20.582739 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-brnqs" podStartSLOduration=0.923600995 podStartE2EDuration="3.582726844s" podCreationTimestamp="2026-04-17 15:19:17 +0000 UTC" firstStartedPulling="2026-04-17 15:19:17.615400428 +0000 UTC m=+122.157065272" lastFinishedPulling="2026-04-17 15:19:20.274526275 +0000 UTC m=+124.816191121" observedRunningTime="2026-04-17 15:19:20.582443629 +0000 UTC m=+125.124108498" watchObservedRunningTime="2026-04-17 15:19:20.582726844 +0000 UTC m=+125.124391712"
Apr 17 15:19:20.900045 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:20.900018 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7d8cf2c1-33fa-4d78-a7fa-ea6e23d19c25-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-v6xtb\" (UID: \"7d8cf2c1-33fa-4d78-a7fa-ea6e23d19c25\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-v6xtb"
Apr 17 15:19:20.900144 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:19:20.900118 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 17 15:19:20.900183 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:19:20.900167 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d8cf2c1-33fa-4d78-a7fa-ea6e23d19c25-samples-operator-tls podName:7d8cf2c1-33fa-4d78-a7fa-ea6e23d19c25 nodeName:}" failed. No retries permitted until 2026-04-17 15:19:24.900153868 +0000 UTC m=+129.441818713 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/7d8cf2c1-33fa-4d78-a7fa-ea6e23d19c25-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-v6xtb" (UID: "7d8cf2c1-33fa-4d78-a7fa-ea6e23d19c25") : secret "samples-operator-tls" not found
Apr 17 15:19:21.574848 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:21.574821 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fdxgz_85072088-8af2-4219-80f7-6a18460c13cf/console-operator/1.log"
Apr 17 15:19:21.575266 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:21.575171 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fdxgz_85072088-8af2-4219-80f7-6a18460c13cf/console-operator/0.log"
Apr 17 15:19:21.575266 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:21.575210 2577 generic.go:358] "Generic (PLEG): container finished" podID="85072088-8af2-4219-80f7-6a18460c13cf" containerID="b58a28d6ade21b0900fab43a3c3ce8bb5f3a82f7583f9a08d79d721dbfaddfc6" exitCode=255
Apr 17 15:19:21.575372 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:21.575320 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-fdxgz" event={"ID":"85072088-8af2-4219-80f7-6a18460c13cf","Type":"ContainerDied","Data":"b58a28d6ade21b0900fab43a3c3ce8bb5f3a82f7583f9a08d79d721dbfaddfc6"}
Apr 17 15:19:21.575423 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:21.575370 2577 scope.go:117] "RemoveContainer" containerID="cff006c66127f50baddaf4a1ce41eb17dec0440c82622e81e70c2f2135c452e3"
Apr 17 15:19:21.575510 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:21.575491 2577 scope.go:117] "RemoveContainer" containerID="b58a28d6ade21b0900fab43a3c3ce8bb5f3a82f7583f9a08d79d721dbfaddfc6"
Apr 17 15:19:21.575725 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:19:21.575705 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-fdxgz_openshift-console-operator(85072088-8af2-4219-80f7-6a18460c13cf)\"" pod="openshift-console-operator/console-operator-9d4b6777b-fdxgz" podUID="85072088-8af2-4219-80f7-6a18460c13cf"
Apr 17 15:19:22.579682 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:22.579653 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fdxgz_85072088-8af2-4219-80f7-6a18460c13cf/console-operator/1.log"
Apr 17 15:19:22.580111 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:22.580023 2577 scope.go:117] "RemoveContainer" containerID="b58a28d6ade21b0900fab43a3c3ce8bb5f3a82f7583f9a08d79d721dbfaddfc6"
Apr 17 15:19:22.580286 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:19:22.580265 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-fdxgz_openshift-console-operator(85072088-8af2-4219-80f7-6a18460c13cf)\"" pod="openshift-console-operator/console-operator-9d4b6777b-fdxgz" podUID="85072088-8af2-4219-80f7-6a18460c13cf"
Apr 17 15:19:24.729314 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:24.729285 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4445020e-d73c-4a2d-9f40-1c3fc286490e-metrics-certs\") pod \"network-metrics-daemon-j7zl6\" (UID: \"4445020e-d73c-4a2d-9f40-1c3fc286490e\") " pod="openshift-multus/network-metrics-daemon-j7zl6"
Apr 17 15:19:24.729676 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:19:24.729427 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 15:19:24.729676 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:19:24.729489 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4445020e-d73c-4a2d-9f40-1c3fc286490e-metrics-certs podName:4445020e-d73c-4a2d-9f40-1c3fc286490e nodeName:}" failed. No retries permitted until 2026-04-17 15:21:26.729474256 +0000 UTC m=+251.271139105 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4445020e-d73c-4a2d-9f40-1c3fc286490e-metrics-certs") pod "network-metrics-daemon-j7zl6" (UID: "4445020e-d73c-4a2d-9f40-1c3fc286490e") : secret "metrics-daemon-secret" not found
Apr 17 15:19:24.931579 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:24.931542 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7d8cf2c1-33fa-4d78-a7fa-ea6e23d19c25-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-v6xtb\" (UID: \"7d8cf2c1-33fa-4d78-a7fa-ea6e23d19c25\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-v6xtb"
Apr 17 15:19:24.931769 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:19:24.931696 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 17 15:19:24.931769 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:19:24.931759 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d8cf2c1-33fa-4d78-a7fa-ea6e23d19c25-samples-operator-tls podName:7d8cf2c1-33fa-4d78-a7fa-ea6e23d19c25 nodeName:}" failed. No retries permitted until 2026-04-17 15:19:32.93174235 +0000 UTC m=+137.473407225 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/7d8cf2c1-33fa-4d78-a7fa-ea6e23d19c25-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-v6xtb" (UID: "7d8cf2c1-33fa-4d78-a7fa-ea6e23d19c25") : secret "samples-operator-tls" not found
Apr 17 15:19:27.491568 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:27.491528 2577 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-fdxgz"
Apr 17 15:19:27.491568 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:27.491573 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-fdxgz"
Apr 17 15:19:27.491985 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:27.491896 2577 scope.go:117] "RemoveContainer" containerID="b58a28d6ade21b0900fab43a3c3ce8bb5f3a82f7583f9a08d79d721dbfaddfc6"
Apr 17 15:19:27.492103 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:19:27.492082 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-fdxgz_openshift-console-operator(85072088-8af2-4219-80f7-6a18460c13cf)\"" pod="openshift-console-operator/console-operator-9d4b6777b-fdxgz" podUID="85072088-8af2-4219-80f7-6a18460c13cf"
Apr 17 15:19:32.994936 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:32.994908 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7d8cf2c1-33fa-4d78-a7fa-ea6e23d19c25-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-v6xtb\" (UID: \"7d8cf2c1-33fa-4d78-a7fa-ea6e23d19c25\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-v6xtb"
Apr 17 15:19:32.997340 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:32.997319 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7d8cf2c1-33fa-4d78-a7fa-ea6e23d19c25-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-v6xtb\" (UID: \"7d8cf2c1-33fa-4d78-a7fa-ea6e23d19c25\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-v6xtb"
Apr 17 15:19:33.094584 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:33.094558 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-v6xtb"
Apr 17 15:19:33.210001 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:33.209971 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-v6xtb"]
Apr 17 15:19:33.608049 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:33.608006 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-v6xtb" event={"ID":"7d8cf2c1-33fa-4d78-a7fa-ea6e23d19c25","Type":"ContainerStarted","Data":"d4fe8863bc5e7b1003744d6135b335e762615b43555cb9dc6bd53f4618d72d70"}
Apr 17 15:19:35.615021 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:35.614988 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-v6xtb" event={"ID":"7d8cf2c1-33fa-4d78-a7fa-ea6e23d19c25","Type":"ContainerStarted","Data":"130b1faea646a6f5dc93e3b0da5d6db14c6305957c608e55d53034190f3e1796"}
Apr 17 15:19:35.615378 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:35.615026 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-v6xtb" event={"ID":"7d8cf2c1-33fa-4d78-a7fa-ea6e23d19c25","Type":"ContainerStarted","Data":"7c7e64389f051777892ee9e8ab00a9ae6ee9cb8e10cc94e52e8155ca37e9bf93"}
Apr 17 15:19:35.629946 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:35.629903 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-v6xtb" podStartSLOduration=16.889583974 podStartE2EDuration="18.629891359s" podCreationTimestamp="2026-04-17 15:19:17 +0000 UTC" firstStartedPulling="2026-04-17 15:19:33.25165093 +0000 UTC m=+137.793315776" lastFinishedPulling="2026-04-17 15:19:34.991958314 +0000 UTC m=+139.533623161" observedRunningTime="2026-04-17 15:19:35.628707377 +0000 UTC m=+140.170372249" watchObservedRunningTime="2026-04-17 15:19:35.629891359 +0000 UTC m=+140.171556227"
Apr 17 15:19:42.057984 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:42.057957 2577 scope.go:117] "RemoveContainer" containerID="b58a28d6ade21b0900fab43a3c3ce8bb5f3a82f7583f9a08d79d721dbfaddfc6"
Apr 17 15:19:42.635078 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:42.635048 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fdxgz_85072088-8af2-4219-80f7-6a18460c13cf/console-operator/2.log"
Apr 17 15:19:42.635435 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:42.635415 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fdxgz_85072088-8af2-4219-80f7-6a18460c13cf/console-operator/1.log"
Apr 17 15:19:42.635530 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:42.635457 2577 generic.go:358] "Generic (PLEG): container finished" podID="85072088-8af2-4219-80f7-6a18460c13cf" containerID="946a3196d08658242b10d5b451d1a8cae7e6349670453fb64834c15773de0191" exitCode=255
Apr 17 15:19:42.635530 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:42.635515 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-fdxgz" event={"ID":"85072088-8af2-4219-80f7-6a18460c13cf","Type":"ContainerDied","Data":"946a3196d08658242b10d5b451d1a8cae7e6349670453fb64834c15773de0191"}
Apr 17 15:19:42.635627 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:42.635556 2577 scope.go:117] "RemoveContainer" containerID="b58a28d6ade21b0900fab43a3c3ce8bb5f3a82f7583f9a08d79d721dbfaddfc6"
Apr 17 15:19:42.635816 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:42.635797 2577 scope.go:117] "RemoveContainer" containerID="946a3196d08658242b10d5b451d1a8cae7e6349670453fb64834c15773de0191"
Apr 17 15:19:42.635990 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:19:42.635969 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-fdxgz_openshift-console-operator(85072088-8af2-4219-80f7-6a18460c13cf)\"" pod="openshift-console-operator/console-operator-9d4b6777b-fdxgz" podUID="85072088-8af2-4219-80f7-6a18460c13cf"
Apr 17 15:19:43.639383 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:43.639354 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fdxgz_85072088-8af2-4219-80f7-6a18460c13cf/console-operator/2.log"
Apr 17 15:19:46.515182 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.515153 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-k9sfm"]
Apr 17 15:19:46.519089 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.519074 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-k9sfm"
Apr 17 15:19:46.521401 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.521382 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-rp4bk\""
Apr 17 15:19:46.528591 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.528571 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-k9sfm"]
Apr 17 15:19:46.590197 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.590157 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbccq\" (UniqueName: \"kubernetes.io/projected/8f30a8c1-574b-4685-88d7-2b714bdf287f-kube-api-access-fbccq\") pod \"network-check-source-8894fc9bd-k9sfm\" (UID: \"8f30a8c1-574b-4685-88d7-2b714bdf287f\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-k9sfm"
Apr 17 15:19:46.604547 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.604518 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-997f69ccf-rnb69"]
Apr 17 15:19:46.604712 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:19:46.604692 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-997f69ccf-rnb69" podUID="74f902bb-0a1c-46ed-bb9a-32e7a254c7b6"
Apr 17 15:19:46.615764 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.615741 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-t5wmn"]
Apr 17 15:19:46.618665 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.618636 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-t5wmn"
Apr 17 15:19:46.620796 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.620776 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-pbqj6\""
Apr 17 15:19:46.620796 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.620777 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 17 15:19:46.628833 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.628814 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-89cc9"]
Apr 17 15:19:46.631696 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.631679 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-t5wmn"]
Apr 17 15:19:46.631786 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.631777 2577 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-89cc9" Apr 17 15:19:46.634346 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.634282 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 15:19:46.634346 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.634332 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 15:19:46.634500 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.634314 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 15:19:46.634500 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.634467 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 15:19:46.634684 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.634665 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-ctp72\"" Apr 17 15:19:46.643497 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.643478 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-89cc9"] Apr 17 15:19:46.646465 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.646444 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-997f69ccf-rnb69" Apr 17 15:19:46.650817 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.650801 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-997f69ccf-rnb69" Apr 17 15:19:46.690745 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.690707 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-registry-certificates\") pod \"74f902bb-0a1c-46ed-bb9a-32e7a254c7b6\" (UID: \"74f902bb-0a1c-46ed-bb9a-32e7a254c7b6\") " Apr 17 15:19:46.690909 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.690753 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-installation-pull-secrets\") pod \"74f902bb-0a1c-46ed-bb9a-32e7a254c7b6\" (UID: \"74f902bb-0a1c-46ed-bb9a-32e7a254c7b6\") " Apr 17 15:19:46.690909 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.690868 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-trusted-ca\") pod \"74f902bb-0a1c-46ed-bb9a-32e7a254c7b6\" (UID: \"74f902bb-0a1c-46ed-bb9a-32e7a254c7b6\") " Apr 17 15:19:46.690991 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.690918 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-ca-trust-extracted\") pod \"74f902bb-0a1c-46ed-bb9a-32e7a254c7b6\" (UID: \"74f902bb-0a1c-46ed-bb9a-32e7a254c7b6\") " Apr 17 15:19:46.690991 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.690970 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5d6jc\" (UniqueName: \"kubernetes.io/projected/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-kube-api-access-5d6jc\") pod \"74f902bb-0a1c-46ed-bb9a-32e7a254c7b6\" (UID: \"74f902bb-0a1c-46ed-bb9a-32e7a254c7b6\") " Apr 
17 15:19:46.691098 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.691001 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-bound-sa-token\") pod \"74f902bb-0a1c-46ed-bb9a-32e7a254c7b6\" (UID: \"74f902bb-0a1c-46ed-bb9a-32e7a254c7b6\") " Apr 17 15:19:46.691098 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.691018 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "74f902bb-0a1c-46ed-bb9a-32e7a254c7b6" (UID: "74f902bb-0a1c-46ed-bb9a-32e7a254c7b6"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 15:19:46.691098 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.691070 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-image-registry-private-configuration\") pod \"74f902bb-0a1c-46ed-bb9a-32e7a254c7b6\" (UID: \"74f902bb-0a1c-46ed-bb9a-32e7a254c7b6\") " Apr 17 15:19:46.691248 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.691160 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/19f0cf6c-7fd6-48bc-b7f5-6fbd50fd92e4-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-89cc9\" (UID: \"19f0cf6c-7fd6-48bc-b7f5-6fbd50fd92e4\") " pod="openshift-insights/insights-runtime-extractor-89cc9" Apr 17 15:19:46.691248 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.691166 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") 
pod "74f902bb-0a1c-46ed-bb9a-32e7a254c7b6" (UID: "74f902bb-0a1c-46ed-bb9a-32e7a254c7b6"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 15:19:46.691248 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.691220 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/19f0cf6c-7fd6-48bc-b7f5-6fbd50fd92e4-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-89cc9\" (UID: \"19f0cf6c-7fd6-48bc-b7f5-6fbd50fd92e4\") " pod="openshift-insights/insights-runtime-extractor-89cc9" Apr 17 15:19:46.691248 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.691229 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "74f902bb-0a1c-46ed-bb9a-32e7a254c7b6" (UID: "74f902bb-0a1c-46ed-bb9a-32e7a254c7b6"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 15:19:46.691445 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.691334 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/19f0cf6c-7fd6-48bc-b7f5-6fbd50fd92e4-data-volume\") pod \"insights-runtime-extractor-89cc9\" (UID: \"19f0cf6c-7fd6-48bc-b7f5-6fbd50fd92e4\") " pod="openshift-insights/insights-runtime-extractor-89cc9" Apr 17 15:19:46.691445 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.691376 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fbccq\" (UniqueName: \"kubernetes.io/projected/8f30a8c1-574b-4685-88d7-2b714bdf287f-kube-api-access-fbccq\") pod \"network-check-source-8894fc9bd-k9sfm\" (UID: \"8f30a8c1-574b-4685-88d7-2b714bdf287f\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-k9sfm" Apr 17 15:19:46.691445 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.691437 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/19f0cf6c-7fd6-48bc-b7f5-6fbd50fd92e4-crio-socket\") pod \"insights-runtime-extractor-89cc9\" (UID: \"19f0cf6c-7fd6-48bc-b7f5-6fbd50fd92e4\") " pod="openshift-insights/insights-runtime-extractor-89cc9" Apr 17 15:19:46.691585 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.691470 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk2nz\" (UniqueName: \"kubernetes.io/projected/19f0cf6c-7fd6-48bc-b7f5-6fbd50fd92e4-kube-api-access-bk2nz\") pod \"insights-runtime-extractor-89cc9\" (UID: \"19f0cf6c-7fd6-48bc-b7f5-6fbd50fd92e4\") " pod="openshift-insights/insights-runtime-extractor-89cc9" Apr 17 15:19:46.691585 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.691490 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/93545302-4bda-45c5-9cb7-2e69c294a279-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-t5wmn\" (UID: \"93545302-4bda-45c5-9cb7-2e69c294a279\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-t5wmn" Apr 17 15:19:46.691585 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.691533 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-trusted-ca\") on node \"ip-10-0-131-29.ec2.internal\" DevicePath \"\"" Apr 17 15:19:46.691585 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.691560 2577 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-ca-trust-extracted\") on node \"ip-10-0-131-29.ec2.internal\" DevicePath \"\"" Apr 17 15:19:46.691585 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.691577 2577 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-registry-certificates\") on node \"ip-10-0-131-29.ec2.internal\" DevicePath \"\"" Apr 17 15:19:46.693196 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.693165 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "74f902bb-0a1c-46ed-bb9a-32e7a254c7b6" (UID: "74f902bb-0a1c-46ed-bb9a-32e7a254c7b6"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 15:19:46.693365 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.693347 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "74f902bb-0a1c-46ed-bb9a-32e7a254c7b6" (UID: "74f902bb-0a1c-46ed-bb9a-32e7a254c7b6"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 15:19:46.693416 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.693390 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "74f902bb-0a1c-46ed-bb9a-32e7a254c7b6" (UID: "74f902bb-0a1c-46ed-bb9a-32e7a254c7b6"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 15:19:46.693559 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.693539 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-kube-api-access-5d6jc" (OuterVolumeSpecName: "kube-api-access-5d6jc") pod "74f902bb-0a1c-46ed-bb9a-32e7a254c7b6" (UID: "74f902bb-0a1c-46ed-bb9a-32e7a254c7b6"). InnerVolumeSpecName "kube-api-access-5d6jc". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 15:19:46.702688 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.702659 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbccq\" (UniqueName: \"kubernetes.io/projected/8f30a8c1-574b-4685-88d7-2b714bdf287f-kube-api-access-fbccq\") pod \"network-check-source-8894fc9bd-k9sfm\" (UID: \"8f30a8c1-574b-4685-88d7-2b714bdf287f\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-k9sfm" Apr 17 15:19:46.718169 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.718145 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-664f745c79-lz8j5"] Apr 17 15:19:46.722253 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.722239 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-664f745c79-lz8j5" Apr 17 15:19:46.732890 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.732859 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-664f745c79-lz8j5"] Apr 17 15:19:46.792519 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.792436 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3b884775-56c3-4983-a77d-ac6ef0438ae4-registry-tls\") pod \"image-registry-664f745c79-lz8j5\" (UID: \"3b884775-56c3-4983-a77d-ac6ef0438ae4\") " pod="openshift-image-registry/image-registry-664f745c79-lz8j5" Apr 17 15:19:46.792519 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.792473 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3b884775-56c3-4983-a77d-ac6ef0438ae4-installation-pull-secrets\") pod \"image-registry-664f745c79-lz8j5\" (UID: \"3b884775-56c3-4983-a77d-ac6ef0438ae4\") " 
pod="openshift-image-registry/image-registry-664f745c79-lz8j5" Apr 17 15:19:46.792519 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.792502 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3b884775-56c3-4983-a77d-ac6ef0438ae4-bound-sa-token\") pod \"image-registry-664f745c79-lz8j5\" (UID: \"3b884775-56c3-4983-a77d-ac6ef0438ae4\") " pod="openshift-image-registry/image-registry-664f745c79-lz8j5" Apr 17 15:19:46.792729 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.792567 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/19f0cf6c-7fd6-48bc-b7f5-6fbd50fd92e4-data-volume\") pod \"insights-runtime-extractor-89cc9\" (UID: \"19f0cf6c-7fd6-48bc-b7f5-6fbd50fd92e4\") " pod="openshift-insights/insights-runtime-extractor-89cc9" Apr 17 15:19:46.792729 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.792614 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3b884775-56c3-4983-a77d-ac6ef0438ae4-image-registry-private-configuration\") pod \"image-registry-664f745c79-lz8j5\" (UID: \"3b884775-56c3-4983-a77d-ac6ef0438ae4\") " pod="openshift-image-registry/image-registry-664f745c79-lz8j5" Apr 17 15:19:46.792729 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.792635 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3b884775-56c3-4983-a77d-ac6ef0438ae4-registry-certificates\") pod \"image-registry-664f745c79-lz8j5\" (UID: \"3b884775-56c3-4983-a77d-ac6ef0438ae4\") " pod="openshift-image-registry/image-registry-664f745c79-lz8j5" Apr 17 15:19:46.792729 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.792674 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/19f0cf6c-7fd6-48bc-b7f5-6fbd50fd92e4-crio-socket\") pod \"insights-runtime-extractor-89cc9\" (UID: \"19f0cf6c-7fd6-48bc-b7f5-6fbd50fd92e4\") " pod="openshift-insights/insights-runtime-extractor-89cc9" Apr 17 15:19:46.792729 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.792704 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bk2nz\" (UniqueName: \"kubernetes.io/projected/19f0cf6c-7fd6-48bc-b7f5-6fbd50fd92e4-kube-api-access-bk2nz\") pod \"insights-runtime-extractor-89cc9\" (UID: \"19f0cf6c-7fd6-48bc-b7f5-6fbd50fd92e4\") " pod="openshift-insights/insights-runtime-extractor-89cc9" Apr 17 15:19:46.792729 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.792722 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/93545302-4bda-45c5-9cb7-2e69c294a279-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-t5wmn\" (UID: \"93545302-4bda-45c5-9cb7-2e69c294a279\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-t5wmn" Apr 17 15:19:46.792950 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.792741 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/19f0cf6c-7fd6-48bc-b7f5-6fbd50fd92e4-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-89cc9\" (UID: \"19f0cf6c-7fd6-48bc-b7f5-6fbd50fd92e4\") " pod="openshift-insights/insights-runtime-extractor-89cc9" Apr 17 15:19:46.792950 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.792754 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/19f0cf6c-7fd6-48bc-b7f5-6fbd50fd92e4-crio-socket\") pod \"insights-runtime-extractor-89cc9\" (UID: \"19f0cf6c-7fd6-48bc-b7f5-6fbd50fd92e4\") " 
pod="openshift-insights/insights-runtime-extractor-89cc9" Apr 17 15:19:46.792950 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.792785 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/19f0cf6c-7fd6-48bc-b7f5-6fbd50fd92e4-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-89cc9\" (UID: \"19f0cf6c-7fd6-48bc-b7f5-6fbd50fd92e4\") " pod="openshift-insights/insights-runtime-extractor-89cc9" Apr 17 15:19:46.792950 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.792883 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/19f0cf6c-7fd6-48bc-b7f5-6fbd50fd92e4-data-volume\") pod \"insights-runtime-extractor-89cc9\" (UID: \"19f0cf6c-7fd6-48bc-b7f5-6fbd50fd92e4\") " pod="openshift-insights/insights-runtime-extractor-89cc9" Apr 17 15:19:46.792950 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.792897 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pklfm\" (UniqueName: \"kubernetes.io/projected/3b884775-56c3-4983-a77d-ac6ef0438ae4-kube-api-access-pklfm\") pod \"image-registry-664f745c79-lz8j5\" (UID: \"3b884775-56c3-4983-a77d-ac6ef0438ae4\") " pod="openshift-image-registry/image-registry-664f745c79-lz8j5" Apr 17 15:19:46.793208 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.792992 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3b884775-56c3-4983-a77d-ac6ef0438ae4-trusted-ca\") pod \"image-registry-664f745c79-lz8j5\" (UID: \"3b884775-56c3-4983-a77d-ac6ef0438ae4\") " pod="openshift-image-registry/image-registry-664f745c79-lz8j5" Apr 17 15:19:46.793208 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.793019 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3b884775-56c3-4983-a77d-ac6ef0438ae4-ca-trust-extracted\") pod \"image-registry-664f745c79-lz8j5\" (UID: \"3b884775-56c3-4983-a77d-ac6ef0438ae4\") " pod="openshift-image-registry/image-registry-664f745c79-lz8j5" Apr 17 15:19:46.793208 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.793107 2577 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-bound-sa-token\") on node \"ip-10-0-131-29.ec2.internal\" DevicePath \"\"" Apr 17 15:19:46.793208 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.793127 2577 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-image-registry-private-configuration\") on node \"ip-10-0-131-29.ec2.internal\" DevicePath \"\"" Apr 17 15:19:46.793208 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.793143 2577 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-installation-pull-secrets\") on node \"ip-10-0-131-29.ec2.internal\" DevicePath \"\"" Apr 17 15:19:46.793208 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.793162 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5d6jc\" (UniqueName: \"kubernetes.io/projected/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-kube-api-access-5d6jc\") on node \"ip-10-0-131-29.ec2.internal\" DevicePath \"\"" Apr 17 15:19:46.793448 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.793399 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/19f0cf6c-7fd6-48bc-b7f5-6fbd50fd92e4-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-89cc9\" (UID: \"19f0cf6c-7fd6-48bc-b7f5-6fbd50fd92e4\") " 
pod="openshift-insights/insights-runtime-extractor-89cc9" Apr 17 15:19:46.795301 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.795277 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/19f0cf6c-7fd6-48bc-b7f5-6fbd50fd92e4-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-89cc9\" (UID: \"19f0cf6c-7fd6-48bc-b7f5-6fbd50fd92e4\") " pod="openshift-insights/insights-runtime-extractor-89cc9" Apr 17 15:19:46.795416 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.795391 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/93545302-4bda-45c5-9cb7-2e69c294a279-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-t5wmn\" (UID: \"93545302-4bda-45c5-9cb7-2e69c294a279\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-t5wmn" Apr 17 15:19:46.801342 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.801319 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk2nz\" (UniqueName: \"kubernetes.io/projected/19f0cf6c-7fd6-48bc-b7f5-6fbd50fd92e4-kube-api-access-bk2nz\") pod \"insights-runtime-extractor-89cc9\" (UID: \"19f0cf6c-7fd6-48bc-b7f5-6fbd50fd92e4\") " pod="openshift-insights/insights-runtime-extractor-89cc9" Apr 17 15:19:46.827157 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.827130 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-k9sfm" Apr 17 15:19:46.893596 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.893564 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3b884775-56c3-4983-a77d-ac6ef0438ae4-image-registry-private-configuration\") pod \"image-registry-664f745c79-lz8j5\" (UID: \"3b884775-56c3-4983-a77d-ac6ef0438ae4\") " pod="openshift-image-registry/image-registry-664f745c79-lz8j5" Apr 17 15:19:46.893774 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.893619 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3b884775-56c3-4983-a77d-ac6ef0438ae4-registry-certificates\") pod \"image-registry-664f745c79-lz8j5\" (UID: \"3b884775-56c3-4983-a77d-ac6ef0438ae4\") " pod="openshift-image-registry/image-registry-664f745c79-lz8j5" Apr 17 15:19:46.893774 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.893697 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pklfm\" (UniqueName: \"kubernetes.io/projected/3b884775-56c3-4983-a77d-ac6ef0438ae4-kube-api-access-pklfm\") pod \"image-registry-664f745c79-lz8j5\" (UID: \"3b884775-56c3-4983-a77d-ac6ef0438ae4\") " pod="openshift-image-registry/image-registry-664f745c79-lz8j5" Apr 17 15:19:46.893774 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.893746 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3b884775-56c3-4983-a77d-ac6ef0438ae4-trusted-ca\") pod \"image-registry-664f745c79-lz8j5\" (UID: \"3b884775-56c3-4983-a77d-ac6ef0438ae4\") " pod="openshift-image-registry/image-registry-664f745c79-lz8j5" Apr 17 15:19:46.893774 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.893770 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3b884775-56c3-4983-a77d-ac6ef0438ae4-ca-trust-extracted\") pod \"image-registry-664f745c79-lz8j5\" (UID: \"3b884775-56c3-4983-a77d-ac6ef0438ae4\") " pod="openshift-image-registry/image-registry-664f745c79-lz8j5"
Apr 17 15:19:46.893974 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.893788 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3b884775-56c3-4983-a77d-ac6ef0438ae4-registry-tls\") pod \"image-registry-664f745c79-lz8j5\" (UID: \"3b884775-56c3-4983-a77d-ac6ef0438ae4\") " pod="openshift-image-registry/image-registry-664f745c79-lz8j5"
Apr 17 15:19:46.893974 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.893805 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3b884775-56c3-4983-a77d-ac6ef0438ae4-installation-pull-secrets\") pod \"image-registry-664f745c79-lz8j5\" (UID: \"3b884775-56c3-4983-a77d-ac6ef0438ae4\") " pod="openshift-image-registry/image-registry-664f745c79-lz8j5"
Apr 17 15:19:46.893974 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.893820 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3b884775-56c3-4983-a77d-ac6ef0438ae4-bound-sa-token\") pod \"image-registry-664f745c79-lz8j5\" (UID: \"3b884775-56c3-4983-a77d-ac6ef0438ae4\") " pod="openshift-image-registry/image-registry-664f745c79-lz8j5"
Apr 17 15:19:46.894212 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.894186 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3b884775-56c3-4983-a77d-ac6ef0438ae4-ca-trust-extracted\") pod \"image-registry-664f745c79-lz8j5\" (UID: \"3b884775-56c3-4983-a77d-ac6ef0438ae4\") " pod="openshift-image-registry/image-registry-664f745c79-lz8j5"
Apr 17 15:19:46.894542 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.894505 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3b884775-56c3-4983-a77d-ac6ef0438ae4-registry-certificates\") pod \"image-registry-664f745c79-lz8j5\" (UID: \"3b884775-56c3-4983-a77d-ac6ef0438ae4\") " pod="openshift-image-registry/image-registry-664f745c79-lz8j5"
Apr 17 15:19:46.894866 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.894848 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3b884775-56c3-4983-a77d-ac6ef0438ae4-trusted-ca\") pod \"image-registry-664f745c79-lz8j5\" (UID: \"3b884775-56c3-4983-a77d-ac6ef0438ae4\") " pod="openshift-image-registry/image-registry-664f745c79-lz8j5"
Apr 17 15:19:46.896642 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.896619 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3b884775-56c3-4983-a77d-ac6ef0438ae4-image-registry-private-configuration\") pod \"image-registry-664f745c79-lz8j5\" (UID: \"3b884775-56c3-4983-a77d-ac6ef0438ae4\") " pod="openshift-image-registry/image-registry-664f745c79-lz8j5"
Apr 17 15:19:46.897083 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.897066 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3b884775-56c3-4983-a77d-ac6ef0438ae4-installation-pull-secrets\") pod \"image-registry-664f745c79-lz8j5\" (UID: \"3b884775-56c3-4983-a77d-ac6ef0438ae4\") " pod="openshift-image-registry/image-registry-664f745c79-lz8j5"
Apr 17 15:19:46.897146 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.897128 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3b884775-56c3-4983-a77d-ac6ef0438ae4-registry-tls\") pod \"image-registry-664f745c79-lz8j5\" (UID: \"3b884775-56c3-4983-a77d-ac6ef0438ae4\") " pod="openshift-image-registry/image-registry-664f745c79-lz8j5"
Apr 17 15:19:46.902652 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.902624 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3b884775-56c3-4983-a77d-ac6ef0438ae4-bound-sa-token\") pod \"image-registry-664f745c79-lz8j5\" (UID: \"3b884775-56c3-4983-a77d-ac6ef0438ae4\") " pod="openshift-image-registry/image-registry-664f745c79-lz8j5"
Apr 17 15:19:46.903153 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.902915 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pklfm\" (UniqueName: \"kubernetes.io/projected/3b884775-56c3-4983-a77d-ac6ef0438ae4-kube-api-access-pklfm\") pod \"image-registry-664f745c79-lz8j5\" (UID: \"3b884775-56c3-4983-a77d-ac6ef0438ae4\") " pod="openshift-image-registry/image-registry-664f745c79-lz8j5"
Apr 17 15:19:46.926851 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.926819 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-t5wmn"
Apr 17 15:19:46.940647 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.940617 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-89cc9"
Apr 17 15:19:46.948997 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:46.948972 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-k9sfm"]
Apr 17 15:19:46.951861 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:19:46.951833 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f30a8c1_574b_4685_88d7_2b714bdf287f.slice/crio-3fdab7c4b1604692170496c4254b183e910a9901b118335eba1ad0625f1b01b1 WatchSource:0}: Error finding container 3fdab7c4b1604692170496c4254b183e910a9901b118335eba1ad0625f1b01b1: Status 404 returned error can't find the container with id 3fdab7c4b1604692170496c4254b183e910a9901b118335eba1ad0625f1b01b1
Apr 17 15:19:47.033226 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:47.033013 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-5jck6\""
Apr 17 15:19:47.041320 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:47.041273 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-664f745c79-lz8j5"
Apr 17 15:19:47.062762 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:47.062728 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-t5wmn"]
Apr 17 15:19:47.067489 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:19:47.067442 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93545302_4bda_45c5_9cb7_2e69c294a279.slice/crio-515e4f8141713374b8b6ce572f9aaa86a86e49618a514e4a9a1f895851867e61 WatchSource:0}: Error finding container 515e4f8141713374b8b6ce572f9aaa86a86e49618a514e4a9a1f895851867e61: Status 404 returned error can't find the container with id 515e4f8141713374b8b6ce572f9aaa86a86e49618a514e4a9a1f895851867e61
Apr 17 15:19:47.086939 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:47.086787 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-89cc9"]
Apr 17 15:19:47.090005 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:19:47.089963 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19f0cf6c_7fd6_48bc_b7f5_6fbd50fd92e4.slice/crio-157ac3e9f7afbb4563a1ddcafa00e4079763f3cef2c23ec28b7fd0060c125cca WatchSource:0}: Error finding container 157ac3e9f7afbb4563a1ddcafa00e4079763f3cef2c23ec28b7fd0060c125cca: Status 404 returned error can't find the container with id 157ac3e9f7afbb4563a1ddcafa00e4079763f3cef2c23ec28b7fd0060c125cca
Apr 17 15:19:47.195553 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:47.195496 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-664f745c79-lz8j5"]
Apr 17 15:19:47.199834 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:19:47.199807 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b884775_56c3_4983_a77d_ac6ef0438ae4.slice/crio-6fd42f161b24e691f06f0f29ea9d1994fb383883918bb282a5350806cd2e2ce5 WatchSource:0}: Error finding container 6fd42f161b24e691f06f0f29ea9d1994fb383883918bb282a5350806cd2e2ce5: Status 404 returned error can't find the container with id 6fd42f161b24e691f06f0f29ea9d1994fb383883918bb282a5350806cd2e2ce5
Apr 17 15:19:47.491576 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:47.491537 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-fdxgz"
Apr 17 15:19:47.491576 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:47.491578 2577 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-fdxgz"
Apr 17 15:19:47.491965 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:47.491950 2577 scope.go:117] "RemoveContainer" containerID="946a3196d08658242b10d5b451d1a8cae7e6349670453fb64834c15773de0191"
Apr 17 15:19:47.492187 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:19:47.492170 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-fdxgz_openshift-console-operator(85072088-8af2-4219-80f7-6a18460c13cf)\"" pod="openshift-console-operator/console-operator-9d4b6777b-fdxgz" podUID="85072088-8af2-4219-80f7-6a18460c13cf"
Apr 17 15:19:47.651990 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:47.651925 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-t5wmn" event={"ID":"93545302-4bda-45c5-9cb7-2e69c294a279","Type":"ContainerStarted","Data":"515e4f8141713374b8b6ce572f9aaa86a86e49618a514e4a9a1f895851867e61"}
Apr 17 15:19:47.653975 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:47.653946 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-k9sfm" event={"ID":"8f30a8c1-574b-4685-88d7-2b714bdf287f","Type":"ContainerStarted","Data":"c50a79343567ed08f8f72c0949cff85c822193e657049f2c6e0ed6f6ac8c1b2d"}
Apr 17 15:19:47.654109 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:47.653982 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-k9sfm" event={"ID":"8f30a8c1-574b-4685-88d7-2b714bdf287f","Type":"ContainerStarted","Data":"3fdab7c4b1604692170496c4254b183e910a9901b118335eba1ad0625f1b01b1"}
Apr 17 15:19:47.663408 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:47.663377 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-664f745c79-lz8j5" event={"ID":"3b884775-56c3-4983-a77d-ac6ef0438ae4","Type":"ContainerStarted","Data":"b7c9492e32fc7e5ca58e1305da3573a6709c95b516eba6ae2f8dfaa9702d87ee"}
Apr 17 15:19:47.663570 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:47.663419 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-664f745c79-lz8j5" event={"ID":"3b884775-56c3-4983-a77d-ac6ef0438ae4","Type":"ContainerStarted","Data":"6fd42f161b24e691f06f0f29ea9d1994fb383883918bb282a5350806cd2e2ce5"}
Apr 17 15:19:47.663570 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:47.663499 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-664f745c79-lz8j5"
Apr 17 15:19:47.666145 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:47.666110 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-89cc9" event={"ID":"19f0cf6c-7fd6-48bc-b7f5-6fbd50fd92e4","Type":"ContainerStarted","Data":"38856364ab39408a983341e5e3cf8a4eb4de12fe8517ed8271f7556b67fc0ed0"}
Apr 17 15:19:47.666348 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:47.666331 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-89cc9" event={"ID":"19f0cf6c-7fd6-48bc-b7f5-6fbd50fd92e4","Type":"ContainerStarted","Data":"157ac3e9f7afbb4563a1ddcafa00e4079763f3cef2c23ec28b7fd0060c125cca"}
Apr 17 15:19:47.666473 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:47.666161 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-997f69ccf-rnb69"
Apr 17 15:19:47.675946 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:47.675901 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-k9sfm" podStartSLOduration=1.675873789 podStartE2EDuration="1.675873789s" podCreationTimestamp="2026-04-17 15:19:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 15:19:47.67581713 +0000 UTC m=+152.217481998" watchObservedRunningTime="2026-04-17 15:19:47.675873789 +0000 UTC m=+152.217538659"
Apr 17 15:19:47.703918 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:47.703872 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-997f69ccf-rnb69"]
Apr 17 15:19:47.707489 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:47.707462 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-997f69ccf-rnb69"]
Apr 17 15:19:47.724715 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:47.724410 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-664f745c79-lz8j5" podStartSLOduration=1.72439304 podStartE2EDuration="1.72439304s" podCreationTimestamp="2026-04-17 15:19:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 15:19:47.723083973 +0000 UTC m=+152.264748841" watchObservedRunningTime="2026-04-17 15:19:47.72439304 +0000 UTC m=+152.266057900"
Apr 17 15:19:47.803696 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:47.803628 2577 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6-registry-tls\") on node \"ip-10-0-131-29.ec2.internal\" DevicePath \"\""
Apr 17 15:19:48.061517 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:48.061427 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74f902bb-0a1c-46ed-bb9a-32e7a254c7b6" path="/var/lib/kubelet/pods/74f902bb-0a1c-46ed-bb9a-32e7a254c7b6/volumes"
Apr 17 15:19:48.670657 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:48.670623 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-89cc9" event={"ID":"19f0cf6c-7fd6-48bc-b7f5-6fbd50fd92e4","Type":"ContainerStarted","Data":"d33324cd8587ae43bb7fea1e6a4f8d273cec222a8f0099ba0f52281d00c0f4dd"}
Apr 17 15:19:48.672111 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:48.672078 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-t5wmn" event={"ID":"93545302-4bda-45c5-9cb7-2e69c294a279","Type":"ContainerStarted","Data":"3ff922a233335d5441f7b3babab1b59c14575c12a6971cef54651183b884c287"}
Apr 17 15:19:48.685304 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:48.685257 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-t5wmn" podStartSLOduration=1.612599095 podStartE2EDuration="2.685246796s" podCreationTimestamp="2026-04-17 15:19:46 +0000 UTC" firstStartedPulling="2026-04-17 15:19:47.070464348 +0000 UTC m=+151.612129194" lastFinishedPulling="2026-04-17 15:19:48.14311205 +0000 UTC m=+152.684776895" observedRunningTime="2026-04-17 15:19:48.685068547 +0000 UTC m=+153.226733414" watchObservedRunningTime="2026-04-17 15:19:48.685246796 +0000 UTC m=+153.226911662"
Apr 17 15:19:49.676738 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:49.676665 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-89cc9" event={"ID":"19f0cf6c-7fd6-48bc-b7f5-6fbd50fd92e4","Type":"ContainerStarted","Data":"e469fda0cf7e33dfa8dfcff14c392049bfc513da2e86d2c23be69653ab603f8f"}
Apr 17 15:19:49.677078 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:49.676845 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-t5wmn"
Apr 17 15:19:49.681559 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:49.681529 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-t5wmn"
Apr 17 15:19:49.692891 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:49.692842 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-89cc9" podStartSLOduration=1.4912310579999999 podStartE2EDuration="3.692830031s" podCreationTimestamp="2026-04-17 15:19:46 +0000 UTC" firstStartedPulling="2026-04-17 15:19:47.176080321 +0000 UTC m=+151.717745166" lastFinishedPulling="2026-04-17 15:19:49.377679291 +0000 UTC m=+153.919344139" observedRunningTime="2026-04-17 15:19:49.691572472 +0000 UTC m=+154.233237338" watchObservedRunningTime="2026-04-17 15:19:49.692830031 +0000 UTC m=+154.234494898"
Apr 17 15:19:51.871544 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:19:51.871508 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-dl2zj" podUID="05352b16-4fb2-4f4e-894f-d69b17f92924"
Apr 17 15:19:51.876765 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:19:51.876742 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-2vpk9" podUID="4f1d2ee5-9f9b-4086-afee-0e043df76f02"
Apr 17 15:19:52.683230 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:52.683204 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dl2zj"
Apr 17 15:19:52.683411 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:52.683215 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-2vpk9"
Apr 17 15:19:53.087638 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:19:53.087609 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-j7zl6" podUID="4445020e-d73c-4a2d-9f40-1c3fc286490e"
Apr 17 15:19:54.538517 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.538490 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-mjkbk"]
Apr 17 15:19:54.575119 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.575089 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-vb2bb"]
Apr 17 15:19:54.575256 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.575176 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-mjkbk"
Apr 17 15:19:54.578712 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.578690 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 17 15:19:54.579643 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.579614 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 17 15:19:54.579747 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.579642 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 17 15:19:54.579747 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.579669 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 17 15:19:54.579747 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.579709 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-6bzn8\""
Apr 17 15:19:54.579881 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.579757 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 17 15:19:54.579998 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.579978 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 17 15:19:54.595307 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.595289 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-mjkbk"]
Apr 17 15:19:54.595394 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.595383 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-vb2bb"
Apr 17 15:19:54.598730 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.598713 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 17 15:19:54.598828 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.598760 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-95j6v\""
Apr 17 15:19:54.598828 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.598799 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 17 15:19:54.599151 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.599137 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 17 15:19:54.652406 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.652382 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0210dbd3-bf15-4240-a299-f959cf307c04-node-exporter-textfile\") pod \"node-exporter-vb2bb\" (UID: \"0210dbd3-bf15-4240-a299-f959cf307c04\") " pod="openshift-monitoring/node-exporter-vb2bb"
Apr 17 15:19:54.652509 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.652410 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/c4b76af8-7aae-4de0-be95-221109f82fb9-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-mjkbk\" (UID: \"c4b76af8-7aae-4de0-be95-221109f82fb9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mjkbk"
Apr 17 15:19:54.652509 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.652429 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0210dbd3-bf15-4240-a299-f959cf307c04-node-exporter-wtmp\") pod \"node-exporter-vb2bb\" (UID: \"0210dbd3-bf15-4240-a299-f959cf307c04\") " pod="openshift-monitoring/node-exporter-vb2bb"
Apr 17 15:19:54.652509 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.652445 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tkc4\" (UniqueName: \"kubernetes.io/projected/c4b76af8-7aae-4de0-be95-221109f82fb9-kube-api-access-7tkc4\") pod \"kube-state-metrics-69db897b98-mjkbk\" (UID: \"c4b76af8-7aae-4de0-be95-221109f82fb9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mjkbk"
Apr 17 15:19:54.652509 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.652469 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0210dbd3-bf15-4240-a299-f959cf307c04-node-exporter-tls\") pod \"node-exporter-vb2bb\" (UID: \"0210dbd3-bf15-4240-a299-f959cf307c04\") " pod="openshift-monitoring/node-exporter-vb2bb"
Apr 17 15:19:54.652699 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.652525 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0210dbd3-bf15-4240-a299-f959cf307c04-root\") pod \"node-exporter-vb2bb\" (UID: \"0210dbd3-bf15-4240-a299-f959cf307c04\") " pod="openshift-monitoring/node-exporter-vb2bb"
Apr 17 15:19:54.652699 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.652651 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0210dbd3-bf15-4240-a299-f959cf307c04-sys\") pod \"node-exporter-vb2bb\" (UID: \"0210dbd3-bf15-4240-a299-f959cf307c04\") " pod="openshift-monitoring/node-exporter-vb2bb"
Apr 17 15:19:54.652699 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.652693 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c4b76af8-7aae-4de0-be95-221109f82fb9-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-mjkbk\" (UID: \"c4b76af8-7aae-4de0-be95-221109f82fb9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mjkbk"
Apr 17 15:19:54.652811 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.652756 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c4b76af8-7aae-4de0-be95-221109f82fb9-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-mjkbk\" (UID: \"c4b76af8-7aae-4de0-be95-221109f82fb9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mjkbk"
Apr 17 15:19:54.652811 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.652781 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0210dbd3-bf15-4240-a299-f959cf307c04-node-exporter-accelerators-collector-config\") pod \"node-exporter-vb2bb\" (UID: \"0210dbd3-bf15-4240-a299-f959cf307c04\") " pod="openshift-monitoring/node-exporter-vb2bb"
Apr 17 15:19:54.652874 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.652817 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c4b76af8-7aae-4de0-be95-221109f82fb9-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-mjkbk\" (UID: \"c4b76af8-7aae-4de0-be95-221109f82fb9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mjkbk"
Apr 17 15:19:54.652874 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.652838 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/c4b76af8-7aae-4de0-be95-221109f82fb9-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-mjkbk\" (UID: \"c4b76af8-7aae-4de0-be95-221109f82fb9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mjkbk"
Apr 17 15:19:54.652874 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.652858 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0210dbd3-bf15-4240-a299-f959cf307c04-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-vb2bb\" (UID: \"0210dbd3-bf15-4240-a299-f959cf307c04\") " pod="openshift-monitoring/node-exporter-vb2bb"
Apr 17 15:19:54.652991 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.652886 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdg98\" (UniqueName: \"kubernetes.io/projected/0210dbd3-bf15-4240-a299-f959cf307c04-kube-api-access-jdg98\") pod \"node-exporter-vb2bb\" (UID: \"0210dbd3-bf15-4240-a299-f959cf307c04\") " pod="openshift-monitoring/node-exporter-vb2bb"
Apr 17 15:19:54.652991 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.652916 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0210dbd3-bf15-4240-a299-f959cf307c04-metrics-client-ca\") pod \"node-exporter-vb2bb\" (UID: \"0210dbd3-bf15-4240-a299-f959cf307c04\") " pod="openshift-monitoring/node-exporter-vb2bb"
Apr 17 15:19:54.753288 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.753266 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jdg98\" (UniqueName: \"kubernetes.io/projected/0210dbd3-bf15-4240-a299-f959cf307c04-kube-api-access-jdg98\") pod \"node-exporter-vb2bb\" (UID: \"0210dbd3-bf15-4240-a299-f959cf307c04\") " pod="openshift-monitoring/node-exporter-vb2bb"
Apr 17 15:19:54.753377 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.753295 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0210dbd3-bf15-4240-a299-f959cf307c04-metrics-client-ca\") pod \"node-exporter-vb2bb\" (UID: \"0210dbd3-bf15-4240-a299-f959cf307c04\") " pod="openshift-monitoring/node-exporter-vb2bb"
Apr 17 15:19:54.753377 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.753330 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0210dbd3-bf15-4240-a299-f959cf307c04-node-exporter-textfile\") pod \"node-exporter-vb2bb\" (UID: \"0210dbd3-bf15-4240-a299-f959cf307c04\") " pod="openshift-monitoring/node-exporter-vb2bb"
Apr 17 15:19:54.753377 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.753348 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/c4b76af8-7aae-4de0-be95-221109f82fb9-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-mjkbk\" (UID: \"c4b76af8-7aae-4de0-be95-221109f82fb9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mjkbk"
Apr 17 15:19:54.753377 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.753370 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0210dbd3-bf15-4240-a299-f959cf307c04-node-exporter-wtmp\") pod \"node-exporter-vb2bb\" (UID: \"0210dbd3-bf15-4240-a299-f959cf307c04\") " pod="openshift-monitoring/node-exporter-vb2bb"
Apr 17 15:19:54.753541 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.753388 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7tkc4\" (UniqueName: \"kubernetes.io/projected/c4b76af8-7aae-4de0-be95-221109f82fb9-kube-api-access-7tkc4\") pod \"kube-state-metrics-69db897b98-mjkbk\" (UID: \"c4b76af8-7aae-4de0-be95-221109f82fb9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mjkbk"
Apr 17 15:19:54.753541 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.753405 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0210dbd3-bf15-4240-a299-f959cf307c04-node-exporter-tls\") pod \"node-exporter-vb2bb\" (UID: \"0210dbd3-bf15-4240-a299-f959cf307c04\") " pod="openshift-monitoring/node-exporter-vb2bb"
Apr 17 15:19:54.753541 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.753425 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0210dbd3-bf15-4240-a299-f959cf307c04-root\") pod \"node-exporter-vb2bb\" (UID: \"0210dbd3-bf15-4240-a299-f959cf307c04\") " pod="openshift-monitoring/node-exporter-vb2bb"
Apr 17 15:19:54.753541 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.753458 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0210dbd3-bf15-4240-a299-f959cf307c04-sys\") pod \"node-exporter-vb2bb\" (UID: \"0210dbd3-bf15-4240-a299-f959cf307c04\") " pod="openshift-monitoring/node-exporter-vb2bb"
Apr 17 15:19:54.753541 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.753508 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0210dbd3-bf15-4240-a299-f959cf307c04-sys\") pod \"node-exporter-vb2bb\" (UID: \"0210dbd3-bf15-4240-a299-f959cf307c04\") " pod="openshift-monitoring/node-exporter-vb2bb"
Apr 17 15:19:54.753782 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.753553 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0210dbd3-bf15-4240-a299-f959cf307c04-root\") pod \"node-exporter-vb2bb\" (UID: \"0210dbd3-bf15-4240-a299-f959cf307c04\") " pod="openshift-monitoring/node-exporter-vb2bb"
Apr 17 15:19:54.753782 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.753556 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0210dbd3-bf15-4240-a299-f959cf307c04-node-exporter-wtmp\") pod \"node-exporter-vb2bb\" (UID: \"0210dbd3-bf15-4240-a299-f959cf307c04\") " pod="openshift-monitoring/node-exporter-vb2bb"
Apr 17 15:19:54.753782 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.753621 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c4b76af8-7aae-4de0-be95-221109f82fb9-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-mjkbk\" (UID: \"c4b76af8-7aae-4de0-be95-221109f82fb9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mjkbk"
Apr 17 15:19:54.753782 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.753668 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c4b76af8-7aae-4de0-be95-221109f82fb9-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-mjkbk\" (UID: \"c4b76af8-7aae-4de0-be95-221109f82fb9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mjkbk"
Apr 17 15:19:54.753782 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.753699 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0210dbd3-bf15-4240-a299-f959cf307c04-node-exporter-accelerators-collector-config\") pod \"node-exporter-vb2bb\" (UID: \"0210dbd3-bf15-4240-a299-f959cf307c04\") " pod="openshift-monitoring/node-exporter-vb2bb"
Apr 17 15:19:54.753782 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.753735 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c4b76af8-7aae-4de0-be95-221109f82fb9-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-mjkbk\" (UID: \"c4b76af8-7aae-4de0-be95-221109f82fb9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mjkbk"
Apr 17 15:19:54.753782 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.753741 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0210dbd3-bf15-4240-a299-f959cf307c04-node-exporter-textfile\") pod \"node-exporter-vb2bb\" (UID: \"0210dbd3-bf15-4240-a299-f959cf307c04\") " pod="openshift-monitoring/node-exporter-vb2bb"
Apr 17 15:19:54.753782 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.753771 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/c4b76af8-7aae-4de0-be95-221109f82fb9-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-mjkbk\" (UID: \"c4b76af8-7aae-4de0-be95-221109f82fb9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mjkbk"
Apr 17 15:19:54.754201 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.753788 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/c4b76af8-7aae-4de0-be95-221109f82fb9-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-mjkbk\" (UID: \"c4b76af8-7aae-4de0-be95-221109f82fb9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mjkbk"
Apr 17 15:19:54.754201 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.753806 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0210dbd3-bf15-4240-a299-f959cf307c04-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-vb2bb\" (UID: \"0210dbd3-bf15-4240-a299-f959cf307c04\") " pod="openshift-monitoring/node-exporter-vb2bb"
Apr 17 15:19:54.754201 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:19:54.753819 2577 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found
Apr 17 15:19:54.754201 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:19:54.753883 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c4b76af8-7aae-4de0-be95-221109f82fb9-kube-state-metrics-tls podName:c4b76af8-7aae-4de0-be95-221109f82fb9 nodeName:}" failed. No retries permitted until 2026-04-17 15:19:55.253866795 +0000 UTC m=+159.795531639 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/c4b76af8-7aae-4de0-be95-221109f82fb9-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-mjkbk" (UID: "c4b76af8-7aae-4de0-be95-221109f82fb9") : secret "kube-state-metrics-tls" not found
Apr 17 15:19:54.754201 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.753932 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0210dbd3-bf15-4240-a299-f959cf307c04-metrics-client-ca\") pod \"node-exporter-vb2bb\" (UID: \"0210dbd3-bf15-4240-a299-f959cf307c04\") " pod="openshift-monitoring/node-exporter-vb2bb"
Apr 17 15:19:54.754620 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.754360 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0210dbd3-bf15-4240-a299-f959cf307c04-node-exporter-accelerators-collector-config\") pod \"node-exporter-vb2bb\" (UID: \"0210dbd3-bf15-4240-a299-f959cf307c04\") " pod="openshift-monitoring/node-exporter-vb2bb"
Apr 17 15:19:54.754620 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.754444 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c4b76af8-7aae-4de0-be95-221109f82fb9-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-mjkbk\" (UID: \"c4b76af8-7aae-4de0-be95-221109f82fb9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mjkbk"
Apr 17 15:19:54.754786 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.754755 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/c4b76af8-7aae-4de0-be95-221109f82fb9-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-mjkbk\" (UID: \"c4b76af8-7aae-4de0-be95-221109f82fb9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mjkbk"
Apr 17 15:19:54.756135 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.756111 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0210dbd3-bf15-4240-a299-f959cf307c04-node-exporter-tls\") pod \"node-exporter-vb2bb\" (UID: \"0210dbd3-bf15-4240-a299-f959cf307c04\") " pod="openshift-monitoring/node-exporter-vb2bb"
Apr 17 15:19:54.756234 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.756217 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0210dbd3-bf15-4240-a299-f959cf307c04-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-vb2bb\" (UID: \"0210dbd3-bf15-4240-a299-f959cf307c04\") " pod="openshift-monitoring/node-exporter-vb2bb"
Apr 17 15:19:54.756284 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.756268 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume
\"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c4b76af8-7aae-4de0-be95-221109f82fb9-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-mjkbk\" (UID: \"c4b76af8-7aae-4de0-be95-221109f82fb9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mjkbk" Apr 17 15:19:54.766880 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.766859 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tkc4\" (UniqueName: \"kubernetes.io/projected/c4b76af8-7aae-4de0-be95-221109f82fb9-kube-api-access-7tkc4\") pod \"kube-state-metrics-69db897b98-mjkbk\" (UID: \"c4b76af8-7aae-4de0-be95-221109f82fb9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mjkbk" Apr 17 15:19:54.767016 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.766999 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdg98\" (UniqueName: \"kubernetes.io/projected/0210dbd3-bf15-4240-a299-f959cf307c04-kube-api-access-jdg98\") pod \"node-exporter-vb2bb\" (UID: \"0210dbd3-bf15-4240-a299-f959cf307c04\") " pod="openshift-monitoring/node-exporter-vb2bb" Apr 17 15:19:54.903762 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:54.903729 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-vb2bb" Apr 17 15:19:54.911748 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:19:54.911727 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0210dbd3_bf15_4240_a299_f959cf307c04.slice/crio-0731f68a7b61d17ac159d34d858162fbb336396910c083d80e8ff4abc2d41806 WatchSource:0}: Error finding container 0731f68a7b61d17ac159d34d858162fbb336396910c083d80e8ff4abc2d41806: Status 404 returned error can't find the container with id 0731f68a7b61d17ac159d34d858162fbb336396910c083d80e8ff4abc2d41806 Apr 17 15:19:55.257358 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:55.257300 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c4b76af8-7aae-4de0-be95-221109f82fb9-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-mjkbk\" (UID: \"c4b76af8-7aae-4de0-be95-221109f82fb9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mjkbk" Apr 17 15:19:55.259800 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:55.259769 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c4b76af8-7aae-4de0-be95-221109f82fb9-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-mjkbk\" (UID: \"c4b76af8-7aae-4de0-be95-221109f82fb9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mjkbk" Apr 17 15:19:55.484855 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:55.484830 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-mjkbk" Apr 17 15:19:55.622851 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:55.622814 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-mjkbk"] Apr 17 15:19:55.694247 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:55.694213 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vb2bb" event={"ID":"0210dbd3-bf15-4240-a299-f959cf307c04","Type":"ContainerStarted","Data":"0731f68a7b61d17ac159d34d858162fbb336396910c083d80e8ff4abc2d41806"} Apr 17 15:19:55.703541 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:19:55.703518 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4b76af8_7aae_4de0_be95_221109f82fb9.slice/crio-3a280f838041b03f2a4654974fbe6a1c28139152791210231dd77d48173c1498 WatchSource:0}: Error finding container 3a280f838041b03f2a4654974fbe6a1c28139152791210231dd77d48173c1498: Status 404 returned error can't find the container with id 3a280f838041b03f2a4654974fbe6a1c28139152791210231dd77d48173c1498 Apr 17 15:19:56.698313 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:56.698278 2577 generic.go:358] "Generic (PLEG): container finished" podID="0210dbd3-bf15-4240-a299-f959cf307c04" containerID="0cf24035dc9c7e28e8e4143cfc2eae0b9ddc2a6f0d215cb6bd6b97fc48792812" exitCode=0 Apr 17 15:19:56.698739 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:56.698347 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vb2bb" event={"ID":"0210dbd3-bf15-4240-a299-f959cf307c04","Type":"ContainerDied","Data":"0cf24035dc9c7e28e8e4143cfc2eae0b9ddc2a6f0d215cb6bd6b97fc48792812"} Apr 17 15:19:56.699617 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:56.699588 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-mjkbk" 
event={"ID":"c4b76af8-7aae-4de0-be95-221109f82fb9","Type":"ContainerStarted","Data":"3a280f838041b03f2a4654974fbe6a1c28139152791210231dd77d48173c1498"} Apr 17 15:19:56.767890 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:56.767858 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4f1d2ee5-9f9b-4086-afee-0e043df76f02-metrics-tls\") pod \"dns-default-2vpk9\" (UID: \"4f1d2ee5-9f9b-4086-afee-0e043df76f02\") " pod="openshift-dns/dns-default-2vpk9" Apr 17 15:19:56.768016 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:56.767945 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05352b16-4fb2-4f4e-894f-d69b17f92924-cert\") pod \"ingress-canary-dl2zj\" (UID: \"05352b16-4fb2-4f4e-894f-d69b17f92924\") " pod="openshift-ingress-canary/ingress-canary-dl2zj" Apr 17 15:19:56.770802 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:56.770755 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4f1d2ee5-9f9b-4086-afee-0e043df76f02-metrics-tls\") pod \"dns-default-2vpk9\" (UID: \"4f1d2ee5-9f9b-4086-afee-0e043df76f02\") " pod="openshift-dns/dns-default-2vpk9" Apr 17 15:19:56.770922 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:56.770802 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05352b16-4fb2-4f4e-894f-d69b17f92924-cert\") pod \"ingress-canary-dl2zj\" (UID: \"05352b16-4fb2-4f4e-894f-d69b17f92924\") " pod="openshift-ingress-canary/ingress-canary-dl2zj" Apr 17 15:19:56.886298 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:56.886270 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-p8skx\"" Apr 17 15:19:56.886423 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:56.886269 2577 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-wwl79\"" Apr 17 15:19:56.895098 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:56.895076 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dl2zj" Apr 17 15:19:56.895204 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:56.895097 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-2vpk9" Apr 17 15:19:57.036482 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:57.036453 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-2vpk9"] Apr 17 15:19:57.041235 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:19:57.041207 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f1d2ee5_9f9b_4086_afee_0e043df76f02.slice/crio-42792e08fe2ff8a28e6d9d8764b7587161a7099ff0eb6de7fed5115149e40c68 WatchSource:0}: Error finding container 42792e08fe2ff8a28e6d9d8764b7587161a7099ff0eb6de7fed5115149e40c68: Status 404 returned error can't find the container with id 42792e08fe2ff8a28e6d9d8764b7587161a7099ff0eb6de7fed5115149e40c68 Apr 17 15:19:57.053995 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:57.053970 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dl2zj"] Apr 17 15:19:57.057393 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:19:57.057362 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05352b16_4fb2_4f4e_894f_d69b17f92924.slice/crio-cddd3c4c1d651bf49e04ac7f1819a34b048a8ec2ad4841bc24bf305a336786a9 WatchSource:0}: Error finding container cddd3c4c1d651bf49e04ac7f1819a34b048a8ec2ad4841bc24bf305a336786a9: Status 404 returned error can't find the container with id cddd3c4c1d651bf49e04ac7f1819a34b048a8ec2ad4841bc24bf305a336786a9 Apr 17 15:19:57.518594 
ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:57.518556 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-69676d6c77-k8jwr"] Apr 17 15:19:57.522064 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:57.522027 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-69676d6c77-k8jwr" Apr 17 15:19:57.524551 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:57.524525 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-6rc9k\"" Apr 17 15:19:57.524680 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:57.524531 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-7aev1v63jahuv\"" Apr 17 15:19:57.524680 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:57.524631 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 17 15:19:57.524680 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:57.524531 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 17 15:19:57.524680 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:57.524669 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 17 15:19:57.524896 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:57.524766 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 17 15:19:57.524896 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:57.524769 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 17 15:19:57.533630 ip-10-0-131-29 kubenswrapper[2577]: 
I0417 15:19:57.533608 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-69676d6c77-k8jwr"] Apr 17 15:19:57.573871 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:57.573843 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/528a42f2-ecbc-4039-ab6e-92181c86c155-secret-grpc-tls\") pod \"thanos-querier-69676d6c77-k8jwr\" (UID: \"528a42f2-ecbc-4039-ab6e-92181c86c155\") " pod="openshift-monitoring/thanos-querier-69676d6c77-k8jwr" Apr 17 15:19:57.573871 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:57.573891 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkxzl\" (UniqueName: \"kubernetes.io/projected/528a42f2-ecbc-4039-ab6e-92181c86c155-kube-api-access-xkxzl\") pod \"thanos-querier-69676d6c77-k8jwr\" (UID: \"528a42f2-ecbc-4039-ab6e-92181c86c155\") " pod="openshift-monitoring/thanos-querier-69676d6c77-k8jwr" Apr 17 15:19:57.574105 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:57.573925 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/528a42f2-ecbc-4039-ab6e-92181c86c155-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-69676d6c77-k8jwr\" (UID: \"528a42f2-ecbc-4039-ab6e-92181c86c155\") " pod="openshift-monitoring/thanos-querier-69676d6c77-k8jwr" Apr 17 15:19:57.574105 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:57.573993 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/528a42f2-ecbc-4039-ab6e-92181c86c155-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-69676d6c77-k8jwr\" (UID: \"528a42f2-ecbc-4039-ab6e-92181c86c155\") " 
pod="openshift-monitoring/thanos-querier-69676d6c77-k8jwr" Apr 17 15:19:57.574105 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:57.574024 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/528a42f2-ecbc-4039-ab6e-92181c86c155-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-69676d6c77-k8jwr\" (UID: \"528a42f2-ecbc-4039-ab6e-92181c86c155\") " pod="openshift-monitoring/thanos-querier-69676d6c77-k8jwr" Apr 17 15:19:57.574292 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:57.574111 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/528a42f2-ecbc-4039-ab6e-92181c86c155-secret-thanos-querier-tls\") pod \"thanos-querier-69676d6c77-k8jwr\" (UID: \"528a42f2-ecbc-4039-ab6e-92181c86c155\") " pod="openshift-monitoring/thanos-querier-69676d6c77-k8jwr" Apr 17 15:19:57.574292 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:57.574157 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/528a42f2-ecbc-4039-ab6e-92181c86c155-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-69676d6c77-k8jwr\" (UID: \"528a42f2-ecbc-4039-ab6e-92181c86c155\") " pod="openshift-monitoring/thanos-querier-69676d6c77-k8jwr" Apr 17 15:19:57.574292 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:57.574203 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/528a42f2-ecbc-4039-ab6e-92181c86c155-metrics-client-ca\") pod \"thanos-querier-69676d6c77-k8jwr\" (UID: \"528a42f2-ecbc-4039-ab6e-92181c86c155\") " pod="openshift-monitoring/thanos-querier-69676d6c77-k8jwr" Apr 17 15:19:57.674907 ip-10-0-131-29 
kubenswrapper[2577]: I0417 15:19:57.674878 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/528a42f2-ecbc-4039-ab6e-92181c86c155-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-69676d6c77-k8jwr\" (UID: \"528a42f2-ecbc-4039-ab6e-92181c86c155\") " pod="openshift-monitoring/thanos-querier-69676d6c77-k8jwr" Apr 17 15:19:57.675060 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:57.674924 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/528a42f2-ecbc-4039-ab6e-92181c86c155-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-69676d6c77-k8jwr\" (UID: \"528a42f2-ecbc-4039-ab6e-92181c86c155\") " pod="openshift-monitoring/thanos-querier-69676d6c77-k8jwr" Apr 17 15:19:57.675060 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:57.674960 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/528a42f2-ecbc-4039-ab6e-92181c86c155-secret-thanos-querier-tls\") pod \"thanos-querier-69676d6c77-k8jwr\" (UID: \"528a42f2-ecbc-4039-ab6e-92181c86c155\") " pod="openshift-monitoring/thanos-querier-69676d6c77-k8jwr" Apr 17 15:19:57.675060 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:57.674981 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/528a42f2-ecbc-4039-ab6e-92181c86c155-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-69676d6c77-k8jwr\" (UID: \"528a42f2-ecbc-4039-ab6e-92181c86c155\") " pod="openshift-monitoring/thanos-querier-69676d6c77-k8jwr" Apr 17 15:19:57.675060 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:57.675012 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/528a42f2-ecbc-4039-ab6e-92181c86c155-metrics-client-ca\") pod \"thanos-querier-69676d6c77-k8jwr\" (UID: \"528a42f2-ecbc-4039-ab6e-92181c86c155\") " pod="openshift-monitoring/thanos-querier-69676d6c77-k8jwr" Apr 17 15:19:57.675291 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:57.675076 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/528a42f2-ecbc-4039-ab6e-92181c86c155-secret-grpc-tls\") pod \"thanos-querier-69676d6c77-k8jwr\" (UID: \"528a42f2-ecbc-4039-ab6e-92181c86c155\") " pod="openshift-monitoring/thanos-querier-69676d6c77-k8jwr" Apr 17 15:19:57.675291 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:57.675095 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xkxzl\" (UniqueName: \"kubernetes.io/projected/528a42f2-ecbc-4039-ab6e-92181c86c155-kube-api-access-xkxzl\") pod \"thanos-querier-69676d6c77-k8jwr\" (UID: \"528a42f2-ecbc-4039-ab6e-92181c86c155\") " pod="openshift-monitoring/thanos-querier-69676d6c77-k8jwr" Apr 17 15:19:57.675291 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:57.675112 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/528a42f2-ecbc-4039-ab6e-92181c86c155-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-69676d6c77-k8jwr\" (UID: \"528a42f2-ecbc-4039-ab6e-92181c86c155\") " pod="openshift-monitoring/thanos-querier-69676d6c77-k8jwr" Apr 17 15:19:57.676861 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:57.676834 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/528a42f2-ecbc-4039-ab6e-92181c86c155-metrics-client-ca\") pod \"thanos-querier-69676d6c77-k8jwr\" (UID: \"528a42f2-ecbc-4039-ab6e-92181c86c155\") " 
pod="openshift-monitoring/thanos-querier-69676d6c77-k8jwr" Apr 17 15:19:57.678467 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:57.678441 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/528a42f2-ecbc-4039-ab6e-92181c86c155-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-69676d6c77-k8jwr\" (UID: \"528a42f2-ecbc-4039-ab6e-92181c86c155\") " pod="openshift-monitoring/thanos-querier-69676d6c77-k8jwr" Apr 17 15:19:57.678566 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:57.678489 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/528a42f2-ecbc-4039-ab6e-92181c86c155-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-69676d6c77-k8jwr\" (UID: \"528a42f2-ecbc-4039-ab6e-92181c86c155\") " pod="openshift-monitoring/thanos-querier-69676d6c77-k8jwr" Apr 17 15:19:57.678871 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:57.678842 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/528a42f2-ecbc-4039-ab6e-92181c86c155-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-69676d6c77-k8jwr\" (UID: \"528a42f2-ecbc-4039-ab6e-92181c86c155\") " pod="openshift-monitoring/thanos-querier-69676d6c77-k8jwr" Apr 17 15:19:57.679126 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:57.679080 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/528a42f2-ecbc-4039-ab6e-92181c86c155-secret-thanos-querier-tls\") pod \"thanos-querier-69676d6c77-k8jwr\" (UID: \"528a42f2-ecbc-4039-ab6e-92181c86c155\") " pod="openshift-monitoring/thanos-querier-69676d6c77-k8jwr" Apr 17 15:19:57.679225 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:57.679143 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/528a42f2-ecbc-4039-ab6e-92181c86c155-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-69676d6c77-k8jwr\" (UID: \"528a42f2-ecbc-4039-ab6e-92181c86c155\") " pod="openshift-monitoring/thanos-querier-69676d6c77-k8jwr" Apr 17 15:19:57.680518 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:57.680495 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/528a42f2-ecbc-4039-ab6e-92181c86c155-secret-grpc-tls\") pod \"thanos-querier-69676d6c77-k8jwr\" (UID: \"528a42f2-ecbc-4039-ab6e-92181c86c155\") " pod="openshift-monitoring/thanos-querier-69676d6c77-k8jwr" Apr 17 15:19:57.684112 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:57.684090 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkxzl\" (UniqueName: \"kubernetes.io/projected/528a42f2-ecbc-4039-ab6e-92181c86c155-kube-api-access-xkxzl\") pod \"thanos-querier-69676d6c77-k8jwr\" (UID: \"528a42f2-ecbc-4039-ab6e-92181c86c155\") " pod="openshift-monitoring/thanos-querier-69676d6c77-k8jwr" Apr 17 15:19:57.704179 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:57.704119 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dl2zj" event={"ID":"05352b16-4fb2-4f4e-894f-d69b17f92924","Type":"ContainerStarted","Data":"cddd3c4c1d651bf49e04ac7f1819a34b048a8ec2ad4841bc24bf305a336786a9"} Apr 17 15:19:57.705964 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:57.705919 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2vpk9" event={"ID":"4f1d2ee5-9f9b-4086-afee-0e043df76f02","Type":"ContainerStarted","Data":"42792e08fe2ff8a28e6d9d8764b7587161a7099ff0eb6de7fed5115149e40c68"} Apr 17 15:19:57.709690 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:57.708894 2577 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vb2bb" event={"ID":"0210dbd3-bf15-4240-a299-f959cf307c04","Type":"ContainerStarted","Data":"3dbb8a96c40c298813586c82e0287078e7566cf2dbbe7869754d2f9857894a9d"} Apr 17 15:19:57.709690 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:57.708923 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vb2bb" event={"ID":"0210dbd3-bf15-4240-a299-f959cf307c04","Type":"ContainerStarted","Data":"36d575a38c91ba71dfcf59c0035e0a8dfa72f63c3da5b31234e0ae795d6f12ba"} Apr 17 15:19:57.712408 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:57.712386 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-mjkbk" event={"ID":"c4b76af8-7aae-4de0-be95-221109f82fb9","Type":"ContainerStarted","Data":"e48894ca02384c66879887052d1a5f9b17df7fcb888cb7a9bb72a74639216751"} Apr 17 15:19:57.712518 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:57.712418 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-mjkbk" event={"ID":"c4b76af8-7aae-4de0-be95-221109f82fb9","Type":"ContainerStarted","Data":"789b17c68c946570579b8c78c2ee33f9a7284e531611144d4158e88baaf7a413"} Apr 17 15:19:57.712518 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:57.712433 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-mjkbk" event={"ID":"c4b76af8-7aae-4de0-be95-221109f82fb9","Type":"ContainerStarted","Data":"7976e1f4b4ac7b360b75556ba60f39712a339c7be1029b05023ee03e92bcb01f"} Apr 17 15:19:57.725767 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:57.725710 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-vb2bb" podStartSLOduration=2.892458884 podStartE2EDuration="3.725695676s" podCreationTimestamp="2026-04-17 15:19:54 +0000 UTC" firstStartedPulling="2026-04-17 15:19:54.91336822 +0000 
UTC m=+159.455033065" lastFinishedPulling="2026-04-17 15:19:55.746605007 +0000 UTC m=+160.288269857" observedRunningTime="2026-04-17 15:19:57.725170901 +0000 UTC m=+162.266835783" watchObservedRunningTime="2026-04-17 15:19:57.725695676 +0000 UTC m=+162.267360543" Apr 17 15:19:57.742158 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:57.742115 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-mjkbk" podStartSLOduration=2.491678591 podStartE2EDuration="3.7421012s" podCreationTimestamp="2026-04-17 15:19:54 +0000 UTC" firstStartedPulling="2026-04-17 15:19:55.705227717 +0000 UTC m=+160.246892562" lastFinishedPulling="2026-04-17 15:19:56.955650309 +0000 UTC m=+161.497315171" observedRunningTime="2026-04-17 15:19:57.741853925 +0000 UTC m=+162.283518788" watchObservedRunningTime="2026-04-17 15:19:57.7421012 +0000 UTC m=+162.283766068" Apr 17 15:19:57.834424 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:57.834394 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-69676d6c77-k8jwr"
Apr 17 15:19:57.987391 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:57.987326 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-69676d6c77-k8jwr"]
Apr 17 15:19:58.058405 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:58.058373 2577 scope.go:117] "RemoveContainer" containerID="946a3196d08658242b10d5b451d1a8cae7e6349670453fb64834c15773de0191"
Apr 17 15:19:58.058743 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:19:58.058719 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-fdxgz_openshift-console-operator(85072088-8af2-4219-80f7-6a18460c13cf)\"" pod="openshift-console-operator/console-operator-9d4b6777b-fdxgz" podUID="85072088-8af2-4219-80f7-6a18460c13cf"
Apr 17 15:19:58.716767 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:58.716720 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-69676d6c77-k8jwr" event={"ID":"528a42f2-ecbc-4039-ab6e-92181c86c155","Type":"ContainerStarted","Data":"0263a8fbef34a25d4a1ddc9ef526e894378904b32748ddfdf6e10589b6b3699b"}
Apr 17 15:19:58.843387 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:58.843348 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-7f6d6b796-8d99r"]
Apr 17 15:19:58.846733 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:58.846711 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-7f6d6b796-8d99r"
Apr 17 15:19:58.848866 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:58.848845 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 17 15:19:58.849150 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:58.849133 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 17 15:19:58.849849 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:58.849822 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 17 15:19:58.850000 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:58.849982 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-5uq2ef198a7ur\""
Apr 17 15:19:58.850104 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:58.849988 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 17 15:19:58.850104 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:58.850088 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-55gjx\""
Apr 17 15:19:58.856919 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:58.856898 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7f6d6b796-8d99r"]
Apr 17 15:19:58.886210 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:58.886163 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eaafa6a6-9e9c-4379-ad58-cfaf54d8ccc8-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7f6d6b796-8d99r\" (UID: \"eaafa6a6-9e9c-4379-ad58-cfaf54d8ccc8\") " pod="openshift-monitoring/metrics-server-7f6d6b796-8d99r"
Apr 17 15:19:58.886210 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:58.886210 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/eaafa6a6-9e9c-4379-ad58-cfaf54d8ccc8-audit-log\") pod \"metrics-server-7f6d6b796-8d99r\" (UID: \"eaafa6a6-9e9c-4379-ad58-cfaf54d8ccc8\") " pod="openshift-monitoring/metrics-server-7f6d6b796-8d99r"
Apr 17 15:19:58.886440 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:58.886306 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/eaafa6a6-9e9c-4379-ad58-cfaf54d8ccc8-secret-metrics-server-client-certs\") pod \"metrics-server-7f6d6b796-8d99r\" (UID: \"eaafa6a6-9e9c-4379-ad58-cfaf54d8ccc8\") " pod="openshift-monitoring/metrics-server-7f6d6b796-8d99r"
Apr 17 15:19:58.886440 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:58.886386 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/eaafa6a6-9e9c-4379-ad58-cfaf54d8ccc8-secret-metrics-server-tls\") pod \"metrics-server-7f6d6b796-8d99r\" (UID: \"eaafa6a6-9e9c-4379-ad58-cfaf54d8ccc8\") " pod="openshift-monitoring/metrics-server-7f6d6b796-8d99r"
Apr 17 15:19:58.886440 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:58.886426 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/eaafa6a6-9e9c-4379-ad58-cfaf54d8ccc8-metrics-server-audit-profiles\") pod \"metrics-server-7f6d6b796-8d99r\" (UID: \"eaafa6a6-9e9c-4379-ad58-cfaf54d8ccc8\") " pod="openshift-monitoring/metrics-server-7f6d6b796-8d99r"
Apr 17 15:19:58.886598 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:58.886473 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k486t\" (UniqueName: \"kubernetes.io/projected/eaafa6a6-9e9c-4379-ad58-cfaf54d8ccc8-kube-api-access-k486t\") pod \"metrics-server-7f6d6b796-8d99r\" (UID: \"eaafa6a6-9e9c-4379-ad58-cfaf54d8ccc8\") " pod="openshift-monitoring/metrics-server-7f6d6b796-8d99r"
Apr 17 15:19:58.886598 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:58.886531 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaafa6a6-9e9c-4379-ad58-cfaf54d8ccc8-client-ca-bundle\") pod \"metrics-server-7f6d6b796-8d99r\" (UID: \"eaafa6a6-9e9c-4379-ad58-cfaf54d8ccc8\") " pod="openshift-monitoring/metrics-server-7f6d6b796-8d99r"
Apr 17 15:19:58.987613 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:58.987544 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k486t\" (UniqueName: \"kubernetes.io/projected/eaafa6a6-9e9c-4379-ad58-cfaf54d8ccc8-kube-api-access-k486t\") pod \"metrics-server-7f6d6b796-8d99r\" (UID: \"eaafa6a6-9e9c-4379-ad58-cfaf54d8ccc8\") " pod="openshift-monitoring/metrics-server-7f6d6b796-8d99r"
Apr 17 15:19:58.987613 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:58.987598 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaafa6a6-9e9c-4379-ad58-cfaf54d8ccc8-client-ca-bundle\") pod \"metrics-server-7f6d6b796-8d99r\" (UID: \"eaafa6a6-9e9c-4379-ad58-cfaf54d8ccc8\") " pod="openshift-monitoring/metrics-server-7f6d6b796-8d99r"
Apr 17 15:19:58.987789 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:58.987645 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eaafa6a6-9e9c-4379-ad58-cfaf54d8ccc8-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7f6d6b796-8d99r\" (UID: \"eaafa6a6-9e9c-4379-ad58-cfaf54d8ccc8\") " pod="openshift-monitoring/metrics-server-7f6d6b796-8d99r"
Apr 17 15:19:58.987789 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:58.987663 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/eaafa6a6-9e9c-4379-ad58-cfaf54d8ccc8-audit-log\") pod \"metrics-server-7f6d6b796-8d99r\" (UID: \"eaafa6a6-9e9c-4379-ad58-cfaf54d8ccc8\") " pod="openshift-monitoring/metrics-server-7f6d6b796-8d99r"
Apr 17 15:19:58.987789 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:58.987721 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/eaafa6a6-9e9c-4379-ad58-cfaf54d8ccc8-secret-metrics-server-client-certs\") pod \"metrics-server-7f6d6b796-8d99r\" (UID: \"eaafa6a6-9e9c-4379-ad58-cfaf54d8ccc8\") " pod="openshift-monitoring/metrics-server-7f6d6b796-8d99r"
Apr 17 15:19:58.987789 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:58.987742 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/eaafa6a6-9e9c-4379-ad58-cfaf54d8ccc8-secret-metrics-server-tls\") pod \"metrics-server-7f6d6b796-8d99r\" (UID: \"eaafa6a6-9e9c-4379-ad58-cfaf54d8ccc8\") " pod="openshift-monitoring/metrics-server-7f6d6b796-8d99r"
Apr 17 15:19:58.987789 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:58.987761 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/eaafa6a6-9e9c-4379-ad58-cfaf54d8ccc8-metrics-server-audit-profiles\") pod \"metrics-server-7f6d6b796-8d99r\" (UID: \"eaafa6a6-9e9c-4379-ad58-cfaf54d8ccc8\") " pod="openshift-monitoring/metrics-server-7f6d6b796-8d99r"
Apr 17 15:19:58.988272 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:58.988241 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/eaafa6a6-9e9c-4379-ad58-cfaf54d8ccc8-audit-log\") pod \"metrics-server-7f6d6b796-8d99r\" (UID: \"eaafa6a6-9e9c-4379-ad58-cfaf54d8ccc8\") " pod="openshift-monitoring/metrics-server-7f6d6b796-8d99r"
Apr 17 15:19:58.989068 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:58.988396 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eaafa6a6-9e9c-4379-ad58-cfaf54d8ccc8-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7f6d6b796-8d99r\" (UID: \"eaafa6a6-9e9c-4379-ad58-cfaf54d8ccc8\") " pod="openshift-monitoring/metrics-server-7f6d6b796-8d99r"
Apr 17 15:19:58.989441 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:58.989417 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/eaafa6a6-9e9c-4379-ad58-cfaf54d8ccc8-metrics-server-audit-profiles\") pod \"metrics-server-7f6d6b796-8d99r\" (UID: \"eaafa6a6-9e9c-4379-ad58-cfaf54d8ccc8\") " pod="openshift-monitoring/metrics-server-7f6d6b796-8d99r"
Apr 17 15:19:58.990277 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:58.990248 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaafa6a6-9e9c-4379-ad58-cfaf54d8ccc8-client-ca-bundle\") pod \"metrics-server-7f6d6b796-8d99r\" (UID: \"eaafa6a6-9e9c-4379-ad58-cfaf54d8ccc8\") " pod="openshift-monitoring/metrics-server-7f6d6b796-8d99r"
Apr 17 15:19:58.990460 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:58.990442 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/eaafa6a6-9e9c-4379-ad58-cfaf54d8ccc8-secret-metrics-server-tls\") pod \"metrics-server-7f6d6b796-8d99r\" (UID: \"eaafa6a6-9e9c-4379-ad58-cfaf54d8ccc8\") " pod="openshift-monitoring/metrics-server-7f6d6b796-8d99r"
Apr 17 15:19:58.990559 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:58.990541 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/eaafa6a6-9e9c-4379-ad58-cfaf54d8ccc8-secret-metrics-server-client-certs\") pod \"metrics-server-7f6d6b796-8d99r\" (UID: \"eaafa6a6-9e9c-4379-ad58-cfaf54d8ccc8\") " pod="openshift-monitoring/metrics-server-7f6d6b796-8d99r"
Apr 17 15:19:58.994867 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:58.994845 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k486t\" (UniqueName: \"kubernetes.io/projected/eaafa6a6-9e9c-4379-ad58-cfaf54d8ccc8-kube-api-access-k486t\") pod \"metrics-server-7f6d6b796-8d99r\" (UID: \"eaafa6a6-9e9c-4379-ad58-cfaf54d8ccc8\") " pod="openshift-monitoring/metrics-server-7f6d6b796-8d99r"
Apr 17 15:19:59.158924 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:59.158895 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-7f6d6b796-8d99r"
Apr 17 15:19:59.334859 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:59.334831 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7f6d6b796-8d99r"]
Apr 17 15:19:59.339902 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:19:59.339866 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeaafa6a6_9e9c_4379_ad58_cfaf54d8ccc8.slice/crio-e4d1468bb869d4883edfb4e7abe8ffcbee4c2bd67bc4b160dfa1665f72f10b36 WatchSource:0}: Error finding container e4d1468bb869d4883edfb4e7abe8ffcbee4c2bd67bc4b160dfa1665f72f10b36: Status 404 returned error can't find the container with id e4d1468bb869d4883edfb4e7abe8ffcbee4c2bd67bc4b160dfa1665f72f10b36
Apr 17 15:19:59.722831 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:59.722791 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dl2zj" event={"ID":"05352b16-4fb2-4f4e-894f-d69b17f92924","Type":"ContainerStarted","Data":"ff2f901f6ddf2eae7e542e201ddb7963504ec9e661e8f57ce79d0d19ecb33a71"}
Apr 17 15:19:59.724602 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:59.724570 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2vpk9" event={"ID":"4f1d2ee5-9f9b-4086-afee-0e043df76f02","Type":"ContainerStarted","Data":"2fa06a8364c31835774b8b37f4426eb893cb8ecfe9759071a7e84dab521b675f"}
Apr 17 15:19:59.724720 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:59.724610 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2vpk9" event={"ID":"4f1d2ee5-9f9b-4086-afee-0e043df76f02","Type":"ContainerStarted","Data":"3cbfd7400e4057e86caec0376ce7afaceb6768095ef80d65d71c97f623b95d1d"}
Apr 17 15:19:59.724720 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:59.724672 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-2vpk9"
Apr 17 15:19:59.725699 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:59.725676 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7f6d6b796-8d99r" event={"ID":"eaafa6a6-9e9c-4379-ad58-cfaf54d8ccc8","Type":"ContainerStarted","Data":"e4d1468bb869d4883edfb4e7abe8ffcbee4c2bd67bc4b160dfa1665f72f10b36"}
Apr 17 15:19:59.737394 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:59.737346 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-dl2zj" podStartSLOduration=129.767319142 podStartE2EDuration="2m11.737330572s" podCreationTimestamp="2026-04-17 15:17:48 +0000 UTC" firstStartedPulling="2026-04-17 15:19:57.059841184 +0000 UTC m=+161.601506030" lastFinishedPulling="2026-04-17 15:19:59.029852615 +0000 UTC m=+163.571517460" observedRunningTime="2026-04-17 15:19:59.735820038 +0000 UTC m=+164.277484934" watchObservedRunningTime="2026-04-17 15:19:59.737330572 +0000 UTC m=+164.278995441"
Apr 17 15:19:59.751369 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:59.751302 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-2vpk9" podStartSLOduration=129.768818092 podStartE2EDuration="2m11.751287438s" podCreationTimestamp="2026-04-17 15:17:48 +0000 UTC" firstStartedPulling="2026-04-17 15:19:57.043278208 +0000 UTC m=+161.584943053" lastFinishedPulling="2026-04-17 15:19:59.025747553 +0000 UTC m=+163.567412399" observedRunningTime="2026-04-17 15:19:59.750734492 +0000 UTC m=+164.292399359" watchObservedRunningTime="2026-04-17 15:19:59.751287438 +0000 UTC m=+164.292952307"
Apr 17 15:19:59.809219 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:59.809189 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-85f567b6d8-trkdw"]
Apr 17 15:19:59.812819 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:59.812802 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-85f567b6d8-trkdw"
Apr 17 15:19:59.815010 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:59.814992 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 17 15:19:59.815109 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:59.815012 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 17 15:19:59.815337 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:59.815321 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 17 15:19:59.815401 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:59.815331 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 17 15:19:59.815401 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:59.815339 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 17 15:19:59.816480 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:59.816464 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-wgvrd\""
Apr 17 15:19:59.823751 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:59.823733 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 17 15:19:59.827011 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:59.826991 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-85f567b6d8-trkdw"]
Apr 17 15:19:59.895132 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:59.895105 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/f29adb65-f616-46d7-9d7e-90a6185b6533-telemeter-client-tls\") pod \"telemeter-client-85f567b6d8-trkdw\" (UID: \"f29adb65-f616-46d7-9d7e-90a6185b6533\") " pod="openshift-monitoring/telemeter-client-85f567b6d8-trkdw"
Apr 17 15:19:59.895238 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:59.895144 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f29adb65-f616-46d7-9d7e-90a6185b6533-metrics-client-ca\") pod \"telemeter-client-85f567b6d8-trkdw\" (UID: \"f29adb65-f616-46d7-9d7e-90a6185b6533\") " pod="openshift-monitoring/telemeter-client-85f567b6d8-trkdw"
Apr 17 15:19:59.895238 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:59.895164 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/f29adb65-f616-46d7-9d7e-90a6185b6533-secret-telemeter-client\") pod \"telemeter-client-85f567b6d8-trkdw\" (UID: \"f29adb65-f616-46d7-9d7e-90a6185b6533\") " pod="openshift-monitoring/telemeter-client-85f567b6d8-trkdw"
Apr 17 15:19:59.895343 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:59.895238 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f29adb65-f616-46d7-9d7e-90a6185b6533-telemeter-trusted-ca-bundle\") pod \"telemeter-client-85f567b6d8-trkdw\" (UID: \"f29adb65-f616-46d7-9d7e-90a6185b6533\") " pod="openshift-monitoring/telemeter-client-85f567b6d8-trkdw"
Apr 17 15:19:59.895343 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:59.895289 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mz8d\" (UniqueName: \"kubernetes.io/projected/f29adb65-f616-46d7-9d7e-90a6185b6533-kube-api-access-5mz8d\") pod \"telemeter-client-85f567b6d8-trkdw\" (UID: \"f29adb65-f616-46d7-9d7e-90a6185b6533\") " pod="openshift-monitoring/telemeter-client-85f567b6d8-trkdw"
Apr 17 15:19:59.895343 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:59.895322 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/f29adb65-f616-46d7-9d7e-90a6185b6533-federate-client-tls\") pod \"telemeter-client-85f567b6d8-trkdw\" (UID: \"f29adb65-f616-46d7-9d7e-90a6185b6533\") " pod="openshift-monitoring/telemeter-client-85f567b6d8-trkdw"
Apr 17 15:19:59.895476 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:59.895376 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f29adb65-f616-46d7-9d7e-90a6185b6533-serving-certs-ca-bundle\") pod \"telemeter-client-85f567b6d8-trkdw\" (UID: \"f29adb65-f616-46d7-9d7e-90a6185b6533\") " pod="openshift-monitoring/telemeter-client-85f567b6d8-trkdw"
Apr 17 15:19:59.895476 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:59.895399 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f29adb65-f616-46d7-9d7e-90a6185b6533-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-85f567b6d8-trkdw\" (UID: \"f29adb65-f616-46d7-9d7e-90a6185b6533\") " pod="openshift-monitoring/telemeter-client-85f567b6d8-trkdw"
Apr 17 15:19:59.996762 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:59.996687 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5mz8d\" (UniqueName: \"kubernetes.io/projected/f29adb65-f616-46d7-9d7e-90a6185b6533-kube-api-access-5mz8d\") pod \"telemeter-client-85f567b6d8-trkdw\" (UID: \"f29adb65-f616-46d7-9d7e-90a6185b6533\") " pod="openshift-monitoring/telemeter-client-85f567b6d8-trkdw"
Apr 17 15:19:59.996762 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:59.996746 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/f29adb65-f616-46d7-9d7e-90a6185b6533-federate-client-tls\") pod \"telemeter-client-85f567b6d8-trkdw\" (UID: \"f29adb65-f616-46d7-9d7e-90a6185b6533\") " pod="openshift-monitoring/telemeter-client-85f567b6d8-trkdw"
Apr 17 15:19:59.996976 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:59.996793 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f29adb65-f616-46d7-9d7e-90a6185b6533-serving-certs-ca-bundle\") pod \"telemeter-client-85f567b6d8-trkdw\" (UID: \"f29adb65-f616-46d7-9d7e-90a6185b6533\") " pod="openshift-monitoring/telemeter-client-85f567b6d8-trkdw"
Apr 17 15:19:59.996976 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:59.996830 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f29adb65-f616-46d7-9d7e-90a6185b6533-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-85f567b6d8-trkdw\" (UID: \"f29adb65-f616-46d7-9d7e-90a6185b6533\") " pod="openshift-monitoring/telemeter-client-85f567b6d8-trkdw"
Apr 17 15:19:59.996976 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:59.996875 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/f29adb65-f616-46d7-9d7e-90a6185b6533-telemeter-client-tls\") pod \"telemeter-client-85f567b6d8-trkdw\" (UID: \"f29adb65-f616-46d7-9d7e-90a6185b6533\") " pod="openshift-monitoring/telemeter-client-85f567b6d8-trkdw"
Apr 17 15:19:59.996976 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:59.996906 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f29adb65-f616-46d7-9d7e-90a6185b6533-metrics-client-ca\") pod \"telemeter-client-85f567b6d8-trkdw\" (UID: \"f29adb65-f616-46d7-9d7e-90a6185b6533\") " pod="openshift-monitoring/telemeter-client-85f567b6d8-trkdw"
Apr 17 15:19:59.996976 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:59.996936 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/f29adb65-f616-46d7-9d7e-90a6185b6533-secret-telemeter-client\") pod \"telemeter-client-85f567b6d8-trkdw\" (UID: \"f29adb65-f616-46d7-9d7e-90a6185b6533\") " pod="openshift-monitoring/telemeter-client-85f567b6d8-trkdw"
Apr 17 15:19:59.997248 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:59.996981 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f29adb65-f616-46d7-9d7e-90a6185b6533-telemeter-trusted-ca-bundle\") pod \"telemeter-client-85f567b6d8-trkdw\" (UID: \"f29adb65-f616-46d7-9d7e-90a6185b6533\") " pod="openshift-monitoring/telemeter-client-85f567b6d8-trkdw"
Apr 17 15:19:59.997962 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:59.997939 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f29adb65-f616-46d7-9d7e-90a6185b6533-metrics-client-ca\") pod \"telemeter-client-85f567b6d8-trkdw\" (UID: \"f29adb65-f616-46d7-9d7e-90a6185b6533\") " pod="openshift-monitoring/telemeter-client-85f567b6d8-trkdw"
Apr 17 15:19:59.998113 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:59.998088 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f29adb65-f616-46d7-9d7e-90a6185b6533-serving-certs-ca-bundle\") pod \"telemeter-client-85f567b6d8-trkdw\" (UID: \"f29adb65-f616-46d7-9d7e-90a6185b6533\") " pod="openshift-monitoring/telemeter-client-85f567b6d8-trkdw"
Apr 17 15:19:59.998369 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:19:59.998345 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f29adb65-f616-46d7-9d7e-90a6185b6533-telemeter-trusted-ca-bundle\") pod \"telemeter-client-85f567b6d8-trkdw\" (UID: \"f29adb65-f616-46d7-9d7e-90a6185b6533\") " pod="openshift-monitoring/telemeter-client-85f567b6d8-trkdw"
Apr 17 15:20:00.000970 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.000948 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/f29adb65-f616-46d7-9d7e-90a6185b6533-telemeter-client-tls\") pod \"telemeter-client-85f567b6d8-trkdw\" (UID: \"f29adb65-f616-46d7-9d7e-90a6185b6533\") " pod="openshift-monitoring/telemeter-client-85f567b6d8-trkdw"
Apr 17 15:20:00.001079 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.000987 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/f29adb65-f616-46d7-9d7e-90a6185b6533-secret-telemeter-client\") pod \"telemeter-client-85f567b6d8-trkdw\" (UID: \"f29adb65-f616-46d7-9d7e-90a6185b6533\") " pod="openshift-monitoring/telemeter-client-85f567b6d8-trkdw"
Apr 17 15:20:00.001424 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.001402 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f29adb65-f616-46d7-9d7e-90a6185b6533-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-85f567b6d8-trkdw\" (UID: \"f29adb65-f616-46d7-9d7e-90a6185b6533\") " pod="openshift-monitoring/telemeter-client-85f567b6d8-trkdw"
Apr 17 15:20:00.001491 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.001409 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/f29adb65-f616-46d7-9d7e-90a6185b6533-federate-client-tls\") pod \"telemeter-client-85f567b6d8-trkdw\" (UID: \"f29adb65-f616-46d7-9d7e-90a6185b6533\") " pod="openshift-monitoring/telemeter-client-85f567b6d8-trkdw"
Apr 17 15:20:00.005377 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.005357 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mz8d\" (UniqueName: \"kubernetes.io/projected/f29adb65-f616-46d7-9d7e-90a6185b6533-kube-api-access-5mz8d\") pod \"telemeter-client-85f567b6d8-trkdw\" (UID: \"f29adb65-f616-46d7-9d7e-90a6185b6533\") " pod="openshift-monitoring/telemeter-client-85f567b6d8-trkdw"
Apr 17 15:20:00.122415 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.122077 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-85f567b6d8-trkdw"
Apr 17 15:20:00.673711 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.673682 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 15:20:00.677695 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.677673 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 15:20:00.679744 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.679722 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 17 15:20:00.680070 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.680048 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 17 15:20:00.680874 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.680642 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 17 15:20:00.680874 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.680677 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-957poubn82coc\""
Apr 17 15:20:00.680874 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.680713 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 17 15:20:00.680874 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.680723 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-8cwzt\""
Apr 17 15:20:00.680874 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.680756 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 17 15:20:00.680874 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.680766 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 17 15:20:00.680874 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.680677 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 17 15:20:00.680874 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.680712 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 17 15:20:00.681616 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.681142 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 17 15:20:00.681616 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.681242 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 17 15:20:00.681616 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.681273 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 17 15:20:00.684077 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.684052 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 17 15:20:00.699985 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.699965 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 15:20:00.805389 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.805357 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/03428d8d-f204-45cf-ab61-4a66cca8d24b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 15:20:00.805829 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.805407 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/03428d8d-f204-45cf-ab61-4a66cca8d24b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 15:20:00.805829 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.805432 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/03428d8d-f204-45cf-ab61-4a66cca8d24b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 15:20:00.805829 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.805499 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03428d8d-f204-45cf-ab61-4a66cca8d24b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 15:20:00.805829 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.805637 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/03428d8d-f204-45cf-ab61-4a66cca8d24b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 15:20:00.805829 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.805673 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/03428d8d-f204-45cf-ab61-4a66cca8d24b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 15:20:00.805829 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.805704 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03428d8d-f204-45cf-ab61-4a66cca8d24b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 15:20:00.805829 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.805731 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/03428d8d-f204-45cf-ab61-4a66cca8d24b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 15:20:00.805829 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.805764 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/03428d8d-f204-45cf-ab61-4a66cca8d24b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 15:20:00.806255 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.805859 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcbr9\" (UniqueName: \"kubernetes.io/projected/03428d8d-f204-45cf-ab61-4a66cca8d24b-kube-api-access-lcbr9\") pod \"prometheus-k8s-0\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 15:20:00.806255 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.805898 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/03428d8d-f204-45cf-ab61-4a66cca8d24b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 15:20:00.806255 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.805971 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/03428d8d-f204-45cf-ab61-4a66cca8d24b-config-out\") pod \"prometheus-k8s-0\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 15:20:00.806255 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.806014 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/03428d8d-f204-45cf-ab61-4a66cca8d24b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 15:20:00.806255 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.806058 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/03428d8d-f204-45cf-ab61-4a66cca8d24b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 15:20:00.806255 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.806088 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/03428d8d-f204-45cf-ab61-4a66cca8d24b-config\") pod \"prometheus-k8s-0\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 15:20:00.806255 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.806110 2577 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/03428d8d-f204-45cf-ab61-4a66cca8d24b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:20:00.806255 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.806177 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03428d8d-f204-45cf-ab61-4a66cca8d24b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:20:00.806255 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.806195 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/03428d8d-f204-45cf-ab61-4a66cca8d24b-web-config\") pod \"prometheus-k8s-0\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:20:00.907526 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.907490 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/03428d8d-f204-45cf-ab61-4a66cca8d24b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:20:00.907526 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.907529 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/03428d8d-f204-45cf-ab61-4a66cca8d24b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: 
\"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:20:00.907741 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.907548 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03428d8d-f204-45cf-ab61-4a66cca8d24b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:20:00.907741 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.907574 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/03428d8d-f204-45cf-ab61-4a66cca8d24b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:20:00.907741 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.907603 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/03428d8d-f204-45cf-ab61-4a66cca8d24b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:20:00.907741 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.907642 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lcbr9\" (UniqueName: \"kubernetes.io/projected/03428d8d-f204-45cf-ab61-4a66cca8d24b-kube-api-access-lcbr9\") pod \"prometheus-k8s-0\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:20:00.907741 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.907674 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/03428d8d-f204-45cf-ab61-4a66cca8d24b-secret-kube-rbac-proxy\") pod 
\"prometheus-k8s-0\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:20:00.907741 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.907712 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/03428d8d-f204-45cf-ab61-4a66cca8d24b-config-out\") pod \"prometheus-k8s-0\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:20:00.908066 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.907756 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/03428d8d-f204-45cf-ab61-4a66cca8d24b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:20:00.908066 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.907783 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/03428d8d-f204-45cf-ab61-4a66cca8d24b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:20:00.908066 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.907813 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/03428d8d-f204-45cf-ab61-4a66cca8d24b-config\") pod \"prometheus-k8s-0\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:20:00.908066 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.907843 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/03428d8d-f204-45cf-ab61-4a66cca8d24b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:20:00.908066 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.907888 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03428d8d-f204-45cf-ab61-4a66cca8d24b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:20:00.908066 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.907905 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/03428d8d-f204-45cf-ab61-4a66cca8d24b-web-config\") pod \"prometheus-k8s-0\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:20:00.908066 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.907968 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/03428d8d-f204-45cf-ab61-4a66cca8d24b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:20:00.908411 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.908382 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/03428d8d-f204-45cf-ab61-4a66cca8d24b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:20:00.909883 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.909549 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03428d8d-f204-45cf-ab61-4a66cca8d24b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:20:00.910832 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.910517 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03428d8d-f204-45cf-ab61-4a66cca8d24b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:20:00.910832 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.910813 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/03428d8d-f204-45cf-ab61-4a66cca8d24b-config-out\") pod \"prometheus-k8s-0\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:20:00.911280 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.911251 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/03428d8d-f204-45cf-ab61-4a66cca8d24b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:20:00.911393 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.911376 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/03428d8d-f204-45cf-ab61-4a66cca8d24b-config\") pod \"prometheus-k8s-0\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:20:00.914147 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.911474 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" 
(UniqueName: \"kubernetes.io/configmap/03428d8d-f204-45cf-ab61-4a66cca8d24b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:20:00.914147 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.911617 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/03428d8d-f204-45cf-ab61-4a66cca8d24b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:20:00.914147 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.911728 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/03428d8d-f204-45cf-ab61-4a66cca8d24b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:20:00.914147 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.911727 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/03428d8d-f204-45cf-ab61-4a66cca8d24b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:20:00.914147 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.911760 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03428d8d-f204-45cf-ab61-4a66cca8d24b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:20:00.914147 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.912399 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03428d8d-f204-45cf-ab61-4a66cca8d24b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:20:00.914147 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.913430 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/03428d8d-f204-45cf-ab61-4a66cca8d24b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:20:00.914147 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.913941 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/03428d8d-f204-45cf-ab61-4a66cca8d24b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:20:00.914647 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.914204 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/03428d8d-f204-45cf-ab61-4a66cca8d24b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:20:00.914647 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.914233 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/03428d8d-f204-45cf-ab61-4a66cca8d24b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:20:00.914647 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.914276 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/03428d8d-f204-45cf-ab61-4a66cca8d24b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:20:00.914647 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.914638 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/03428d8d-f204-45cf-ab61-4a66cca8d24b-web-config\") pod \"prometheus-k8s-0\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:20:00.915157 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.915137 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/03428d8d-f204-45cf-ab61-4a66cca8d24b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:20:00.916101 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.916079 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/03428d8d-f204-45cf-ab61-4a66cca8d24b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:20:00.916445 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.916426 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcbr9\" (UniqueName: \"kubernetes.io/projected/03428d8d-f204-45cf-ab61-4a66cca8d24b-kube-api-access-lcbr9\") pod \"prometheus-k8s-0\" (UID: 
\"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:20:00.990418 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:00.990374 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:20:01.157687 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:01.157638 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 15:20:01.160947 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:20:01.160912 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03428d8d_f204_45cf_ab61_4a66cca8d24b.slice/crio-4c336103e338680e4ab3eb734aa681a4f460a4ade974fcff4c40a814c2ff8133 WatchSource:0}: Error finding container 4c336103e338680e4ab3eb734aa681a4f460a4ade974fcff4c40a814c2ff8133: Status 404 returned error can't find the container with id 4c336103e338680e4ab3eb734aa681a4f460a4ade974fcff4c40a814c2ff8133 Apr 17 15:20:01.172022 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:01.171935 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-85f567b6d8-trkdw"] Apr 17 15:20:01.174325 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:20:01.174299 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf29adb65_f616_46d7_9d7e_90a6185b6533.slice/crio-53282dba29c5583095c3befb6dd113096db5991d79a60cf400dc606cf1173e95 WatchSource:0}: Error finding container 53282dba29c5583095c3befb6dd113096db5991d79a60cf400dc606cf1173e95: Status 404 returned error can't find the container with id 53282dba29c5583095c3befb6dd113096db5991d79a60cf400dc606cf1173e95 Apr 17 15:20:01.733290 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:01.733251 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-69676d6c77-k8jwr" 
event={"ID":"528a42f2-ecbc-4039-ab6e-92181c86c155","Type":"ContainerStarted","Data":"9b2161ebcfabc26f461e1cdd3c6dde30669703c49cbb86dc3ea23953fb07d41c"} Apr 17 15:20:01.733290 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:01.733295 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-69676d6c77-k8jwr" event={"ID":"528a42f2-ecbc-4039-ab6e-92181c86c155","Type":"ContainerStarted","Data":"bc19756b7bcbad5421019ee4cdf4f1a80d445f6e0762050112a73eb3b46c480f"} Apr 17 15:20:01.733522 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:01.733305 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-69676d6c77-k8jwr" event={"ID":"528a42f2-ecbc-4039-ab6e-92181c86c155","Type":"ContainerStarted","Data":"c93a7afe2f32825094acb777d526d865d006f093043bed6003647a1e79a74005"} Apr 17 15:20:01.734943 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:01.734912 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7f6d6b796-8d99r" event={"ID":"eaafa6a6-9e9c-4379-ad58-cfaf54d8ccc8","Type":"ContainerStarted","Data":"dba679e8e028dd05cacc3f9500868041703e008c23e63913536342d61ff60dee"} Apr 17 15:20:01.736269 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:01.736241 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-85f567b6d8-trkdw" event={"ID":"f29adb65-f616-46d7-9d7e-90a6185b6533","Type":"ContainerStarted","Data":"53282dba29c5583095c3befb6dd113096db5991d79a60cf400dc606cf1173e95"} Apr 17 15:20:01.737484 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:01.737450 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"03428d8d-f204-45cf-ab61-4a66cca8d24b","Type":"ContainerStarted","Data":"4c336103e338680e4ab3eb734aa681a4f460a4ade974fcff4c40a814c2ff8133"} Apr 17 15:20:01.751620 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:01.751494 2577 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-monitoring/metrics-server-7f6d6b796-8d99r" podStartSLOduration=2.082681775 podStartE2EDuration="3.751481971s" podCreationTimestamp="2026-04-17 15:19:58 +0000 UTC" firstStartedPulling="2026-04-17 15:19:59.341287257 +0000 UTC m=+163.882952102" lastFinishedPulling="2026-04-17 15:20:01.010087438 +0000 UTC m=+165.551752298" observedRunningTime="2026-04-17 15:20:01.750765393 +0000 UTC m=+166.292430271" watchObservedRunningTime="2026-04-17 15:20:01.751481971 +0000 UTC m=+166.293146887" Apr 17 15:20:02.744935 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:02.744851 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-69676d6c77-k8jwr" event={"ID":"528a42f2-ecbc-4039-ab6e-92181c86c155","Type":"ContainerStarted","Data":"54765b4308d1ff4d1ad3cc16b2c9401718fa64818a677c4575500db2164cbf56"} Apr 17 15:20:02.744935 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:02.744899 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-69676d6c77-k8jwr" event={"ID":"528a42f2-ecbc-4039-ab6e-92181c86c155","Type":"ContainerStarted","Data":"fd0d2f64bf500a73e9b4d0076c241703622eb77a59c8302b33d6f78278b03b40"} Apr 17 15:20:02.744935 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:02.744910 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-69676d6c77-k8jwr" event={"ID":"528a42f2-ecbc-4039-ab6e-92181c86c155","Type":"ContainerStarted","Data":"acec5469b257871ec03ef755f6645a2de0aca58750deadc96c03dbf6f5e2b941"} Apr 17 15:20:02.745514 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:02.744973 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-69676d6c77-k8jwr" Apr 17 15:20:02.746390 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:02.746363 2577 generic.go:358] "Generic (PLEG): container finished" podID="03428d8d-f204-45cf-ab61-4a66cca8d24b" 
containerID="bc9b5ab5107eab2e24ff48253ba848bebf2d1bb335f0a3fd797c94b67e16d815" exitCode=0 Apr 17 15:20:02.746545 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:02.746455 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"03428d8d-f204-45cf-ab61-4a66cca8d24b","Type":"ContainerDied","Data":"bc9b5ab5107eab2e24ff48253ba848bebf2d1bb335f0a3fd797c94b67e16d815"} Apr 17 15:20:02.770299 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:02.768402 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-69676d6c77-k8jwr" podStartSLOduration=1.40903511 podStartE2EDuration="5.768385185s" podCreationTimestamp="2026-04-17 15:19:57 +0000 UTC" firstStartedPulling="2026-04-17 15:19:57.993804278 +0000 UTC m=+162.535469137" lastFinishedPulling="2026-04-17 15:20:02.353154361 +0000 UTC m=+166.894819212" observedRunningTime="2026-04-17 15:20:02.766532228 +0000 UTC m=+167.308197106" watchObservedRunningTime="2026-04-17 15:20:02.768385185 +0000 UTC m=+167.310050053" Apr 17 15:20:03.752328 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:03.752234 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-85f567b6d8-trkdw" event={"ID":"f29adb65-f616-46d7-9d7e-90a6185b6533","Type":"ContainerStarted","Data":"10801f66913e632b192c8f8873fcca1c83103ad89bb09e0bea3b8f10384aaa12"} Apr 17 15:20:03.752328 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:03.752280 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-85f567b6d8-trkdw" event={"ID":"f29adb65-f616-46d7-9d7e-90a6185b6533","Type":"ContainerStarted","Data":"e0d747cb73457b2e48c253ad5310450df6390cb922dc2ab030a5573d4143a2c4"} Apr 17 15:20:03.752328 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:03.752296 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-85f567b6d8-trkdw" 
event={"ID":"f29adb65-f616-46d7-9d7e-90a6185b6533","Type":"ContainerStarted","Data":"c18d52d63a08102b52b566d67b2434f2b1593afc25b7dbb184528fa51598176f"} Apr 17 15:20:03.772800 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:03.772738 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-85f567b6d8-trkdw" podStartSLOduration=2.8881638499999998 podStartE2EDuration="4.772718558s" podCreationTimestamp="2026-04-17 15:19:59 +0000 UTC" firstStartedPulling="2026-04-17 15:20:01.176367573 +0000 UTC m=+165.718032424" lastFinishedPulling="2026-04-17 15:20:03.060922284 +0000 UTC m=+167.602587132" observedRunningTime="2026-04-17 15:20:03.771973013 +0000 UTC m=+168.313637881" watchObservedRunningTime="2026-04-17 15:20:03.772718558 +0000 UTC m=+168.314383484" Apr 17 15:20:06.768941 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:06.768904 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"03428d8d-f204-45cf-ab61-4a66cca8d24b","Type":"ContainerStarted","Data":"2a5a38f5a657383a1f508c26b313bfed9546fbb14dbeb7af84cb4042f42c461b"} Apr 17 15:20:06.769319 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:06.768948 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"03428d8d-f204-45cf-ab61-4a66cca8d24b","Type":"ContainerStarted","Data":"310d0f4864c2d3eb59e91a51aeb574129cd63668aff0f7a799de8541a6698306"} Apr 17 15:20:06.769319 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:06.768964 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"03428d8d-f204-45cf-ab61-4a66cca8d24b","Type":"ContainerStarted","Data":"104ca6f22c328820f09db456d009d1add6a59006ec70e191970fa513f03e8d79"} Apr 17 15:20:06.769319 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:06.768976 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"03428d8d-f204-45cf-ab61-4a66cca8d24b","Type":"ContainerStarted","Data":"0af008da12d9a8b7e200f9e76bfc38245c7d013166d7bb84a3a40f52da5e2a3b"} Apr 17 15:20:06.769319 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:06.768988 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"03428d8d-f204-45cf-ab61-4a66cca8d24b","Type":"ContainerStarted","Data":"efc4adf40202da1c6c84e147dce922248cf805cea3ea95d9729ff71924977625"} Apr 17 15:20:06.769319 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:06.769000 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"03428d8d-f204-45cf-ab61-4a66cca8d24b","Type":"ContainerStarted","Data":"b99ae0c68584750f1ae7a9e6872b78dc538f3579ed6c79ef07162abe0c4fe964"} Apr 17 15:20:06.793468 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:06.793399 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.12158675 podStartE2EDuration="6.793378612s" podCreationTimestamp="2026-04-17 15:20:00 +0000 UTC" firstStartedPulling="2026-04-17 15:20:01.163067567 +0000 UTC m=+165.704732416" lastFinishedPulling="2026-04-17 15:20:05.834859426 +0000 UTC m=+170.376524278" observedRunningTime="2026-04-17 15:20:06.791068622 +0000 UTC m=+171.332733491" watchObservedRunningTime="2026-04-17 15:20:06.793378612 +0000 UTC m=+171.335043480" Apr 17 15:20:07.046011 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:07.045937 2577 patch_prober.go:28] interesting pod/image-registry-664f745c79-lz8j5 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 15:20:07.046011 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:07.045994 2577 prober.go:120] "Probe failed" 
probeType="Liveness" pod="openshift-image-registry/image-registry-664f745c79-lz8j5" podUID="3b884775-56c3-4983-a77d-ac6ef0438ae4" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 15:20:07.057163 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:07.057140 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j7zl6" Apr 17 15:20:08.676501 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:08.676464 2577 patch_prober.go:28] interesting pod/image-registry-664f745c79-lz8j5 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 15:20:08.676976 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:08.676520 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-664f745c79-lz8j5" podUID="3b884775-56c3-4983-a77d-ac6ef0438ae4" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 15:20:08.758231 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:08.758200 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-69676d6c77-k8jwr" Apr 17 15:20:09.731075 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:09.731020 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-2vpk9" Apr 17 15:20:10.990552 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:10.990505 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:20:13.057511 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:13.057477 2577 scope.go:117] "RemoveContainer" 
containerID="946a3196d08658242b10d5b451d1a8cae7e6349670453fb64834c15773de0191" Apr 17 15:20:13.801908 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:13.801880 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fdxgz_85072088-8af2-4219-80f7-6a18460c13cf/console-operator/2.log" Apr 17 15:20:13.802089 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:13.801941 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-fdxgz" event={"ID":"85072088-8af2-4219-80f7-6a18460c13cf","Type":"ContainerStarted","Data":"c8dd9b570211b2a6f1d4b8dd63a42eafaa9c1809fef1704ca0f7e307396db6a3"} Apr 17 15:20:13.802266 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:13.802243 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-fdxgz" Apr 17 15:20:13.806944 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:13.806921 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-fdxgz" Apr 17 15:20:13.820577 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:13.820539 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-fdxgz" podStartSLOduration=54.184407511 podStartE2EDuration="56.820528385s" podCreationTimestamp="2026-04-17 15:19:17 +0000 UTC" firstStartedPulling="2026-04-17 15:19:17.634631986 +0000 UTC m=+122.176296831" lastFinishedPulling="2026-04-17 15:19:20.270752846 +0000 UTC m=+124.812417705" observedRunningTime="2026-04-17 15:20:13.818544853 +0000 UTC m=+178.360209719" watchObservedRunningTime="2026-04-17 15:20:13.820528385 +0000 UTC m=+178.362193248" Apr 17 15:20:17.045521 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:17.045487 2577 patch_prober.go:28] interesting pod/image-registry-664f745c79-lz8j5 container/registry 
namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 15:20:17.045900 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:17.045535 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-664f745c79-lz8j5" podUID="3b884775-56c3-4983-a77d-ac6ef0438ae4" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 15:20:18.676151 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:18.676123 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-664f745c79-lz8j5" Apr 17 15:20:19.159323 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:19.159285 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-7f6d6b796-8d99r" Apr 17 15:20:19.159323 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:19.159332 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-7f6d6b796-8d99r" Apr 17 15:20:34.804163 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:34.804128 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-mjkbk_c4b76af8-7aae-4de0-be95-221109f82fb9/kube-state-metrics/0.log" Apr 17 15:20:35.004440 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:35.004405 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-mjkbk_c4b76af8-7aae-4de0-be95-221109f82fb9/kube-rbac-proxy-main/0.log" Apr 17 15:20:35.203812 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:35.203784 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-mjkbk_c4b76af8-7aae-4de0-be95-221109f82fb9/kube-rbac-proxy-self/0.log" Apr 17 15:20:35.407102 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:35.407069 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-7f6d6b796-8d99r_eaafa6a6-9e9c-4379-ad58-cfaf54d8ccc8/metrics-server/0.log" Apr 17 15:20:37.003936 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:37.003908 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-vb2bb_0210dbd3-bf15-4240-a299-f959cf307c04/init-textfile/0.log" Apr 17 15:20:37.204157 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:37.204132 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-vb2bb_0210dbd3-bf15-4240-a299-f959cf307c04/node-exporter/0.log" Apr 17 15:20:37.403766 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:37.403740 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-vb2bb_0210dbd3-bf15-4240-a299-f959cf307c04/kube-rbac-proxy/0.log" Apr 17 15:20:38.204460 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:38.204434 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_03428d8d-f204-45cf-ab61-4a66cca8d24b/init-config-reloader/0.log" Apr 17 15:20:38.405816 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:38.405788 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_03428d8d-f204-45cf-ab61-4a66cca8d24b/prometheus/0.log" Apr 17 15:20:38.604071 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:38.604029 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_03428d8d-f204-45cf-ab61-4a66cca8d24b/config-reloader/0.log" Apr 17 15:20:38.804068 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:38.804021 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_03428d8d-f204-45cf-ab61-4a66cca8d24b/thanos-sidecar/0.log" Apr 17 15:20:39.005994 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:39.005934 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_03428d8d-f204-45cf-ab61-4a66cca8d24b/kube-rbac-proxy-web/0.log" Apr 17 15:20:39.164747 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:39.164727 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-7f6d6b796-8d99r" Apr 17 15:20:39.168524 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:39.168497 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-7f6d6b796-8d99r" Apr 17 15:20:39.205812 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:39.205794 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_03428d8d-f204-45cf-ab61-4a66cca8d24b/kube-rbac-proxy/0.log" Apr 17 15:20:39.403808 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:39.403786 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_03428d8d-f204-45cf-ab61-4a66cca8d24b/kube-rbac-proxy-thanos/0.log" Apr 17 15:20:40.004053 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:40.004008 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-t5wmn_93545302-4bda-45c5-9cb7-2e69c294a279/prometheus-operator-admission-webhook/0.log" Apr 17 15:20:40.204190 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:40.204163 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-85f567b6d8-trkdw_f29adb65-f616-46d7-9d7e-90a6185b6533/telemeter-client/0.log" Apr 17 15:20:40.404274 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:40.404253 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_telemeter-client-85f567b6d8-trkdw_f29adb65-f616-46d7-9d7e-90a6185b6533/reload/0.log" Apr 17 15:20:40.610317 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:40.610290 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-85f567b6d8-trkdw_f29adb65-f616-46d7-9d7e-90a6185b6533/kube-rbac-proxy/0.log" Apr 17 15:20:40.804236 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:40.804161 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-69676d6c77-k8jwr_528a42f2-ecbc-4039-ab6e-92181c86c155/thanos-query/0.log" Apr 17 15:20:41.003885 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:41.003863 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-69676d6c77-k8jwr_528a42f2-ecbc-4039-ab6e-92181c86c155/kube-rbac-proxy-web/0.log" Apr 17 15:20:41.204887 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:41.204859 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-69676d6c77-k8jwr_528a42f2-ecbc-4039-ab6e-92181c86c155/kube-rbac-proxy/0.log" Apr 17 15:20:41.407431 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:41.407406 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-69676d6c77-k8jwr_528a42f2-ecbc-4039-ab6e-92181c86c155/prom-label-proxy/0.log" Apr 17 15:20:41.604333 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:41.604313 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-69676d6c77-k8jwr_528a42f2-ecbc-4039-ab6e-92181c86c155/kube-rbac-proxy-rules/0.log" Apr 17 15:20:41.803796 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:41.803775 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-69676d6c77-k8jwr_528a42f2-ecbc-4039-ab6e-92181c86c155/kube-rbac-proxy-metrics/0.log" Apr 17 15:20:42.203624 ip-10-0-131-29 
kubenswrapper[2577]: I0417 15:20:42.203596 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fdxgz_85072088-8af2-4219-80f7-6a18460c13cf/console-operator/2.log" Apr 17 15:20:42.405266 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:42.405240 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fdxgz_85072088-8af2-4219-80f7-6a18460c13cf/console-operator/3.log" Apr 17 15:20:51.913942 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:51.913909 2577 generic.go:358] "Generic (PLEG): container finished" podID="1b6f9506-66fe-4d0b-b26e-7d16fbe56762" containerID="f31cc636c03b26601a2ce3bfa005bf51b579fe7339406896517428eaae2b3e9c" exitCode=0 Apr 17 15:20:51.914324 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:51.913965 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-brnqs" event={"ID":"1b6f9506-66fe-4d0b-b26e-7d16fbe56762","Type":"ContainerDied","Data":"f31cc636c03b26601a2ce3bfa005bf51b579fe7339406896517428eaae2b3e9c"} Apr 17 15:20:51.914367 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:51.914328 2577 scope.go:117] "RemoveContainer" containerID="f31cc636c03b26601a2ce3bfa005bf51b579fe7339406896517428eaae2b3e9c" Apr 17 15:20:52.918673 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:20:52.918640 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-brnqs" event={"ID":"1b6f9506-66fe-4d0b-b26e-7d16fbe56762","Type":"ContainerStarted","Data":"5c365fbde07d10604863dbc2093dc81b7d2dc3cf711badcd6ff21942322aa0d3"} Apr 17 15:21:00.990898 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:00.990865 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:21:01.013447 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:01.013414 2577 kubelet.go:2658] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:21:01.964937 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:01.964913 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:21:18.974971 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:18.974931 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 15:21:18.975489 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:18.975377 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="03428d8d-f204-45cf-ab61-4a66cca8d24b" containerName="prometheus" containerID="cri-o://b99ae0c68584750f1ae7a9e6872b78dc538f3579ed6c79ef07162abe0c4fe964" gracePeriod=600 Apr 17 15:21:18.975489 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:18.975431 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="03428d8d-f204-45cf-ab61-4a66cca8d24b" containerName="kube-rbac-proxy-thanos" containerID="cri-o://2a5a38f5a657383a1f508c26b313bfed9546fbb14dbeb7af84cb4042f42c461b" gracePeriod=600 Apr 17 15:21:18.975489 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:18.975443 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="03428d8d-f204-45cf-ab61-4a66cca8d24b" containerName="thanos-sidecar" containerID="cri-o://0af008da12d9a8b7e200f9e76bfc38245c7d013166d7bb84a3a40f52da5e2a3b" gracePeriod=600 Apr 17 15:21:18.975658 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:18.975459 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="03428d8d-f204-45cf-ab61-4a66cca8d24b" containerName="kube-rbac-proxy-web" containerID="cri-o://104ca6f22c328820f09db456d009d1add6a59006ec70e191970fa513f03e8d79" gracePeriod=600 Apr 
17 15:21:18.975658 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:18.975432 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="03428d8d-f204-45cf-ab61-4a66cca8d24b" containerName="kube-rbac-proxy" containerID="cri-o://310d0f4864c2d3eb59e91a51aeb574129cd63668aff0f7a799de8541a6698306" gracePeriod=600 Apr 17 15:21:18.975658 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:18.975466 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="03428d8d-f204-45cf-ab61-4a66cca8d24b" containerName="config-reloader" containerID="cri-o://efc4adf40202da1c6c84e147dce922248cf805cea3ea95d9729ff71924977625" gracePeriod=600 Apr 17 15:21:19.214183 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:19.214161 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:21:19.347230 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:19.347194 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03428d8d-f204-45cf-ab61-4a66cca8d24b-configmap-serving-certs-ca-bundle\") pod \"03428d8d-f204-45cf-ab61-4a66cca8d24b\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " Apr 17 15:21:19.347230 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:19.347245 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03428d8d-f204-45cf-ab61-4a66cca8d24b-configmap-kubelet-serving-ca-bundle\") pod \"03428d8d-f204-45cf-ab61-4a66cca8d24b\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " Apr 17 15:21:19.347440 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:19.347283 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/03428d8d-f204-45cf-ab61-4a66cca8d24b-config\") pod \"03428d8d-f204-45cf-ab61-4a66cca8d24b\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " Apr 17 15:21:19.347440 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:19.347316 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/03428d8d-f204-45cf-ab61-4a66cca8d24b-prometheus-k8s-db\") pod \"03428d8d-f204-45cf-ab61-4a66cca8d24b\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " Apr 17 15:21:19.347440 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:19.347339 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/03428d8d-f204-45cf-ab61-4a66cca8d24b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"03428d8d-f204-45cf-ab61-4a66cca8d24b\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " Apr 17 15:21:19.347440 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:19.347362 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/03428d8d-f204-45cf-ab61-4a66cca8d24b-thanos-prometheus-http-client-file\") pod \"03428d8d-f204-45cf-ab61-4a66cca8d24b\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " Apr 17 15:21:19.347440 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:19.347386 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/03428d8d-f204-45cf-ab61-4a66cca8d24b-secret-metrics-client-certs\") pod \"03428d8d-f204-45cf-ab61-4a66cca8d24b\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " Apr 17 15:21:19.347440 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:19.347416 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/03428d8d-f204-45cf-ab61-4a66cca8d24b-secret-kube-rbac-proxy\") pod \"03428d8d-f204-45cf-ab61-4a66cca8d24b\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " Apr 17 15:21:19.347717 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:19.347447 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/03428d8d-f204-45cf-ab61-4a66cca8d24b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"03428d8d-f204-45cf-ab61-4a66cca8d24b\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " Apr 17 15:21:19.347717 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:19.347474 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/03428d8d-f204-45cf-ab61-4a66cca8d24b-configmap-metrics-client-ca\") pod \"03428d8d-f204-45cf-ab61-4a66cca8d24b\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " Apr 17 15:21:19.347717 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:19.347505 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03428d8d-f204-45cf-ab61-4a66cca8d24b-prometheus-trusted-ca-bundle\") pod \"03428d8d-f204-45cf-ab61-4a66cca8d24b\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " Apr 17 15:21:19.347717 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:19.347534 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/03428d8d-f204-45cf-ab61-4a66cca8d24b-secret-prometheus-k8s-tls\") pod \"03428d8d-f204-45cf-ab61-4a66cca8d24b\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " Apr 17 15:21:19.347717 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:19.347565 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/03428d8d-f204-45cf-ab61-4a66cca8d24b-secret-grpc-tls\") pod \"03428d8d-f204-45cf-ab61-4a66cca8d24b\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " Apr 17 15:21:19.347717 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:19.347612 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcbr9\" (UniqueName: \"kubernetes.io/projected/03428d8d-f204-45cf-ab61-4a66cca8d24b-kube-api-access-lcbr9\") pod \"03428d8d-f204-45cf-ab61-4a66cca8d24b\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " Apr 17 15:21:19.347717 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:19.347635 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/03428d8d-f204-45cf-ab61-4a66cca8d24b-config-out\") pod \"03428d8d-f204-45cf-ab61-4a66cca8d24b\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " Apr 17 15:21:19.347717 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:19.347654 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03428d8d-f204-45cf-ab61-4a66cca8d24b-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "03428d8d-f204-45cf-ab61-4a66cca8d24b" (UID: "03428d8d-f204-45cf-ab61-4a66cca8d24b"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 15:21:19.347717 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:19.347670 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/03428d8d-f204-45cf-ab61-4a66cca8d24b-prometheus-k8s-rulefiles-0\") pod \"03428d8d-f204-45cf-ab61-4a66cca8d24b\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " Apr 17 15:21:19.348380 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:19.347723 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/03428d8d-f204-45cf-ab61-4a66cca8d24b-tls-assets\") pod \"03428d8d-f204-45cf-ab61-4a66cca8d24b\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " Apr 17 15:21:19.348380 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:19.347807 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/03428d8d-f204-45cf-ab61-4a66cca8d24b-web-config\") pod \"03428d8d-f204-45cf-ab61-4a66cca8d24b\" (UID: \"03428d8d-f204-45cf-ab61-4a66cca8d24b\") " Apr 17 15:21:19.348380 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:19.347971 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03428d8d-f204-45cf-ab61-4a66cca8d24b-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "03428d8d-f204-45cf-ab61-4a66cca8d24b" (UID: "03428d8d-f204-45cf-ab61-4a66cca8d24b"). InnerVolumeSpecName "configmap-metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 15:21:19.348380 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:19.348178 2577 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03428d8d-f204-45cf-ab61-4a66cca8d24b-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-131-29.ec2.internal\" DevicePath \"\"" Apr 17 15:21:19.348380 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:19.348200 2577 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/03428d8d-f204-45cf-ab61-4a66cca8d24b-configmap-metrics-client-ca\") on node \"ip-10-0-131-29.ec2.internal\" DevicePath \"\"" Apr 17 15:21:19.348380 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:19.348337 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03428d8d-f204-45cf-ab61-4a66cca8d24b-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "03428d8d-f204-45cf-ab61-4a66cca8d24b" (UID: "03428d8d-f204-45cf-ab61-4a66cca8d24b"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 15:21:19.349132 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:19.349104 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03428d8d-f204-45cf-ab61-4a66cca8d24b-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "03428d8d-f204-45cf-ab61-4a66cca8d24b" (UID: "03428d8d-f204-45cf-ab61-4a66cca8d24b"). InnerVolumeSpecName "prometheus-k8s-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 15:21:19.349374 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:19.349344 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03428d8d-f204-45cf-ab61-4a66cca8d24b-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "03428d8d-f204-45cf-ab61-4a66cca8d24b" (UID: "03428d8d-f204-45cf-ab61-4a66cca8d24b"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 15:21:19.350113 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:19.350086 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03428d8d-f204-45cf-ab61-4a66cca8d24b-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "03428d8d-f204-45cf-ab61-4a66cca8d24b" (UID: "03428d8d-f204-45cf-ab61-4a66cca8d24b"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 15:21:19.350884 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:19.350860 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03428d8d-f204-45cf-ab61-4a66cca8d24b-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "03428d8d-f204-45cf-ab61-4a66cca8d24b" (UID: "03428d8d-f204-45cf-ab61-4a66cca8d24b"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 15:21:19.351097 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:19.351076 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03428d8d-f204-45cf-ab61-4a66cca8d24b-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "03428d8d-f204-45cf-ab61-4a66cca8d24b" (UID: "03428d8d-f204-45cf-ab61-4a66cca8d24b"). InnerVolumeSpecName "secret-metrics-client-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 15:21:19.351650 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:19.351616 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03428d8d-f204-45cf-ab61-4a66cca8d24b-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "03428d8d-f204-45cf-ab61-4a66cca8d24b" (UID: "03428d8d-f204-45cf-ab61-4a66cca8d24b"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 15:21:19.351886 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:19.351860 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03428d8d-f204-45cf-ab61-4a66cca8d24b-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "03428d8d-f204-45cf-ab61-4a66cca8d24b" (UID: "03428d8d-f204-45cf-ab61-4a66cca8d24b"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 15:21:19.352061 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:19.352008 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03428d8d-f204-45cf-ab61-4a66cca8d24b-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "03428d8d-f204-45cf-ab61-4a66cca8d24b" (UID: "03428d8d-f204-45cf-ab61-4a66cca8d24b"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 15:21:19.352061 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:19.352020 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03428d8d-f204-45cf-ab61-4a66cca8d24b-config" (OuterVolumeSpecName: "config") pod "03428d8d-f204-45cf-ab61-4a66cca8d24b" (UID: "03428d8d-f204-45cf-ab61-4a66cca8d24b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 15:21:19.352425 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:19.352406 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03428d8d-f204-45cf-ab61-4a66cca8d24b-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "03428d8d-f204-45cf-ab61-4a66cca8d24b" (UID: "03428d8d-f204-45cf-ab61-4a66cca8d24b"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 15:21:19.352760 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:19.352736 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03428d8d-f204-45cf-ab61-4a66cca8d24b-kube-api-access-lcbr9" (OuterVolumeSpecName: "kube-api-access-lcbr9") pod "03428d8d-f204-45cf-ab61-4a66cca8d24b" (UID: "03428d8d-f204-45cf-ab61-4a66cca8d24b"). InnerVolumeSpecName "kube-api-access-lcbr9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 15:21:19.353072 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:19.353019 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03428d8d-f204-45cf-ab61-4a66cca8d24b-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "03428d8d-f204-45cf-ab61-4a66cca8d24b" (UID: "03428d8d-f204-45cf-ab61-4a66cca8d24b"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 15:21:19.353448 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:19.353428 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03428d8d-f204-45cf-ab61-4a66cca8d24b-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "03428d8d-f204-45cf-ab61-4a66cca8d24b" (UID: "03428d8d-f204-45cf-ab61-4a66cca8d24b"). InnerVolumeSpecName "secret-prometheus-k8s-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 15:21:19.353528 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:19.353471 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03428d8d-f204-45cf-ab61-4a66cca8d24b-config-out" (OuterVolumeSpecName: "config-out") pod "03428d8d-f204-45cf-ab61-4a66cca8d24b" (UID: "03428d8d-f204-45cf-ab61-4a66cca8d24b"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 15:21:19.362749 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:19.362727 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03428d8d-f204-45cf-ab61-4a66cca8d24b-web-config" (OuterVolumeSpecName: "web-config") pod "03428d8d-f204-45cf-ab61-4a66cca8d24b" (UID: "03428d8d-f204-45cf-ab61-4a66cca8d24b"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 15:21:19.449395 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:19.449371 2577 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/03428d8d-f204-45cf-ab61-4a66cca8d24b-prometheus-k8s-db\") on node \"ip-10-0-131-29.ec2.internal\" DevicePath \"\"" Apr 17 15:21:19.449395 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:19.449392 2577 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/03428d8d-f204-45cf-ab61-4a66cca8d24b-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-131-29.ec2.internal\" DevicePath \"\"" Apr 17 15:21:19.449510 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:19.449402 2577 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/03428d8d-f204-45cf-ab61-4a66cca8d24b-thanos-prometheus-http-client-file\") on node \"ip-10-0-131-29.ec2.internal\" DevicePath \"\"" Apr 17 15:21:19.449510 
ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:19.449412 2577 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/03428d8d-f204-45cf-ab61-4a66cca8d24b-secret-metrics-client-certs\") on node \"ip-10-0-131-29.ec2.internal\" DevicePath \"\"" Apr 17 15:21:19.449510 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:19.449420 2577 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/03428d8d-f204-45cf-ab61-4a66cca8d24b-secret-kube-rbac-proxy\") on node \"ip-10-0-131-29.ec2.internal\" DevicePath \"\"" Apr 17 15:21:19.449510 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:19.449429 2577 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/03428d8d-f204-45cf-ab61-4a66cca8d24b-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-131-29.ec2.internal\" DevicePath \"\"" Apr 17 15:21:19.449510 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:19.449438 2577 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03428d8d-f204-45cf-ab61-4a66cca8d24b-prometheus-trusted-ca-bundle\") on node \"ip-10-0-131-29.ec2.internal\" DevicePath \"\"" Apr 17 15:21:19.449510 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:19.449446 2577 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/03428d8d-f204-45cf-ab61-4a66cca8d24b-secret-prometheus-k8s-tls\") on node \"ip-10-0-131-29.ec2.internal\" DevicePath \"\"" Apr 17 15:21:19.449510 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:19.449455 2577 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/03428d8d-f204-45cf-ab61-4a66cca8d24b-secret-grpc-tls\") on node \"ip-10-0-131-29.ec2.internal\" DevicePath \"\"" Apr 17 15:21:19.449510 
ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:19.449463 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lcbr9\" (UniqueName: \"kubernetes.io/projected/03428d8d-f204-45cf-ab61-4a66cca8d24b-kube-api-access-lcbr9\") on node \"ip-10-0-131-29.ec2.internal\" DevicePath \"\"" Apr 17 15:21:19.449510 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:19.449472 2577 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/03428d8d-f204-45cf-ab61-4a66cca8d24b-config-out\") on node \"ip-10-0-131-29.ec2.internal\" DevicePath \"\"" Apr 17 15:21:19.449510 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:19.449480 2577 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/03428d8d-f204-45cf-ab61-4a66cca8d24b-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-131-29.ec2.internal\" DevicePath \"\"" Apr 17 15:21:19.449510 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:19.449488 2577 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/03428d8d-f204-45cf-ab61-4a66cca8d24b-tls-assets\") on node \"ip-10-0-131-29.ec2.internal\" DevicePath \"\"" Apr 17 15:21:19.449510 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:19.449496 2577 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/03428d8d-f204-45cf-ab61-4a66cca8d24b-web-config\") on node \"ip-10-0-131-29.ec2.internal\" DevicePath \"\"" Apr 17 15:21:19.449510 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:19.449505 2577 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03428d8d-f204-45cf-ab61-4a66cca8d24b-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-131-29.ec2.internal\" DevicePath \"\"" Apr 17 15:21:19.449510 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:19.449513 2577 
reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/03428d8d-f204-45cf-ab61-4a66cca8d24b-config\") on node \"ip-10-0-131-29.ec2.internal\" DevicePath \"\"" Apr 17 15:21:20.013906 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.013874 2577 generic.go:358] "Generic (PLEG): container finished" podID="03428d8d-f204-45cf-ab61-4a66cca8d24b" containerID="2a5a38f5a657383a1f508c26b313bfed9546fbb14dbeb7af84cb4042f42c461b" exitCode=0 Apr 17 15:21:20.013906 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.013898 2577 generic.go:358] "Generic (PLEG): container finished" podID="03428d8d-f204-45cf-ab61-4a66cca8d24b" containerID="310d0f4864c2d3eb59e91a51aeb574129cd63668aff0f7a799de8541a6698306" exitCode=0 Apr 17 15:21:20.013906 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.013907 2577 generic.go:358] "Generic (PLEG): container finished" podID="03428d8d-f204-45cf-ab61-4a66cca8d24b" containerID="104ca6f22c328820f09db456d009d1add6a59006ec70e191970fa513f03e8d79" exitCode=0 Apr 17 15:21:20.014304 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.013913 2577 generic.go:358] "Generic (PLEG): container finished" podID="03428d8d-f204-45cf-ab61-4a66cca8d24b" containerID="0af008da12d9a8b7e200f9e76bfc38245c7d013166d7bb84a3a40f52da5e2a3b" exitCode=0 Apr 17 15:21:20.014304 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.013918 2577 generic.go:358] "Generic (PLEG): container finished" podID="03428d8d-f204-45cf-ab61-4a66cca8d24b" containerID="efc4adf40202da1c6c84e147dce922248cf805cea3ea95d9729ff71924977625" exitCode=0 Apr 17 15:21:20.014304 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.013924 2577 generic.go:358] "Generic (PLEG): container finished" podID="03428d8d-f204-45cf-ab61-4a66cca8d24b" containerID="b99ae0c68584750f1ae7a9e6872b78dc538f3579ed6c79ef07162abe0c4fe964" exitCode=0 Apr 17 15:21:20.014304 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.013961 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"03428d8d-f204-45cf-ab61-4a66cca8d24b","Type":"ContainerDied","Data":"2a5a38f5a657383a1f508c26b313bfed9546fbb14dbeb7af84cb4042f42c461b"} Apr 17 15:21:20.014304 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.013982 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:21:20.014304 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.013999 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"03428d8d-f204-45cf-ab61-4a66cca8d24b","Type":"ContainerDied","Data":"310d0f4864c2d3eb59e91a51aeb574129cd63668aff0f7a799de8541a6698306"} Apr 17 15:21:20.014304 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.014016 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"03428d8d-f204-45cf-ab61-4a66cca8d24b","Type":"ContainerDied","Data":"104ca6f22c328820f09db456d009d1add6a59006ec70e191970fa513f03e8d79"} Apr 17 15:21:20.014304 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.014050 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"03428d8d-f204-45cf-ab61-4a66cca8d24b","Type":"ContainerDied","Data":"0af008da12d9a8b7e200f9e76bfc38245c7d013166d7bb84a3a40f52da5e2a3b"} Apr 17 15:21:20.014304 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.014065 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"03428d8d-f204-45cf-ab61-4a66cca8d24b","Type":"ContainerDied","Data":"efc4adf40202da1c6c84e147dce922248cf805cea3ea95d9729ff71924977625"} Apr 17 15:21:20.014304 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.014076 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"03428d8d-f204-45cf-ab61-4a66cca8d24b","Type":"ContainerDied","Data":"b99ae0c68584750f1ae7a9e6872b78dc538f3579ed6c79ef07162abe0c4fe964"} Apr 17 15:21:20.014304 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.014089 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"03428d8d-f204-45cf-ab61-4a66cca8d24b","Type":"ContainerDied","Data":"4c336103e338680e4ab3eb734aa681a4f460a4ade974fcff4c40a814c2ff8133"} Apr 17 15:21:20.014304 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.014073 2577 scope.go:117] "RemoveContainer" containerID="2a5a38f5a657383a1f508c26b313bfed9546fbb14dbeb7af84cb4042f42c461b" Apr 17 15:21:20.023190 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.023173 2577 scope.go:117] "RemoveContainer" containerID="310d0f4864c2d3eb59e91a51aeb574129cd63668aff0f7a799de8541a6698306" Apr 17 15:21:20.030183 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.030162 2577 scope.go:117] "RemoveContainer" containerID="104ca6f22c328820f09db456d009d1add6a59006ec70e191970fa513f03e8d79" Apr 17 15:21:20.037358 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.037279 2577 scope.go:117] "RemoveContainer" containerID="0af008da12d9a8b7e200f9e76bfc38245c7d013166d7bb84a3a40f52da5e2a3b" Apr 17 15:21:20.037666 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.037645 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 15:21:20.042871 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.042848 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 15:21:20.044703 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.044689 2577 scope.go:117] "RemoveContainer" containerID="efc4adf40202da1c6c84e147dce922248cf805cea3ea95d9729ff71924977625" Apr 17 15:21:20.051187 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.051169 2577 scope.go:117] "RemoveContainer" 
containerID="b99ae0c68584750f1ae7a9e6872b78dc538f3579ed6c79ef07162abe0c4fe964" Apr 17 15:21:20.058300 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.058283 2577 scope.go:117] "RemoveContainer" containerID="bc9b5ab5107eab2e24ff48253ba848bebf2d1bb335f0a3fd797c94b67e16d815" Apr 17 15:21:20.061842 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.061821 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03428d8d-f204-45cf-ab61-4a66cca8d24b" path="/var/lib/kubelet/pods/03428d8d-f204-45cf-ab61-4a66cca8d24b/volumes" Apr 17 15:21:20.065094 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.065073 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 15:21:20.065386 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.065357 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="03428d8d-f204-45cf-ab61-4a66cca8d24b" containerName="prometheus" Apr 17 15:21:20.065386 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.065371 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="03428d8d-f204-45cf-ab61-4a66cca8d24b" containerName="prometheus" Apr 17 15:21:20.065386 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.065382 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="03428d8d-f204-45cf-ab61-4a66cca8d24b" containerName="init-config-reloader" Apr 17 15:21:20.065533 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.065388 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="03428d8d-f204-45cf-ab61-4a66cca8d24b" containerName="init-config-reloader" Apr 17 15:21:20.065533 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.065403 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="03428d8d-f204-45cf-ab61-4a66cca8d24b" containerName="kube-rbac-proxy-thanos" Apr 17 15:21:20.065533 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.065408 2577 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="03428d8d-f204-45cf-ab61-4a66cca8d24b" containerName="kube-rbac-proxy-thanos" Apr 17 15:21:20.065533 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.065418 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="03428d8d-f204-45cf-ab61-4a66cca8d24b" containerName="kube-rbac-proxy-web" Apr 17 15:21:20.065533 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.065423 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="03428d8d-f204-45cf-ab61-4a66cca8d24b" containerName="kube-rbac-proxy-web" Apr 17 15:21:20.065533 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.065430 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="03428d8d-f204-45cf-ab61-4a66cca8d24b" containerName="thanos-sidecar" Apr 17 15:21:20.065533 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.065440 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="03428d8d-f204-45cf-ab61-4a66cca8d24b" containerName="thanos-sidecar" Apr 17 15:21:20.065533 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.065450 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="03428d8d-f204-45cf-ab61-4a66cca8d24b" containerName="kube-rbac-proxy" Apr 17 15:21:20.065533 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.065450 2577 scope.go:117] "RemoveContainer" containerID="2a5a38f5a657383a1f508c26b313bfed9546fbb14dbeb7af84cb4042f42c461b" Apr 17 15:21:20.065533 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.065457 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="03428d8d-f204-45cf-ab61-4a66cca8d24b" containerName="kube-rbac-proxy" Apr 17 15:21:20.065533 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.065512 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="03428d8d-f204-45cf-ab61-4a66cca8d24b" containerName="config-reloader" Apr 17 15:21:20.065533 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.065520 2577 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="03428d8d-f204-45cf-ab61-4a66cca8d24b" containerName="config-reloader" Apr 17 15:21:20.065981 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.065601 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="03428d8d-f204-45cf-ab61-4a66cca8d24b" containerName="kube-rbac-proxy" Apr 17 15:21:20.065981 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.065612 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="03428d8d-f204-45cf-ab61-4a66cca8d24b" containerName="thanos-sidecar" Apr 17 15:21:20.065981 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.065621 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="03428d8d-f204-45cf-ab61-4a66cca8d24b" containerName="config-reloader" Apr 17 15:21:20.065981 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.065632 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="03428d8d-f204-45cf-ab61-4a66cca8d24b" containerName="prometheus" Apr 17 15:21:20.065981 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.065641 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="03428d8d-f204-45cf-ab61-4a66cca8d24b" containerName="kube-rbac-proxy-thanos" Apr 17 15:21:20.065981 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.065653 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="03428d8d-f204-45cf-ab61-4a66cca8d24b" containerName="kube-rbac-proxy-web" Apr 17 15:21:20.065981 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:21:20.065696 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a5a38f5a657383a1f508c26b313bfed9546fbb14dbeb7af84cb4042f42c461b\": container with ID starting with 2a5a38f5a657383a1f508c26b313bfed9546fbb14dbeb7af84cb4042f42c461b not found: ID does not exist" containerID="2a5a38f5a657383a1f508c26b313bfed9546fbb14dbeb7af84cb4042f42c461b" Apr 17 15:21:20.065981 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.065718 2577 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a5a38f5a657383a1f508c26b313bfed9546fbb14dbeb7af84cb4042f42c461b"} err="failed to get container status \"2a5a38f5a657383a1f508c26b313bfed9546fbb14dbeb7af84cb4042f42c461b\": rpc error: code = NotFound desc = could not find container \"2a5a38f5a657383a1f508c26b313bfed9546fbb14dbeb7af84cb4042f42c461b\": container with ID starting with 2a5a38f5a657383a1f508c26b313bfed9546fbb14dbeb7af84cb4042f42c461b not found: ID does not exist" Apr 17 15:21:20.065981 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.065750 2577 scope.go:117] "RemoveContainer" containerID="310d0f4864c2d3eb59e91a51aeb574129cd63668aff0f7a799de8541a6698306" Apr 17 15:21:20.066364 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:21:20.066057 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"310d0f4864c2d3eb59e91a51aeb574129cd63668aff0f7a799de8541a6698306\": container with ID starting with 310d0f4864c2d3eb59e91a51aeb574129cd63668aff0f7a799de8541a6698306 not found: ID does not exist" containerID="310d0f4864c2d3eb59e91a51aeb574129cd63668aff0f7a799de8541a6698306" Apr 17 15:21:20.066364 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.066085 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"310d0f4864c2d3eb59e91a51aeb574129cd63668aff0f7a799de8541a6698306"} err="failed to get container status \"310d0f4864c2d3eb59e91a51aeb574129cd63668aff0f7a799de8541a6698306\": rpc error: code = NotFound desc = could not find container \"310d0f4864c2d3eb59e91a51aeb574129cd63668aff0f7a799de8541a6698306\": container with ID starting with 310d0f4864c2d3eb59e91a51aeb574129cd63668aff0f7a799de8541a6698306 not found: ID does not exist" Apr 17 15:21:20.066364 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.066101 2577 scope.go:117] "RemoveContainer" 
containerID="104ca6f22c328820f09db456d009d1add6a59006ec70e191970fa513f03e8d79" Apr 17 15:21:20.066512 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:21:20.066389 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"104ca6f22c328820f09db456d009d1add6a59006ec70e191970fa513f03e8d79\": container with ID starting with 104ca6f22c328820f09db456d009d1add6a59006ec70e191970fa513f03e8d79 not found: ID does not exist" containerID="104ca6f22c328820f09db456d009d1add6a59006ec70e191970fa513f03e8d79" Apr 17 15:21:20.066512 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.066419 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"104ca6f22c328820f09db456d009d1add6a59006ec70e191970fa513f03e8d79"} err="failed to get container status \"104ca6f22c328820f09db456d009d1add6a59006ec70e191970fa513f03e8d79\": rpc error: code = NotFound desc = could not find container \"104ca6f22c328820f09db456d009d1add6a59006ec70e191970fa513f03e8d79\": container with ID starting with 104ca6f22c328820f09db456d009d1add6a59006ec70e191970fa513f03e8d79 not found: ID does not exist" Apr 17 15:21:20.066512 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.066446 2577 scope.go:117] "RemoveContainer" containerID="0af008da12d9a8b7e200f9e76bfc38245c7d013166d7bb84a3a40f52da5e2a3b" Apr 17 15:21:20.066676 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:21:20.066658 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0af008da12d9a8b7e200f9e76bfc38245c7d013166d7bb84a3a40f52da5e2a3b\": container with ID starting with 0af008da12d9a8b7e200f9e76bfc38245c7d013166d7bb84a3a40f52da5e2a3b not found: ID does not exist" containerID="0af008da12d9a8b7e200f9e76bfc38245c7d013166d7bb84a3a40f52da5e2a3b" Apr 17 15:21:20.066721 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.066682 2577 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"0af008da12d9a8b7e200f9e76bfc38245c7d013166d7bb84a3a40f52da5e2a3b"} err="failed to get container status \"0af008da12d9a8b7e200f9e76bfc38245c7d013166d7bb84a3a40f52da5e2a3b\": rpc error: code = NotFound desc = could not find container \"0af008da12d9a8b7e200f9e76bfc38245c7d013166d7bb84a3a40f52da5e2a3b\": container with ID starting with 0af008da12d9a8b7e200f9e76bfc38245c7d013166d7bb84a3a40f52da5e2a3b not found: ID does not exist" Apr 17 15:21:20.066721 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.066697 2577 scope.go:117] "RemoveContainer" containerID="efc4adf40202da1c6c84e147dce922248cf805cea3ea95d9729ff71924977625" Apr 17 15:21:20.066909 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:21:20.066894 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efc4adf40202da1c6c84e147dce922248cf805cea3ea95d9729ff71924977625\": container with ID starting with efc4adf40202da1c6c84e147dce922248cf805cea3ea95d9729ff71924977625 not found: ID does not exist" containerID="efc4adf40202da1c6c84e147dce922248cf805cea3ea95d9729ff71924977625" Apr 17 15:21:20.066952 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.066912 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efc4adf40202da1c6c84e147dce922248cf805cea3ea95d9729ff71924977625"} err="failed to get container status \"efc4adf40202da1c6c84e147dce922248cf805cea3ea95d9729ff71924977625\": rpc error: code = NotFound desc = could not find container \"efc4adf40202da1c6c84e147dce922248cf805cea3ea95d9729ff71924977625\": container with ID starting with efc4adf40202da1c6c84e147dce922248cf805cea3ea95d9729ff71924977625 not found: ID does not exist" Apr 17 15:21:20.066952 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.066926 2577 scope.go:117] "RemoveContainer" containerID="b99ae0c68584750f1ae7a9e6872b78dc538f3579ed6c79ef07162abe0c4fe964" Apr 17 15:21:20.067173 ip-10-0-131-29 
kubenswrapper[2577]: E0417 15:21:20.067157 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b99ae0c68584750f1ae7a9e6872b78dc538f3579ed6c79ef07162abe0c4fe964\": container with ID starting with b99ae0c68584750f1ae7a9e6872b78dc538f3579ed6c79ef07162abe0c4fe964 not found: ID does not exist" containerID="b99ae0c68584750f1ae7a9e6872b78dc538f3579ed6c79ef07162abe0c4fe964" Apr 17 15:21:20.067221 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.067177 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b99ae0c68584750f1ae7a9e6872b78dc538f3579ed6c79ef07162abe0c4fe964"} err="failed to get container status \"b99ae0c68584750f1ae7a9e6872b78dc538f3579ed6c79ef07162abe0c4fe964\": rpc error: code = NotFound desc = could not find container \"b99ae0c68584750f1ae7a9e6872b78dc538f3579ed6c79ef07162abe0c4fe964\": container with ID starting with b99ae0c68584750f1ae7a9e6872b78dc538f3579ed6c79ef07162abe0c4fe964 not found: ID does not exist" Apr 17 15:21:20.067221 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.067191 2577 scope.go:117] "RemoveContainer" containerID="bc9b5ab5107eab2e24ff48253ba848bebf2d1bb335f0a3fd797c94b67e16d815" Apr 17 15:21:20.067459 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:21:20.067444 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc9b5ab5107eab2e24ff48253ba848bebf2d1bb335f0a3fd797c94b67e16d815\": container with ID starting with bc9b5ab5107eab2e24ff48253ba848bebf2d1bb335f0a3fd797c94b67e16d815 not found: ID does not exist" containerID="bc9b5ab5107eab2e24ff48253ba848bebf2d1bb335f0a3fd797c94b67e16d815" Apr 17 15:21:20.067518 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.067461 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc9b5ab5107eab2e24ff48253ba848bebf2d1bb335f0a3fd797c94b67e16d815"} err="failed to 
get container status \"bc9b5ab5107eab2e24ff48253ba848bebf2d1bb335f0a3fd797c94b67e16d815\": rpc error: code = NotFound desc = could not find container \"bc9b5ab5107eab2e24ff48253ba848bebf2d1bb335f0a3fd797c94b67e16d815\": container with ID starting with bc9b5ab5107eab2e24ff48253ba848bebf2d1bb335f0a3fd797c94b67e16d815 not found: ID does not exist" Apr 17 15:21:20.067518 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.067473 2577 scope.go:117] "RemoveContainer" containerID="2a5a38f5a657383a1f508c26b313bfed9546fbb14dbeb7af84cb4042f42c461b" Apr 17 15:21:20.067675 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.067657 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a5a38f5a657383a1f508c26b313bfed9546fbb14dbeb7af84cb4042f42c461b"} err="failed to get container status \"2a5a38f5a657383a1f508c26b313bfed9546fbb14dbeb7af84cb4042f42c461b\": rpc error: code = NotFound desc = could not find container \"2a5a38f5a657383a1f508c26b313bfed9546fbb14dbeb7af84cb4042f42c461b\": container with ID starting with 2a5a38f5a657383a1f508c26b313bfed9546fbb14dbeb7af84cb4042f42c461b not found: ID does not exist" Apr 17 15:21:20.067716 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.067677 2577 scope.go:117] "RemoveContainer" containerID="310d0f4864c2d3eb59e91a51aeb574129cd63668aff0f7a799de8541a6698306" Apr 17 15:21:20.067899 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.067881 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"310d0f4864c2d3eb59e91a51aeb574129cd63668aff0f7a799de8541a6698306"} err="failed to get container status \"310d0f4864c2d3eb59e91a51aeb574129cd63668aff0f7a799de8541a6698306\": rpc error: code = NotFound desc = could not find container \"310d0f4864c2d3eb59e91a51aeb574129cd63668aff0f7a799de8541a6698306\": container with ID starting with 310d0f4864c2d3eb59e91a51aeb574129cd63668aff0f7a799de8541a6698306 not found: ID does not exist" Apr 17 15:21:20.067941 
ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.067899 2577 scope.go:117] "RemoveContainer" containerID="104ca6f22c328820f09db456d009d1add6a59006ec70e191970fa513f03e8d79" Apr 17 15:21:20.068157 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.068135 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"104ca6f22c328820f09db456d009d1add6a59006ec70e191970fa513f03e8d79"} err="failed to get container status \"104ca6f22c328820f09db456d009d1add6a59006ec70e191970fa513f03e8d79\": rpc error: code = NotFound desc = could not find container \"104ca6f22c328820f09db456d009d1add6a59006ec70e191970fa513f03e8d79\": container with ID starting with 104ca6f22c328820f09db456d009d1add6a59006ec70e191970fa513f03e8d79 not found: ID does not exist" Apr 17 15:21:20.068224 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.068159 2577 scope.go:117] "RemoveContainer" containerID="0af008da12d9a8b7e200f9e76bfc38245c7d013166d7bb84a3a40f52da5e2a3b" Apr 17 15:21:20.068421 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.068393 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0af008da12d9a8b7e200f9e76bfc38245c7d013166d7bb84a3a40f52da5e2a3b"} err="failed to get container status \"0af008da12d9a8b7e200f9e76bfc38245c7d013166d7bb84a3a40f52da5e2a3b\": rpc error: code = NotFound desc = could not find container \"0af008da12d9a8b7e200f9e76bfc38245c7d013166d7bb84a3a40f52da5e2a3b\": container with ID starting with 0af008da12d9a8b7e200f9e76bfc38245c7d013166d7bb84a3a40f52da5e2a3b not found: ID does not exist" Apr 17 15:21:20.068467 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.068423 2577 scope.go:117] "RemoveContainer" containerID="efc4adf40202da1c6c84e147dce922248cf805cea3ea95d9729ff71924977625" Apr 17 15:21:20.068645 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.068627 2577 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"efc4adf40202da1c6c84e147dce922248cf805cea3ea95d9729ff71924977625"} err="failed to get container status \"efc4adf40202da1c6c84e147dce922248cf805cea3ea95d9729ff71924977625\": rpc error: code = NotFound desc = could not find container \"efc4adf40202da1c6c84e147dce922248cf805cea3ea95d9729ff71924977625\": container with ID starting with efc4adf40202da1c6c84e147dce922248cf805cea3ea95d9729ff71924977625 not found: ID does not exist" Apr 17 15:21:20.068692 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.068645 2577 scope.go:117] "RemoveContainer" containerID="b99ae0c68584750f1ae7a9e6872b78dc538f3579ed6c79ef07162abe0c4fe964" Apr 17 15:21:20.068846 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.068828 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b99ae0c68584750f1ae7a9e6872b78dc538f3579ed6c79ef07162abe0c4fe964"} err="failed to get container status \"b99ae0c68584750f1ae7a9e6872b78dc538f3579ed6c79ef07162abe0c4fe964\": rpc error: code = NotFound desc = could not find container \"b99ae0c68584750f1ae7a9e6872b78dc538f3579ed6c79ef07162abe0c4fe964\": container with ID starting with b99ae0c68584750f1ae7a9e6872b78dc538f3579ed6c79ef07162abe0c4fe964 not found: ID does not exist" Apr 17 15:21:20.068887 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.068847 2577 scope.go:117] "RemoveContainer" containerID="bc9b5ab5107eab2e24ff48253ba848bebf2d1bb335f0a3fd797c94b67e16d815" Apr 17 15:21:20.069096 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.069077 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc9b5ab5107eab2e24ff48253ba848bebf2d1bb335f0a3fd797c94b67e16d815"} err="failed to get container status \"bc9b5ab5107eab2e24ff48253ba848bebf2d1bb335f0a3fd797c94b67e16d815\": rpc error: code = NotFound desc = could not find container \"bc9b5ab5107eab2e24ff48253ba848bebf2d1bb335f0a3fd797c94b67e16d815\": container with ID starting with 
bc9b5ab5107eab2e24ff48253ba848bebf2d1bb335f0a3fd797c94b67e16d815 not found: ID does not exist"
Apr 17 15:21:20.069161 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.069098 2577 scope.go:117] "RemoveContainer" containerID="2a5a38f5a657383a1f508c26b313bfed9546fbb14dbeb7af84cb4042f42c461b"
Apr 17 15:21:20.069340 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.069323 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a5a38f5a657383a1f508c26b313bfed9546fbb14dbeb7af84cb4042f42c461b"} err="failed to get container status \"2a5a38f5a657383a1f508c26b313bfed9546fbb14dbeb7af84cb4042f42c461b\": rpc error: code = NotFound desc = could not find container \"2a5a38f5a657383a1f508c26b313bfed9546fbb14dbeb7af84cb4042f42c461b\": container with ID starting with 2a5a38f5a657383a1f508c26b313bfed9546fbb14dbeb7af84cb4042f42c461b not found: ID does not exist"
Apr 17 15:21:20.069387 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.069341 2577 scope.go:117] "RemoveContainer" containerID="310d0f4864c2d3eb59e91a51aeb574129cd63668aff0f7a799de8541a6698306"
Apr 17 15:21:20.069564 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.069544 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"310d0f4864c2d3eb59e91a51aeb574129cd63668aff0f7a799de8541a6698306"} err="failed to get container status \"310d0f4864c2d3eb59e91a51aeb574129cd63668aff0f7a799de8541a6698306\": rpc error: code = NotFound desc = could not find container \"310d0f4864c2d3eb59e91a51aeb574129cd63668aff0f7a799de8541a6698306\": container with ID starting with 310d0f4864c2d3eb59e91a51aeb574129cd63668aff0f7a799de8541a6698306 not found: ID does not exist"
Apr 17 15:21:20.069604 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.069564 2577 scope.go:117] "RemoveContainer" containerID="104ca6f22c328820f09db456d009d1add6a59006ec70e191970fa513f03e8d79"
Apr 17 15:21:20.069732 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.069717 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"104ca6f22c328820f09db456d009d1add6a59006ec70e191970fa513f03e8d79"} err="failed to get container status \"104ca6f22c328820f09db456d009d1add6a59006ec70e191970fa513f03e8d79\": rpc error: code = NotFound desc = could not find container \"104ca6f22c328820f09db456d009d1add6a59006ec70e191970fa513f03e8d79\": container with ID starting with 104ca6f22c328820f09db456d009d1add6a59006ec70e191970fa513f03e8d79 not found: ID does not exist"
Apr 17 15:21:20.069793 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.069732 2577 scope.go:117] "RemoveContainer" containerID="0af008da12d9a8b7e200f9e76bfc38245c7d013166d7bb84a3a40f52da5e2a3b"
Apr 17 15:21:20.069971 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.069953 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0af008da12d9a8b7e200f9e76bfc38245c7d013166d7bb84a3a40f52da5e2a3b"} err="failed to get container status \"0af008da12d9a8b7e200f9e76bfc38245c7d013166d7bb84a3a40f52da5e2a3b\": rpc error: code = NotFound desc = could not find container \"0af008da12d9a8b7e200f9e76bfc38245c7d013166d7bb84a3a40f52da5e2a3b\": container with ID starting with 0af008da12d9a8b7e200f9e76bfc38245c7d013166d7bb84a3a40f52da5e2a3b not found: ID does not exist"
Apr 17 15:21:20.070028 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.069971 2577 scope.go:117] "RemoveContainer" containerID="efc4adf40202da1c6c84e147dce922248cf805cea3ea95d9729ff71924977625"
Apr 17 15:21:20.070220 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.070203 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efc4adf40202da1c6c84e147dce922248cf805cea3ea95d9729ff71924977625"} err="failed to get container status \"efc4adf40202da1c6c84e147dce922248cf805cea3ea95d9729ff71924977625\": rpc error: code = NotFound desc = could not find container \"efc4adf40202da1c6c84e147dce922248cf805cea3ea95d9729ff71924977625\": container with ID starting with efc4adf40202da1c6c84e147dce922248cf805cea3ea95d9729ff71924977625 not found: ID does not exist"
Apr 17 15:21:20.070260 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.070221 2577 scope.go:117] "RemoveContainer" containerID="b99ae0c68584750f1ae7a9e6872b78dc538f3579ed6c79ef07162abe0c4fe964"
Apr 17 15:21:20.070456 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.070440 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b99ae0c68584750f1ae7a9e6872b78dc538f3579ed6c79ef07162abe0c4fe964"} err="failed to get container status \"b99ae0c68584750f1ae7a9e6872b78dc538f3579ed6c79ef07162abe0c4fe964\": rpc error: code = NotFound desc = could not find container \"b99ae0c68584750f1ae7a9e6872b78dc538f3579ed6c79ef07162abe0c4fe964\": container with ID starting with b99ae0c68584750f1ae7a9e6872b78dc538f3579ed6c79ef07162abe0c4fe964 not found: ID does not exist"
Apr 17 15:21:20.070508 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.070457 2577 scope.go:117] "RemoveContainer" containerID="bc9b5ab5107eab2e24ff48253ba848bebf2d1bb335f0a3fd797c94b67e16d815"
Apr 17 15:21:20.070683 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.070665 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc9b5ab5107eab2e24ff48253ba848bebf2d1bb335f0a3fd797c94b67e16d815"} err="failed to get container status \"bc9b5ab5107eab2e24ff48253ba848bebf2d1bb335f0a3fd797c94b67e16d815\": rpc error: code = NotFound desc = could not find container \"bc9b5ab5107eab2e24ff48253ba848bebf2d1bb335f0a3fd797c94b67e16d815\": container with ID starting with bc9b5ab5107eab2e24ff48253ba848bebf2d1bb335f0a3fd797c94b67e16d815 not found: ID does not exist"
Apr 17 15:21:20.070683 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.070680 2577 scope.go:117] "RemoveContainer" containerID="2a5a38f5a657383a1f508c26b313bfed9546fbb14dbeb7af84cb4042f42c461b"
Apr 17 15:21:20.070891 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.070875 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a5a38f5a657383a1f508c26b313bfed9546fbb14dbeb7af84cb4042f42c461b"} err="failed to get container status \"2a5a38f5a657383a1f508c26b313bfed9546fbb14dbeb7af84cb4042f42c461b\": rpc error: code = NotFound desc = could not find container \"2a5a38f5a657383a1f508c26b313bfed9546fbb14dbeb7af84cb4042f42c461b\": container with ID starting with 2a5a38f5a657383a1f508c26b313bfed9546fbb14dbeb7af84cb4042f42c461b not found: ID does not exist"
Apr 17 15:21:20.070936 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.070893 2577 scope.go:117] "RemoveContainer" containerID="310d0f4864c2d3eb59e91a51aeb574129cd63668aff0f7a799de8541a6698306"
Apr 17 15:21:20.071119 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.071095 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"310d0f4864c2d3eb59e91a51aeb574129cd63668aff0f7a799de8541a6698306"} err="failed to get container status \"310d0f4864c2d3eb59e91a51aeb574129cd63668aff0f7a799de8541a6698306\": rpc error: code = NotFound desc = could not find container \"310d0f4864c2d3eb59e91a51aeb574129cd63668aff0f7a799de8541a6698306\": container with ID starting with 310d0f4864c2d3eb59e91a51aeb574129cd63668aff0f7a799de8541a6698306 not found: ID does not exist"
Apr 17 15:21:20.071201 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.071119 2577 scope.go:117] "RemoveContainer" containerID="104ca6f22c328820f09db456d009d1add6a59006ec70e191970fa513f03e8d79"
Apr 17 15:21:20.071279 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.071262 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 15:21:20.071335 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.071319 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"104ca6f22c328820f09db456d009d1add6a59006ec70e191970fa513f03e8d79"} err="failed to get container status \"104ca6f22c328820f09db456d009d1add6a59006ec70e191970fa513f03e8d79\": rpc error: code = NotFound desc = could not find container \"104ca6f22c328820f09db456d009d1add6a59006ec70e191970fa513f03e8d79\": container with ID starting with 104ca6f22c328820f09db456d009d1add6a59006ec70e191970fa513f03e8d79 not found: ID does not exist"
Apr 17 15:21:20.071383 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.071336 2577 scope.go:117] "RemoveContainer" containerID="0af008da12d9a8b7e200f9e76bfc38245c7d013166d7bb84a3a40f52da5e2a3b"
Apr 17 15:21:20.071603 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.071578 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0af008da12d9a8b7e200f9e76bfc38245c7d013166d7bb84a3a40f52da5e2a3b"} err="failed to get container status \"0af008da12d9a8b7e200f9e76bfc38245c7d013166d7bb84a3a40f52da5e2a3b\": rpc error: code = NotFound desc = could not find container \"0af008da12d9a8b7e200f9e76bfc38245c7d013166d7bb84a3a40f52da5e2a3b\": container with ID starting with 0af008da12d9a8b7e200f9e76bfc38245c7d013166d7bb84a3a40f52da5e2a3b not found: ID does not exist"
Apr 17 15:21:20.071685 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.071605 2577 scope.go:117] "RemoveContainer" containerID="efc4adf40202da1c6c84e147dce922248cf805cea3ea95d9729ff71924977625"
Apr 17 15:21:20.071867 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.071846 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efc4adf40202da1c6c84e147dce922248cf805cea3ea95d9729ff71924977625"} err="failed to get container status \"efc4adf40202da1c6c84e147dce922248cf805cea3ea95d9729ff71924977625\": rpc error: code = NotFound desc = could not find container \"efc4adf40202da1c6c84e147dce922248cf805cea3ea95d9729ff71924977625\": container with ID starting with efc4adf40202da1c6c84e147dce922248cf805cea3ea95d9729ff71924977625 not found: ID does not exist"
Apr 17 15:21:20.071955 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.071868 2577 scope.go:117] "RemoveContainer" containerID="b99ae0c68584750f1ae7a9e6872b78dc538f3579ed6c79ef07162abe0c4fe964"
Apr 17 15:21:20.072186 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.072159 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b99ae0c68584750f1ae7a9e6872b78dc538f3579ed6c79ef07162abe0c4fe964"} err="failed to get container status \"b99ae0c68584750f1ae7a9e6872b78dc538f3579ed6c79ef07162abe0c4fe964\": rpc error: code = NotFound desc = could not find container \"b99ae0c68584750f1ae7a9e6872b78dc538f3579ed6c79ef07162abe0c4fe964\": container with ID starting with b99ae0c68584750f1ae7a9e6872b78dc538f3579ed6c79ef07162abe0c4fe964 not found: ID does not exist"
Apr 17 15:21:20.072186 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.072180 2577 scope.go:117] "RemoveContainer" containerID="bc9b5ab5107eab2e24ff48253ba848bebf2d1bb335f0a3fd797c94b67e16d815"
Apr 17 15:21:20.072412 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.072392 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc9b5ab5107eab2e24ff48253ba848bebf2d1bb335f0a3fd797c94b67e16d815"} err="failed to get container status \"bc9b5ab5107eab2e24ff48253ba848bebf2d1bb335f0a3fd797c94b67e16d815\": rpc error: code = NotFound desc = could not find container \"bc9b5ab5107eab2e24ff48253ba848bebf2d1bb335f0a3fd797c94b67e16d815\": container with ID starting with bc9b5ab5107eab2e24ff48253ba848bebf2d1bb335f0a3fd797c94b67e16d815 not found: ID does not exist"
Apr 17 15:21:20.072412 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.072411 2577 scope.go:117] "RemoveContainer" containerID="2a5a38f5a657383a1f508c26b313bfed9546fbb14dbeb7af84cb4042f42c461b"
Apr 17 15:21:20.072668 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.072648 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a5a38f5a657383a1f508c26b313bfed9546fbb14dbeb7af84cb4042f42c461b"} err="failed to get container status \"2a5a38f5a657383a1f508c26b313bfed9546fbb14dbeb7af84cb4042f42c461b\": rpc error: code = NotFound desc = could not find container \"2a5a38f5a657383a1f508c26b313bfed9546fbb14dbeb7af84cb4042f42c461b\": container with ID starting with 2a5a38f5a657383a1f508c26b313bfed9546fbb14dbeb7af84cb4042f42c461b not found: ID does not exist"
Apr 17 15:21:20.072668 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.072668 2577 scope.go:117] "RemoveContainer" containerID="310d0f4864c2d3eb59e91a51aeb574129cd63668aff0f7a799de8541a6698306"
Apr 17 15:21:20.072926 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.072910 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"310d0f4864c2d3eb59e91a51aeb574129cd63668aff0f7a799de8541a6698306"} err="failed to get container status \"310d0f4864c2d3eb59e91a51aeb574129cd63668aff0f7a799de8541a6698306\": rpc error: code = NotFound desc = could not find container \"310d0f4864c2d3eb59e91a51aeb574129cd63668aff0f7a799de8541a6698306\": container with ID starting with 310d0f4864c2d3eb59e91a51aeb574129cd63668aff0f7a799de8541a6698306 not found: ID does not exist"
Apr 17 15:21:20.073001 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.072928 2577 scope.go:117] "RemoveContainer" containerID="104ca6f22c328820f09db456d009d1add6a59006ec70e191970fa513f03e8d79"
Apr 17 15:21:20.073169 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.073149 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"104ca6f22c328820f09db456d009d1add6a59006ec70e191970fa513f03e8d79"} err="failed to get container status \"104ca6f22c328820f09db456d009d1add6a59006ec70e191970fa513f03e8d79\": rpc error: code = NotFound desc = could not find container \"104ca6f22c328820f09db456d009d1add6a59006ec70e191970fa513f03e8d79\": container with ID starting with 104ca6f22c328820f09db456d009d1add6a59006ec70e191970fa513f03e8d79 not found: ID does not exist"
Apr 17 15:21:20.073229 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.073170 2577 scope.go:117] "RemoveContainer" containerID="0af008da12d9a8b7e200f9e76bfc38245c7d013166d7bb84a3a40f52da5e2a3b"
Apr 17 15:21:20.073427 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.073404 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0af008da12d9a8b7e200f9e76bfc38245c7d013166d7bb84a3a40f52da5e2a3b"} err="failed to get container status \"0af008da12d9a8b7e200f9e76bfc38245c7d013166d7bb84a3a40f52da5e2a3b\": rpc error: code = NotFound desc = could not find container \"0af008da12d9a8b7e200f9e76bfc38245c7d013166d7bb84a3a40f52da5e2a3b\": container with ID starting with 0af008da12d9a8b7e200f9e76bfc38245c7d013166d7bb84a3a40f52da5e2a3b not found: ID does not exist"
Apr 17 15:21:20.073427 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.073426 2577 scope.go:117] "RemoveContainer" containerID="efc4adf40202da1c6c84e147dce922248cf805cea3ea95d9729ff71924977625"
Apr 17 15:21:20.073676 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.073572 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 17 15:21:20.073676 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.073651 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 17 15:21:20.073816 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.073695 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 17 15:21:20.073816 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.073709 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efc4adf40202da1c6c84e147dce922248cf805cea3ea95d9729ff71924977625"} err="failed to get container status \"efc4adf40202da1c6c84e147dce922248cf805cea3ea95d9729ff71924977625\": rpc error: code = NotFound desc = could not find container \"efc4adf40202da1c6c84e147dce922248cf805cea3ea95d9729ff71924977625\": container with ID starting with efc4adf40202da1c6c84e147dce922248cf805cea3ea95d9729ff71924977625 not found: ID does not exist"
Apr 17 15:21:20.073816 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.073743 2577 scope.go:117] "RemoveContainer" containerID="b99ae0c68584750f1ae7a9e6872b78dc538f3579ed6c79ef07162abe0c4fe964"
Apr 17 15:21:20.073816 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.073653 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 17 15:21:20.074122 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.074087 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b99ae0c68584750f1ae7a9e6872b78dc538f3579ed6c79ef07162abe0c4fe964"} err="failed to get container status \"b99ae0c68584750f1ae7a9e6872b78dc538f3579ed6c79ef07162abe0c4fe964\": rpc error: code = NotFound desc = could not find container \"b99ae0c68584750f1ae7a9e6872b78dc538f3579ed6c79ef07162abe0c4fe964\": container with ID starting with b99ae0c68584750f1ae7a9e6872b78dc538f3579ed6c79ef07162abe0c4fe964 not found: ID does not exist"
Apr 17 15:21:20.074122 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.074116 2577 scope.go:117] "RemoveContainer" containerID="bc9b5ab5107eab2e24ff48253ba848bebf2d1bb335f0a3fd797c94b67e16d815"
Apr 17 15:21:20.074272 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.074207 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-957poubn82coc\""
Apr 17 15:21:20.074272 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.074219 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 17 15:21:20.074377 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.074326 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-8cwzt\""
Apr 17 15:21:20.074377 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.074333 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 17 15:21:20.074377 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.074350 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc9b5ab5107eab2e24ff48253ba848bebf2d1bb335f0a3fd797c94b67e16d815"} err="failed to get container status \"bc9b5ab5107eab2e24ff48253ba848bebf2d1bb335f0a3fd797c94b67e16d815\": rpc error: code = NotFound desc = could not find container \"bc9b5ab5107eab2e24ff48253ba848bebf2d1bb335f0a3fd797c94b67e16d815\": container with ID starting with bc9b5ab5107eab2e24ff48253ba848bebf2d1bb335f0a3fd797c94b67e16d815 not found: ID does not exist"
Apr 17 15:21:20.074377 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.074370 2577 scope.go:117] "RemoveContainer" containerID="2a5a38f5a657383a1f508c26b313bfed9546fbb14dbeb7af84cb4042f42c461b"
Apr 17 15:21:20.074578 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.074389 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 17 15:21:20.074578 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.074430 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 17 15:21:20.074578 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.074496 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 17 15:21:20.074760 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.074742 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 17 15:21:20.074874 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.074849 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a5a38f5a657383a1f508c26b313bfed9546fbb14dbeb7af84cb4042f42c461b"} err="failed to get container status \"2a5a38f5a657383a1f508c26b313bfed9546fbb14dbeb7af84cb4042f42c461b\": rpc error: code = NotFound desc = could not find container \"2a5a38f5a657383a1f508c26b313bfed9546fbb14dbeb7af84cb4042f42c461b\": container with ID starting with 2a5a38f5a657383a1f508c26b313bfed9546fbb14dbeb7af84cb4042f42c461b not found: ID does not exist"
Apr 17 15:21:20.074926 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.074877 2577 scope.go:117] "RemoveContainer" containerID="310d0f4864c2d3eb59e91a51aeb574129cd63668aff0f7a799de8541a6698306"
Apr 17 15:21:20.075236 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.075192 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"310d0f4864c2d3eb59e91a51aeb574129cd63668aff0f7a799de8541a6698306"} err="failed to get container status \"310d0f4864c2d3eb59e91a51aeb574129cd63668aff0f7a799de8541a6698306\": rpc error: code = NotFound desc = could not find container \"310d0f4864c2d3eb59e91a51aeb574129cd63668aff0f7a799de8541a6698306\": container with ID starting with 310d0f4864c2d3eb59e91a51aeb574129cd63668aff0f7a799de8541a6698306 not found: ID does not exist"
Apr 17 15:21:20.075311 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.075238 2577 scope.go:117] "RemoveContainer" containerID="104ca6f22c328820f09db456d009d1add6a59006ec70e191970fa513f03e8d79"
Apr 17 15:21:20.075521 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.075501 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"104ca6f22c328820f09db456d009d1add6a59006ec70e191970fa513f03e8d79"} err="failed to get container status \"104ca6f22c328820f09db456d009d1add6a59006ec70e191970fa513f03e8d79\": rpc error: code = NotFound desc = could not find container \"104ca6f22c328820f09db456d009d1add6a59006ec70e191970fa513f03e8d79\": container with ID starting with 104ca6f22c328820f09db456d009d1add6a59006ec70e191970fa513f03e8d79 not found: ID does not exist"
Apr 17 15:21:20.075594 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.075522 2577 scope.go:117] "RemoveContainer" containerID="0af008da12d9a8b7e200f9e76bfc38245c7d013166d7bb84a3a40f52da5e2a3b"
Apr 17 15:21:20.075861 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.075836 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0af008da12d9a8b7e200f9e76bfc38245c7d013166d7bb84a3a40f52da5e2a3b"} err="failed to get container status \"0af008da12d9a8b7e200f9e76bfc38245c7d013166d7bb84a3a40f52da5e2a3b\": rpc error: code = NotFound desc = could not find container \"0af008da12d9a8b7e200f9e76bfc38245c7d013166d7bb84a3a40f52da5e2a3b\": container with ID starting with 0af008da12d9a8b7e200f9e76bfc38245c7d013166d7bb84a3a40f52da5e2a3b not found: ID does not exist"
Apr 17 15:21:20.075861 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.075860 2577 scope.go:117] "RemoveContainer" containerID="efc4adf40202da1c6c84e147dce922248cf805cea3ea95d9729ff71924977625"
Apr 17 15:21:20.076142 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.076116 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efc4adf40202da1c6c84e147dce922248cf805cea3ea95d9729ff71924977625"} err="failed to get container status \"efc4adf40202da1c6c84e147dce922248cf805cea3ea95d9729ff71924977625\": rpc error: code = NotFound desc = could not find container \"efc4adf40202da1c6c84e147dce922248cf805cea3ea95d9729ff71924977625\": container with ID starting with efc4adf40202da1c6c84e147dce922248cf805cea3ea95d9729ff71924977625 not found: ID does not exist"
Apr 17 15:21:20.076228 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.076143 2577 scope.go:117] "RemoveContainer" containerID="b99ae0c68584750f1ae7a9e6872b78dc538f3579ed6c79ef07162abe0c4fe964"
Apr 17 15:21:20.085712 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.084762 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b99ae0c68584750f1ae7a9e6872b78dc538f3579ed6c79ef07162abe0c4fe964"} err="failed to get container status \"b99ae0c68584750f1ae7a9e6872b78dc538f3579ed6c79ef07162abe0c4fe964\": rpc error: code = NotFound desc = could not find container \"b99ae0c68584750f1ae7a9e6872b78dc538f3579ed6c79ef07162abe0c4fe964\": container with ID starting with b99ae0c68584750f1ae7a9e6872b78dc538f3579ed6c79ef07162abe0c4fe964 not found: ID does not exist"
Apr 17 15:21:20.085832 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.085718 2577 scope.go:117] "RemoveContainer" containerID="bc9b5ab5107eab2e24ff48253ba848bebf2d1bb335f0a3fd797c94b67e16d815"
Apr 17 15:21:20.086249 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.086225 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 17 15:21:20.086348 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.086249 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 15:21:20.086445 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.086417 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc9b5ab5107eab2e24ff48253ba848bebf2d1bb335f0a3fd797c94b67e16d815"} err="failed to get container status \"bc9b5ab5107eab2e24ff48253ba848bebf2d1bb335f0a3fd797c94b67e16d815\": rpc error: code = NotFound desc = could not find container \"bc9b5ab5107eab2e24ff48253ba848bebf2d1bb335f0a3fd797c94b67e16d815\": container with ID starting with bc9b5ab5107eab2e24ff48253ba848bebf2d1bb335f0a3fd797c94b67e16d815 not found: ID does not exist"
Apr 17 15:21:20.088244 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.088221 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 17 15:21:20.155364 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.155336 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8258b17e-3459-46de-aaac-5f443f4b05c0-config-out\") pod \"prometheus-k8s-0\" (UID: \"8258b17e-3459-46de-aaac-5f443f4b05c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 15:21:20.155494 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.155368 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8258b17e-3459-46de-aaac-5f443f4b05c0-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"8258b17e-3459-46de-aaac-5f443f4b05c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 15:21:20.155494 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.155393 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8258b17e-3459-46de-aaac-5f443f4b05c0-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"8258b17e-3459-46de-aaac-5f443f4b05c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 15:21:20.155494 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.155455 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8258b17e-3459-46de-aaac-5f443f4b05c0-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"8258b17e-3459-46de-aaac-5f443f4b05c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 15:21:20.155669 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.155501 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8258b17e-3459-46de-aaac-5f443f4b05c0-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"8258b17e-3459-46de-aaac-5f443f4b05c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 15:21:20.155669 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.155532 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8258b17e-3459-46de-aaac-5f443f4b05c0-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"8258b17e-3459-46de-aaac-5f443f4b05c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 15:21:20.155669 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.155557 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8258b17e-3459-46de-aaac-5f443f4b05c0-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8258b17e-3459-46de-aaac-5f443f4b05c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 15:21:20.155669 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.155605 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/8258b17e-3459-46de-aaac-5f443f4b05c0-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"8258b17e-3459-46de-aaac-5f443f4b05c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 15:21:20.155669 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.155633 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/8258b17e-3459-46de-aaac-5f443f4b05c0-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"8258b17e-3459-46de-aaac-5f443f4b05c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 15:21:20.155669 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.155664 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8258b17e-3459-46de-aaac-5f443f4b05c0-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8258b17e-3459-46de-aaac-5f443f4b05c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 15:21:20.155980 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.155715 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8258b17e-3459-46de-aaac-5f443f4b05c0-config\") pod \"prometheus-k8s-0\" (UID: \"8258b17e-3459-46de-aaac-5f443f4b05c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 15:21:20.155980 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.155737 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/8258b17e-3459-46de-aaac-5f443f4b05c0-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"8258b17e-3459-46de-aaac-5f443f4b05c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 15:21:20.155980 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.155782 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5zdm\" (UniqueName: \"kubernetes.io/projected/8258b17e-3459-46de-aaac-5f443f4b05c0-kube-api-access-m5zdm\") pod \"prometheus-k8s-0\" (UID: \"8258b17e-3459-46de-aaac-5f443f4b05c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 15:21:20.155980 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.155814 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/8258b17e-3459-46de-aaac-5f443f4b05c0-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"8258b17e-3459-46de-aaac-5f443f4b05c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 15:21:20.155980 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.155934 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8258b17e-3459-46de-aaac-5f443f4b05c0-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"8258b17e-3459-46de-aaac-5f443f4b05c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 15:21:20.155980 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.155959 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8258b17e-3459-46de-aaac-5f443f4b05c0-web-config\") pod \"prometheus-k8s-0\" (UID: \"8258b17e-3459-46de-aaac-5f443f4b05c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 15:21:20.155980 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.155979 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8258b17e-3459-46de-aaac-5f443f4b05c0-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"8258b17e-3459-46de-aaac-5f443f4b05c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 15:21:20.156248 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.156057 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8258b17e-3459-46de-aaac-5f443f4b05c0-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8258b17e-3459-46de-aaac-5f443f4b05c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 15:21:20.256724 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.256630 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8258b17e-3459-46de-aaac-5f443f4b05c0-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"8258b17e-3459-46de-aaac-5f443f4b05c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 15:21:20.256724 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.256674 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8258b17e-3459-46de-aaac-5f443f4b05c0-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"8258b17e-3459-46de-aaac-5f443f4b05c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 15:21:20.256724 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.256694 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8258b17e-3459-46de-aaac-5f443f4b05c0-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"8258b17e-3459-46de-aaac-5f443f4b05c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 15:21:20.256724 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.256719 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8258b17e-3459-46de-aaac-5f443f4b05c0-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8258b17e-3459-46de-aaac-5f443f4b05c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 15:21:20.257076 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.256749 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/8258b17e-3459-46de-aaac-5f443f4b05c0-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"8258b17e-3459-46de-aaac-5f443f4b05c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 15:21:20.257076 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.256776 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/8258b17e-3459-46de-aaac-5f443f4b05c0-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"8258b17e-3459-46de-aaac-5f443f4b05c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 15:21:20.257076 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.256806 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8258b17e-3459-46de-aaac-5f443f4b05c0-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8258b17e-3459-46de-aaac-5f443f4b05c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 15:21:20.257076 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.256838 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8258b17e-3459-46de-aaac-5f443f4b05c0-config\") pod \"prometheus-k8s-0\" (UID: \"8258b17e-3459-46de-aaac-5f443f4b05c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 15:21:20.257076 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.256862 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/8258b17e-3459-46de-aaac-5f443f4b05c0-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"8258b17e-3459-46de-aaac-5f443f4b05c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 15:21:20.257076 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.256896 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m5zdm\" (UniqueName: \"kubernetes.io/projected/8258b17e-3459-46de-aaac-5f443f4b05c0-kube-api-access-m5zdm\") pod \"prometheus-k8s-0\" (UID: \"8258b17e-3459-46de-aaac-5f443f4b05c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 15:21:20.257076 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.256936 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/8258b17e-3459-46de-aaac-5f443f4b05c0-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"8258b17e-3459-46de-aaac-5f443f4b05c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 15:21:20.257076 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.256988 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8258b17e-3459-46de-aaac-5f443f4b05c0-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"8258b17e-3459-46de-aaac-5f443f4b05c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 15:21:20.257076 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.257015 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8258b17e-3459-46de-aaac-5f443f4b05c0-web-config\") pod \"prometheus-k8s-0\" (UID: \"8258b17e-3459-46de-aaac-5f443f4b05c0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 15:21:20.257076 ip-10-0-131-29 kubenswrapper[2577]:
I0417 15:21:20.257073 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8258b17e-3459-46de-aaac-5f443f4b05c0-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"8258b17e-3459-46de-aaac-5f443f4b05c0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:21:20.257561 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.257105 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8258b17e-3459-46de-aaac-5f443f4b05c0-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8258b17e-3459-46de-aaac-5f443f4b05c0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:21:20.257561 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.257142 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8258b17e-3459-46de-aaac-5f443f4b05c0-config-out\") pod \"prometheus-k8s-0\" (UID: \"8258b17e-3459-46de-aaac-5f443f4b05c0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:21:20.257561 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.257173 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8258b17e-3459-46de-aaac-5f443f4b05c0-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"8258b17e-3459-46de-aaac-5f443f4b05c0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:21:20.257561 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.257219 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8258b17e-3459-46de-aaac-5f443f4b05c0-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"8258b17e-3459-46de-aaac-5f443f4b05c0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 
15:21:20.257922 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.257630 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8258b17e-3459-46de-aaac-5f443f4b05c0-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"8258b17e-3459-46de-aaac-5f443f4b05c0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:21:20.257922 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.257694 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8258b17e-3459-46de-aaac-5f443f4b05c0-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8258b17e-3459-46de-aaac-5f443f4b05c0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:21:20.258027 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.257988 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/8258b17e-3459-46de-aaac-5f443f4b05c0-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"8258b17e-3459-46de-aaac-5f443f4b05c0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:21:20.258689 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.258309 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8258b17e-3459-46de-aaac-5f443f4b05c0-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8258b17e-3459-46de-aaac-5f443f4b05c0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:21:20.258689 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.258658 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8258b17e-3459-46de-aaac-5f443f4b05c0-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"8258b17e-3459-46de-aaac-5f443f4b05c0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:21:20.260442 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.260414 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8258b17e-3459-46de-aaac-5f443f4b05c0-web-config\") pod \"prometheus-k8s-0\" (UID: \"8258b17e-3459-46de-aaac-5f443f4b05c0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:21:20.260551 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.260447 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8258b17e-3459-46de-aaac-5f443f4b05c0-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"8258b17e-3459-46de-aaac-5f443f4b05c0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:21:20.260927 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.260902 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8258b17e-3459-46de-aaac-5f443f4b05c0-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"8258b17e-3459-46de-aaac-5f443f4b05c0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:21:20.261017 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.260905 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/8258b17e-3459-46de-aaac-5f443f4b05c0-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"8258b17e-3459-46de-aaac-5f443f4b05c0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:21:20.261100 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.261023 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8258b17e-3459-46de-aaac-5f443f4b05c0-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: 
\"8258b17e-3459-46de-aaac-5f443f4b05c0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:21:20.261257 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.261233 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8258b17e-3459-46de-aaac-5f443f4b05c0-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"8258b17e-3459-46de-aaac-5f443f4b05c0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:21:20.262248 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.262221 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8258b17e-3459-46de-aaac-5f443f4b05c0-config\") pod \"prometheus-k8s-0\" (UID: \"8258b17e-3459-46de-aaac-5f443f4b05c0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:21:20.262473 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.262418 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/8258b17e-3459-46de-aaac-5f443f4b05c0-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"8258b17e-3459-46de-aaac-5f443f4b05c0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:21:20.263202 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.263181 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8258b17e-3459-46de-aaac-5f443f4b05c0-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"8258b17e-3459-46de-aaac-5f443f4b05c0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:21:20.263448 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.263424 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8258b17e-3459-46de-aaac-5f443f4b05c0-config-out\") pod \"prometheus-k8s-0\" (UID: 
\"8258b17e-3459-46de-aaac-5f443f4b05c0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:21:20.263529 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.263461 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8258b17e-3459-46de-aaac-5f443f4b05c0-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"8258b17e-3459-46de-aaac-5f443f4b05c0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:21:20.264112 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.264091 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/8258b17e-3459-46de-aaac-5f443f4b05c0-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"8258b17e-3459-46de-aaac-5f443f4b05c0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:21:20.267189 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.267171 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5zdm\" (UniqueName: \"kubernetes.io/projected/8258b17e-3459-46de-aaac-5f443f4b05c0-kube-api-access-m5zdm\") pod \"prometheus-k8s-0\" (UID: \"8258b17e-3459-46de-aaac-5f443f4b05c0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:21:20.384502 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.384460 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:21:20.511578 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:20.511502 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 15:21:20.514673 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:21:20.514648 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8258b17e_3459_46de_aaac_5f443f4b05c0.slice/crio-1c250cf0abba7808a91bec681e77485d72dab5e5a9750d8cabe462ad15ccae77 WatchSource:0}: Error finding container 1c250cf0abba7808a91bec681e77485d72dab5e5a9750d8cabe462ad15ccae77: Status 404 returned error can't find the container with id 1c250cf0abba7808a91bec681e77485d72dab5e5a9750d8cabe462ad15ccae77 Apr 17 15:21:21.019138 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:21.019105 2577 generic.go:358] "Generic (PLEG): container finished" podID="8258b17e-3459-46de-aaac-5f443f4b05c0" containerID="cf322d934c414c64303805d917ea3b8c7e551df5de9209a7b9e7547471333131" exitCode=0 Apr 17 15:21:21.019490 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:21.019143 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8258b17e-3459-46de-aaac-5f443f4b05c0","Type":"ContainerDied","Data":"cf322d934c414c64303805d917ea3b8c7e551df5de9209a7b9e7547471333131"} Apr 17 15:21:21.019490 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:21.019163 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8258b17e-3459-46de-aaac-5f443f4b05c0","Type":"ContainerStarted","Data":"1c250cf0abba7808a91bec681e77485d72dab5e5a9750d8cabe462ad15ccae77"} Apr 17 15:21:22.027512 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:22.027471 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"8258b17e-3459-46de-aaac-5f443f4b05c0","Type":"ContainerStarted","Data":"d120b827f8a7a0a3b5e6a152371f47dc6cdbdf59d5bcb1a7485647fa1bdb1114"} Apr 17 15:21:22.027512 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:22.027516 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8258b17e-3459-46de-aaac-5f443f4b05c0","Type":"ContainerStarted","Data":"7aafd302c09d0d419cd7b1f8cdb445f08344dc82f989b18b62d7f0780a6230b2"} Apr 17 15:21:22.027942 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:22.027530 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8258b17e-3459-46de-aaac-5f443f4b05c0","Type":"ContainerStarted","Data":"41fdd1801aaa7e19938ee342a723e0b57461af74266fceac9673d5edb3b24d2c"} Apr 17 15:21:22.027942 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:22.027543 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8258b17e-3459-46de-aaac-5f443f4b05c0","Type":"ContainerStarted","Data":"27a54edd2b484ef7f90903ce58f1f4e5cebb001349d65b8be4e2d2ee2d040304"} Apr 17 15:21:22.027942 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:22.027555 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8258b17e-3459-46de-aaac-5f443f4b05c0","Type":"ContainerStarted","Data":"b3f5bb6fd509e3c45d6dd610e429803aa44eab9b71d4edc8b6dbcb36b202d4a3"} Apr 17 15:21:22.027942 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:22.027567 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8258b17e-3459-46de-aaac-5f443f4b05c0","Type":"ContainerStarted","Data":"081bb3bd8bf746104bdaae2ea3cfaf9b3fa052c18eba0554ddfb5e97ecc23140"} Apr 17 15:21:22.055308 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:22.055251 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.05523368 podStartE2EDuration="2.05523368s" podCreationTimestamp="2026-04-17 15:21:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 15:21:22.052981552 +0000 UTC m=+246.594646421" watchObservedRunningTime="2026-04-17 15:21:22.05523368 +0000 UTC m=+246.596898546" Apr 17 15:21:25.385544 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:25.385486 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:21:26.818596 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:26.818548 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4445020e-d73c-4a2d-9f40-1c3fc286490e-metrics-certs\") pod \"network-metrics-daemon-j7zl6\" (UID: \"4445020e-d73c-4a2d-9f40-1c3fc286490e\") " pod="openshift-multus/network-metrics-daemon-j7zl6" Apr 17 15:21:26.821128 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:26.821085 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4445020e-d73c-4a2d-9f40-1c3fc286490e-metrics-certs\") pod \"network-metrics-daemon-j7zl6\" (UID: \"4445020e-d73c-4a2d-9f40-1c3fc286490e\") " pod="openshift-multus/network-metrics-daemon-j7zl6" Apr 17 15:21:26.860508 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:26.860473 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-9m884\"" Apr 17 15:21:26.868496 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:26.868463 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-j7zl6" Apr 17 15:21:27.017397 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:27.017361 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-j7zl6"] Apr 17 15:21:27.020166 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:21:27.020136 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4445020e_d73c_4a2d_9f40_1c3fc286490e.slice/crio-3b839d72273be007f2870300fc16b8a73020dfe9cb28d5a475edbcad23ee37e4 WatchSource:0}: Error finding container 3b839d72273be007f2870300fc16b8a73020dfe9cb28d5a475edbcad23ee37e4: Status 404 returned error can't find the container with id 3b839d72273be007f2870300fc16b8a73020dfe9cb28d5a475edbcad23ee37e4 Apr 17 15:21:27.048837 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:27.048802 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-j7zl6" event={"ID":"4445020e-d73c-4a2d-9f40-1c3fc286490e","Type":"ContainerStarted","Data":"3b839d72273be007f2870300fc16b8a73020dfe9cb28d5a475edbcad23ee37e4"} Apr 17 15:21:29.057587 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:29.057552 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-j7zl6" event={"ID":"4445020e-d73c-4a2d-9f40-1c3fc286490e","Type":"ContainerStarted","Data":"0f236ae965c47867801f60a2a58a5e745490e8d7d6c27c68d60d7fd9791e1958"} Apr 17 15:21:29.057587 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:29.057591 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-j7zl6" event={"ID":"4445020e-d73c-4a2d-9f40-1c3fc286490e","Type":"ContainerStarted","Data":"531edcae774c70828e3b8003e19d7a0cfbd281d3b212cd0f0b27fcc08eb96c3b"} Apr 17 15:21:29.073882 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:21:29.073830 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-j7zl6" podStartSLOduration=251.793384449 podStartE2EDuration="4m13.073814749s" podCreationTimestamp="2026-04-17 15:17:16 +0000 UTC" firstStartedPulling="2026-04-17 15:21:27.021834138 +0000 UTC m=+251.563498983" lastFinishedPulling="2026-04-17 15:21:28.302264433 +0000 UTC m=+252.843929283" observedRunningTime="2026-04-17 15:21:29.071646944 +0000 UTC m=+253.613311810" watchObservedRunningTime="2026-04-17 15:21:29.073814749 +0000 UTC m=+253.615479616" Apr 17 15:22:15.932286 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:22:15.932267 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fdxgz_85072088-8af2-4219-80f7-6a18460c13cf/console-operator/2.log" Apr 17 15:22:15.935349 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:22:15.935324 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fdxgz_85072088-8af2-4219-80f7-6a18460c13cf/console-operator/2.log" Apr 17 15:22:15.938927 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:22:15.938906 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6m42k_41a2e9d0-bfbe-47d5-9ccd-610cb5204675/ovn-acl-logging/0.log" Apr 17 15:22:15.941894 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:22:15.941876 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6m42k_41a2e9d0-bfbe-47d5-9ccd-610cb5204675/ovn-acl-logging/0.log" Apr 17 15:22:15.942653 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:22:15.942635 2577 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 15:22:20.385560 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:22:20.385526 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:22:20.403252 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:22:20.403229 2577 kubelet.go:2658] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:22:21.230784 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:22:21.230756 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 15:23:30.899717 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:30.899634 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-c6gpg"] Apr 17 15:23:30.903283 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:30.903262 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-c6gpg" Apr 17 15:23:30.907026 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:30.906841 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 17 15:23:30.907026 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:30.906874 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 17 15:23:30.907026 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:30.906888 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 17 15:23:30.907026 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:30.906893 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 17 15:23:30.907026 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:30.906874 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-kvnp8\"" Apr 17 15:23:30.918605 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:30.918582 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-c6gpg"] Apr 17 15:23:31.089092 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:31.089066 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/79a0d695-e84d-4423-ba3a-7376aa132e46-webhook-cert\") pod \"opendatahub-operator-controller-manager-9bd7bdf77-c6gpg\" (UID: \"79a0d695-e84d-4423-ba3a-7376aa132e46\") " pod="opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-c6gpg" Apr 17 15:23:31.089197 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:31.089116 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/79a0d695-e84d-4423-ba3a-7376aa132e46-apiservice-cert\") pod \"opendatahub-operator-controller-manager-9bd7bdf77-c6gpg\" (UID: \"79a0d695-e84d-4423-ba3a-7376aa132e46\") " pod="opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-c6gpg" Apr 17 15:23:31.089197 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:31.089172 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn4gg\" (UniqueName: \"kubernetes.io/projected/79a0d695-e84d-4423-ba3a-7376aa132e46-kube-api-access-wn4gg\") pod \"opendatahub-operator-controller-manager-9bd7bdf77-c6gpg\" (UID: \"79a0d695-e84d-4423-ba3a-7376aa132e46\") " pod="opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-c6gpg" Apr 17 15:23:31.189644 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:31.189589 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/79a0d695-e84d-4423-ba3a-7376aa132e46-apiservice-cert\") pod \"opendatahub-operator-controller-manager-9bd7bdf77-c6gpg\" (UID: \"79a0d695-e84d-4423-ba3a-7376aa132e46\") " pod="opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-c6gpg" Apr 17 
15:23:31.189644 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:31.189635 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wn4gg\" (UniqueName: \"kubernetes.io/projected/79a0d695-e84d-4423-ba3a-7376aa132e46-kube-api-access-wn4gg\") pod \"opendatahub-operator-controller-manager-9bd7bdf77-c6gpg\" (UID: \"79a0d695-e84d-4423-ba3a-7376aa132e46\") " pod="opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-c6gpg" Apr 17 15:23:31.189817 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:31.189665 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/79a0d695-e84d-4423-ba3a-7376aa132e46-webhook-cert\") pod \"opendatahub-operator-controller-manager-9bd7bdf77-c6gpg\" (UID: \"79a0d695-e84d-4423-ba3a-7376aa132e46\") " pod="opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-c6gpg" Apr 17 15:23:31.192170 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:31.192144 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/79a0d695-e84d-4423-ba3a-7376aa132e46-apiservice-cert\") pod \"opendatahub-operator-controller-manager-9bd7bdf77-c6gpg\" (UID: \"79a0d695-e84d-4423-ba3a-7376aa132e46\") " pod="opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-c6gpg" Apr 17 15:23:31.192262 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:31.192177 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/79a0d695-e84d-4423-ba3a-7376aa132e46-webhook-cert\") pod \"opendatahub-operator-controller-manager-9bd7bdf77-c6gpg\" (UID: \"79a0d695-e84d-4423-ba3a-7376aa132e46\") " pod="opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-c6gpg" Apr 17 15:23:31.200356 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:31.200331 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-wn4gg\" (UniqueName: \"kubernetes.io/projected/79a0d695-e84d-4423-ba3a-7376aa132e46-kube-api-access-wn4gg\") pod \"opendatahub-operator-controller-manager-9bd7bdf77-c6gpg\" (UID: \"79a0d695-e84d-4423-ba3a-7376aa132e46\") " pod="opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-c6gpg" Apr 17 15:23:31.213152 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:31.213134 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-c6gpg" Apr 17 15:23:31.338922 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:31.338888 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-c6gpg"] Apr 17 15:23:31.342525 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:23:31.342491 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79a0d695_e84d_4423_ba3a_7376aa132e46.slice/crio-e14d19370b40c67e032a412ab3ff94c86f0cb7f92b8e61ccb3f514b783af9b42 WatchSource:0}: Error finding container e14d19370b40c67e032a412ab3ff94c86f0cb7f92b8e61ccb3f514b783af9b42: Status 404 returned error can't find the container with id e14d19370b40c67e032a412ab3ff94c86f0cb7f92b8e61ccb3f514b783af9b42 Apr 17 15:23:31.344199 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:31.344181 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 15:23:31.418683 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:31.418654 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-c6gpg" event={"ID":"79a0d695-e84d-4423-ba3a-7376aa132e46","Type":"ContainerStarted","Data":"e14d19370b40c67e032a412ab3ff94c86f0cb7f92b8e61ccb3f514b783af9b42"} Apr 17 15:23:34.301362 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:34.301327 2577 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-lws-operator/lws-controller-manager-59bc47b496-cdqvz"] Apr 17 15:23:34.304470 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:34.304443 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-59bc47b496-cdqvz" Apr 17 15:23:34.306920 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:34.306884 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 17 15:23:34.308004 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:34.307982 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 17 15:23:34.308139 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:34.308100 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 17 15:23:34.308139 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:34.308112 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 17 15:23:34.308245 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:34.308232 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 17 15:23:34.308333 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:34.308320 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-lf4nl\"" Apr 17 15:23:34.311788 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:34.311769 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-59bc47b496-cdqvz"] Apr 17 15:23:34.415215 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:34.415188 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" 
(UniqueName: \"kubernetes.io/secret/613a897c-2655-4e38-a210-08757b6ac34a-metrics-cert\") pod \"lws-controller-manager-59bc47b496-cdqvz\" (UID: \"613a897c-2655-4e38-a210-08757b6ac34a\") " pod="openshift-lws-operator/lws-controller-manager-59bc47b496-cdqvz"
Apr 17 15:23:34.415340 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:34.415258 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/613a897c-2655-4e38-a210-08757b6ac34a-cert\") pod \"lws-controller-manager-59bc47b496-cdqvz\" (UID: \"613a897c-2655-4e38-a210-08757b6ac34a\") " pod="openshift-lws-operator/lws-controller-manager-59bc47b496-cdqvz"
Apr 17 15:23:34.415340 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:34.415301 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/613a897c-2655-4e38-a210-08757b6ac34a-manager-config\") pod \"lws-controller-manager-59bc47b496-cdqvz\" (UID: \"613a897c-2655-4e38-a210-08757b6ac34a\") " pod="openshift-lws-operator/lws-controller-manager-59bc47b496-cdqvz"
Apr 17 15:23:34.415457 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:34.415339 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k7s2\" (UniqueName: \"kubernetes.io/projected/613a897c-2655-4e38-a210-08757b6ac34a-kube-api-access-5k7s2\") pod \"lws-controller-manager-59bc47b496-cdqvz\" (UID: \"613a897c-2655-4e38-a210-08757b6ac34a\") " pod="openshift-lws-operator/lws-controller-manager-59bc47b496-cdqvz"
Apr 17 15:23:34.430826 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:34.430800 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-c6gpg" event={"ID":"79a0d695-e84d-4423-ba3a-7376aa132e46","Type":"ContainerStarted","Data":"646a572120deb0bbf02a40df4bf34fee8fce2d96673bd4a5a2f977d76404e773"}
Apr 17 15:23:34.430929 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:34.430867 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-c6gpg"
Apr 17 15:23:34.456252 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:34.456205 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-c6gpg" podStartSLOduration=1.467955305 podStartE2EDuration="4.456190774s" podCreationTimestamp="2026-04-17 15:23:30 +0000 UTC" firstStartedPulling="2026-04-17 15:23:31.344363908 +0000 UTC m=+375.886028757" lastFinishedPulling="2026-04-17 15:23:34.33259938 +0000 UTC m=+378.874264226" observedRunningTime="2026-04-17 15:23:34.453647448 +0000 UTC m=+378.995312315" watchObservedRunningTime="2026-04-17 15:23:34.456190774 +0000 UTC m=+378.997855638"
Apr 17 15:23:34.516660 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:34.516632 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5k7s2\" (UniqueName: \"kubernetes.io/projected/613a897c-2655-4e38-a210-08757b6ac34a-kube-api-access-5k7s2\") pod \"lws-controller-manager-59bc47b496-cdqvz\" (UID: \"613a897c-2655-4e38-a210-08757b6ac34a\") " pod="openshift-lws-operator/lws-controller-manager-59bc47b496-cdqvz"
Apr 17 15:23:34.516754 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:34.516701 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/613a897c-2655-4e38-a210-08757b6ac34a-metrics-cert\") pod \"lws-controller-manager-59bc47b496-cdqvz\" (UID: \"613a897c-2655-4e38-a210-08757b6ac34a\") " pod="openshift-lws-operator/lws-controller-manager-59bc47b496-cdqvz"
Apr 17 15:23:34.516814 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:34.516751 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/613a897c-2655-4e38-a210-08757b6ac34a-cert\") pod \"lws-controller-manager-59bc47b496-cdqvz\" (UID: \"613a897c-2655-4e38-a210-08757b6ac34a\") " pod="openshift-lws-operator/lws-controller-manager-59bc47b496-cdqvz"
Apr 17 15:23:34.516895 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:34.516875 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/613a897c-2655-4e38-a210-08757b6ac34a-manager-config\") pod \"lws-controller-manager-59bc47b496-cdqvz\" (UID: \"613a897c-2655-4e38-a210-08757b6ac34a\") " pod="openshift-lws-operator/lws-controller-manager-59bc47b496-cdqvz"
Apr 17 15:23:34.517484 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:34.517467 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/613a897c-2655-4e38-a210-08757b6ac34a-manager-config\") pod \"lws-controller-manager-59bc47b496-cdqvz\" (UID: \"613a897c-2655-4e38-a210-08757b6ac34a\") " pod="openshift-lws-operator/lws-controller-manager-59bc47b496-cdqvz"
Apr 17 15:23:34.519228 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:34.519212 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/613a897c-2655-4e38-a210-08757b6ac34a-metrics-cert\") pod \"lws-controller-manager-59bc47b496-cdqvz\" (UID: \"613a897c-2655-4e38-a210-08757b6ac34a\") " pod="openshift-lws-operator/lws-controller-manager-59bc47b496-cdqvz"
Apr 17 15:23:34.519311 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:34.519278 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/613a897c-2655-4e38-a210-08757b6ac34a-cert\") pod \"lws-controller-manager-59bc47b496-cdqvz\" (UID: \"613a897c-2655-4e38-a210-08757b6ac34a\") " pod="openshift-lws-operator/lws-controller-manager-59bc47b496-cdqvz"
Apr 17 15:23:34.524063 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:34.524026 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k7s2\" (UniqueName: \"kubernetes.io/projected/613a897c-2655-4e38-a210-08757b6ac34a-kube-api-access-5k7s2\") pod \"lws-controller-manager-59bc47b496-cdqvz\" (UID: \"613a897c-2655-4e38-a210-08757b6ac34a\") " pod="openshift-lws-operator/lws-controller-manager-59bc47b496-cdqvz"
Apr 17 15:23:34.615273 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:34.615249 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-59bc47b496-cdqvz"
Apr 17 15:23:34.732785 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:34.732754 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-59bc47b496-cdqvz"]
Apr 17 15:23:34.735509 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:23:34.735486 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod613a897c_2655_4e38_a210_08757b6ac34a.slice/crio-d015e9f3515421b6e1b7dfc49a9b16978936ca25fa5878bb415b62ffc44e1a20 WatchSource:0}: Error finding container d015e9f3515421b6e1b7dfc49a9b16978936ca25fa5878bb415b62ffc44e1a20: Status 404 returned error can't find the container with id d015e9f3515421b6e1b7dfc49a9b16978936ca25fa5878bb415b62ffc44e1a20
Apr 17 15:23:35.435025 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:35.434987 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-59bc47b496-cdqvz" event={"ID":"613a897c-2655-4e38-a210-08757b6ac34a","Type":"ContainerStarted","Data":"d015e9f3515421b6e1b7dfc49a9b16978936ca25fa5878bb415b62ffc44e1a20"}
Apr 17 15:23:38.446978 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:38.446943 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-59bc47b496-cdqvz" event={"ID":"613a897c-2655-4e38-a210-08757b6ac34a","Type":"ContainerStarted","Data":"24f180024b1ccd922c356e7073e8a424964619a3f2543ac7240c66cc4270f504"}
Apr 17 15:23:38.447340 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:38.447182 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-59bc47b496-cdqvz"
Apr 17 15:23:38.463422 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:38.463371 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-59bc47b496-cdqvz" podStartSLOduration=0.962509911 podStartE2EDuration="4.463357343s" podCreationTimestamp="2026-04-17 15:23:34 +0000 UTC" firstStartedPulling="2026-04-17 15:23:34.737198739 +0000 UTC m=+379.278863584" lastFinishedPulling="2026-04-17 15:23:38.238046151 +0000 UTC m=+382.779711016" observedRunningTime="2026-04-17 15:23:38.461998741 +0000 UTC m=+383.003663608" watchObservedRunningTime="2026-04-17 15:23:38.463357343 +0000 UTC m=+383.005022210"
Apr 17 15:23:45.437412 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:45.437378 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-c6gpg"
Apr 17 15:23:49.452659 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:49.452618 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-59bc47b496-cdqvz"
Apr 17 15:23:59.399907 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:59.399874 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-546dd5d8dc-2vbx8"]
Apr 17 15:23:59.406405 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:59.406384 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-546dd5d8dc-2vbx8"
Apr 17 15:23:59.408459 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:59.408439 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\""
Apr 17 15:23:59.408649 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:59.408635 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\""
Apr 17 15:23:59.408768 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:59.408733 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 17 15:23:59.409023 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:59.409003 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 17 15:23:59.409565 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:59.409552 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-c95dl\""
Apr 17 15:23:59.412569 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:59.412545 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-546dd5d8dc-2vbx8"]
Apr 17 15:23:59.525519 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:59.525493 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/dfd94ffa-c815-4fae-b34b-0e99fc19f2b8-tmp\") pod \"kube-auth-proxy-546dd5d8dc-2vbx8\" (UID: \"dfd94ffa-c815-4fae-b34b-0e99fc19f2b8\") " pod="openshift-ingress/kube-auth-proxy-546dd5d8dc-2vbx8"
Apr 17 15:23:59.525660 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:59.525525 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dfd94ffa-c815-4fae-b34b-0e99fc19f2b8-tls-certs\") pod \"kube-auth-proxy-546dd5d8dc-2vbx8\" (UID: \"dfd94ffa-c815-4fae-b34b-0e99fc19f2b8\") " pod="openshift-ingress/kube-auth-proxy-546dd5d8dc-2vbx8"
Apr 17 15:23:59.525660 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:59.525557 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trhhr\" (UniqueName: \"kubernetes.io/projected/dfd94ffa-c815-4fae-b34b-0e99fc19f2b8-kube-api-access-trhhr\") pod \"kube-auth-proxy-546dd5d8dc-2vbx8\" (UID: \"dfd94ffa-c815-4fae-b34b-0e99fc19f2b8\") " pod="openshift-ingress/kube-auth-proxy-546dd5d8dc-2vbx8"
Apr 17 15:23:59.626597 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:59.626568 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-trhhr\" (UniqueName: \"kubernetes.io/projected/dfd94ffa-c815-4fae-b34b-0e99fc19f2b8-kube-api-access-trhhr\") pod \"kube-auth-proxy-546dd5d8dc-2vbx8\" (UID: \"dfd94ffa-c815-4fae-b34b-0e99fc19f2b8\") " pod="openshift-ingress/kube-auth-proxy-546dd5d8dc-2vbx8"
Apr 17 15:23:59.626731 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:59.626631 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/dfd94ffa-c815-4fae-b34b-0e99fc19f2b8-tmp\") pod \"kube-auth-proxy-546dd5d8dc-2vbx8\" (UID: \"dfd94ffa-c815-4fae-b34b-0e99fc19f2b8\") " pod="openshift-ingress/kube-auth-proxy-546dd5d8dc-2vbx8"
Apr 17 15:23:59.626731 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:59.626650 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dfd94ffa-c815-4fae-b34b-0e99fc19f2b8-tls-certs\") pod \"kube-auth-proxy-546dd5d8dc-2vbx8\" (UID: \"dfd94ffa-c815-4fae-b34b-0e99fc19f2b8\") " pod="openshift-ingress/kube-auth-proxy-546dd5d8dc-2vbx8"
Apr 17 15:23:59.628960 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:59.628939 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/dfd94ffa-c815-4fae-b34b-0e99fc19f2b8-tmp\") pod \"kube-auth-proxy-546dd5d8dc-2vbx8\" (UID: \"dfd94ffa-c815-4fae-b34b-0e99fc19f2b8\") " pod="openshift-ingress/kube-auth-proxy-546dd5d8dc-2vbx8"
Apr 17 15:23:59.629086 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:59.628982 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dfd94ffa-c815-4fae-b34b-0e99fc19f2b8-tls-certs\") pod \"kube-auth-proxy-546dd5d8dc-2vbx8\" (UID: \"dfd94ffa-c815-4fae-b34b-0e99fc19f2b8\") " pod="openshift-ingress/kube-auth-proxy-546dd5d8dc-2vbx8"
Apr 17 15:23:59.633739 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:59.633719 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-trhhr\" (UniqueName: \"kubernetes.io/projected/dfd94ffa-c815-4fae-b34b-0e99fc19f2b8-kube-api-access-trhhr\") pod \"kube-auth-proxy-546dd5d8dc-2vbx8\" (UID: \"dfd94ffa-c815-4fae-b34b-0e99fc19f2b8\") " pod="openshift-ingress/kube-auth-proxy-546dd5d8dc-2vbx8"
Apr 17 15:23:59.718119 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:59.718056 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-546dd5d8dc-2vbx8"
Apr 17 15:23:59.836474 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:23:59.836431 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-546dd5d8dc-2vbx8"]
Apr 17 15:23:59.839411 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:23:59.839385 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfd94ffa_c815_4fae_b34b_0e99fc19f2b8.slice/crio-b5b5f7f139d62a3cd49d5d92fe38ab40d8d438a3f0b253db4841aefb31f54dd0 WatchSource:0}: Error finding container b5b5f7f139d62a3cd49d5d92fe38ab40d8d438a3f0b253db4841aefb31f54dd0: Status 404 returned error can't find the container with id b5b5f7f139d62a3cd49d5d92fe38ab40d8d438a3f0b253db4841aefb31f54dd0
Apr 17 15:24:00.522309 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:24:00.522270 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-546dd5d8dc-2vbx8" event={"ID":"dfd94ffa-c815-4fae-b34b-0e99fc19f2b8","Type":"ContainerStarted","Data":"b5b5f7f139d62a3cd49d5d92fe38ab40d8d438a3f0b253db4841aefb31f54dd0"}
Apr 17 15:24:04.537952 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:24:04.537911 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-546dd5d8dc-2vbx8" event={"ID":"dfd94ffa-c815-4fae-b34b-0e99fc19f2b8","Type":"ContainerStarted","Data":"b0aab93939fd44c8613d952ace6a73d0dd90908b9756a9e92cef6f9da0d3721c"}
Apr 17 15:24:04.553487 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:24:04.553433 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-546dd5d8dc-2vbx8" podStartSLOduration=1.324440321 podStartE2EDuration="5.553417431s" podCreationTimestamp="2026-04-17 15:23:59 +0000 UTC" firstStartedPulling="2026-04-17 15:23:59.841652791 +0000 UTC m=+404.383317636" lastFinishedPulling="2026-04-17 15:24:04.070629888 +0000 UTC m=+408.612294746" observedRunningTime="2026-04-17 15:24:04.551778024 +0000 UTC m=+409.093442890" watchObservedRunningTime="2026-04-17 15:24:04.553417431 +0000 UTC m=+409.095082306"
Apr 17 15:25:38.248220 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:25:38.248159 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-qznsx"]
Apr 17 15:25:38.251731 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:25:38.251705 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-qznsx"
Apr 17 15:25:38.254004 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:25:38.253980 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 17 15:25:38.254144 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:25:38.254015 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\""
Apr 17 15:25:38.254144 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:25:38.254020 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 17 15:25:38.254770 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:25:38.254751 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\""
Apr 17 15:25:38.254867 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:25:38.254813 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-7w66g\""
Apr 17 15:25:38.258735 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:25:38.258712 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-qznsx"]
Apr 17 15:25:38.414538 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:25:38.414509 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/93e62591-44d1-4696-b12c-611c75644a9a-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-qznsx\" (UID: \"93e62591-44d1-4696-b12c-611c75644a9a\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-qznsx"
Apr 17 15:25:38.414662 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:25:38.414559 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmf9p\" (UniqueName: \"kubernetes.io/projected/93e62591-44d1-4696-b12c-611c75644a9a-kube-api-access-cmf9p\") pod \"kuadrant-console-plugin-6cb54b5c86-qznsx\" (UID: \"93e62591-44d1-4696-b12c-611c75644a9a\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-qznsx"
Apr 17 15:25:38.414662 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:25:38.414581 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/93e62591-44d1-4696-b12c-611c75644a9a-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-qznsx\" (UID: \"93e62591-44d1-4696-b12c-611c75644a9a\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-qznsx"
Apr 17 15:25:38.515918 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:25:38.515848 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/93e62591-44d1-4696-b12c-611c75644a9a-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-qznsx\" (UID: \"93e62591-44d1-4696-b12c-611c75644a9a\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-qznsx"
Apr 17 15:25:38.516062 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:25:38.515913 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cmf9p\" (UniqueName: \"kubernetes.io/projected/93e62591-44d1-4696-b12c-611c75644a9a-kube-api-access-cmf9p\") pod \"kuadrant-console-plugin-6cb54b5c86-qznsx\" (UID: \"93e62591-44d1-4696-b12c-611c75644a9a\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-qznsx"
Apr 17 15:25:38.516062 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:25:38.515943 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/93e62591-44d1-4696-b12c-611c75644a9a-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-qznsx\" (UID: \"93e62591-44d1-4696-b12c-611c75644a9a\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-qznsx"
Apr 17 15:25:38.516556 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:25:38.516532 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/93e62591-44d1-4696-b12c-611c75644a9a-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-qznsx\" (UID: \"93e62591-44d1-4696-b12c-611c75644a9a\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-qznsx"
Apr 17 15:25:38.518370 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:25:38.518343 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/93e62591-44d1-4696-b12c-611c75644a9a-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-qznsx\" (UID: \"93e62591-44d1-4696-b12c-611c75644a9a\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-qznsx"
Apr 17 15:25:38.523150 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:25:38.523126 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmf9p\" (UniqueName: \"kubernetes.io/projected/93e62591-44d1-4696-b12c-611c75644a9a-kube-api-access-cmf9p\") pod \"kuadrant-console-plugin-6cb54b5c86-qznsx\" (UID: \"93e62591-44d1-4696-b12c-611c75644a9a\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-qznsx"
Apr 17 15:25:38.576156 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:25:38.576134 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-qznsx"
Apr 17 15:25:38.700328 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:25:38.700296 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-qznsx"]
Apr 17 15:25:38.704281 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:25:38.704254 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93e62591_44d1_4696_b12c_611c75644a9a.slice/crio-e18ad69471e1eace44aa9ebce586a2cbc44b1fda683b0dbf8edeae1ae1c06a19 WatchSource:0}: Error finding container e18ad69471e1eace44aa9ebce586a2cbc44b1fda683b0dbf8edeae1ae1c06a19: Status 404 returned error can't find the container with id e18ad69471e1eace44aa9ebce586a2cbc44b1fda683b0dbf8edeae1ae1c06a19
Apr 17 15:25:38.868755 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:25:38.868726 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-qznsx" event={"ID":"93e62591-44d1-4696-b12c-611c75644a9a","Type":"ContainerStarted","Data":"e18ad69471e1eace44aa9ebce586a2cbc44b1fda683b0dbf8edeae1ae1c06a19"}
Apr 17 15:26:02.962053 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:26:02.962003 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-qznsx" event={"ID":"93e62591-44d1-4696-b12c-611c75644a9a","Type":"ContainerStarted","Data":"8b18501d9df4d4b2b95959ab94ddbdd244bc7bb52ffcf8c9b63445ac592e3119"}
Apr 17 15:26:02.977826 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:26:02.977766 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-qznsx" podStartSLOduration=1.335807395 podStartE2EDuration="24.97775129s" podCreationTimestamp="2026-04-17 15:25:38 +0000 UTC" firstStartedPulling="2026-04-17 15:25:38.705702697 +0000 UTC m=+503.247367541" lastFinishedPulling="2026-04-17 15:26:02.347646587 +0000 UTC m=+526.889311436" observedRunningTime="2026-04-17 15:26:02.976077403 +0000 UTC m=+527.517742270" watchObservedRunningTime="2026-04-17 15:26:02.97775129 +0000 UTC m=+527.519416224"
Apr 17 15:26:05.298350 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:26:05.298273 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-66mn5"]
Apr 17 15:26:05.304462 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:26:05.304443 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-66mn5"
Apr 17 15:26:05.306705 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:26:05.306687 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-6gthh\""
Apr 17 15:26:05.313781 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:26:05.313755 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-66mn5"]
Apr 17 15:26:05.450411 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:26:05.450380 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t4kv\" (UniqueName: \"kubernetes.io/projected/76f395a0-f9ed-43f9-b58e-d94038ded123-kube-api-access-9t4kv\") pod \"kuadrant-operator-controller-manager-55c7f4c975-66mn5\" (UID: \"76f395a0-f9ed-43f9-b58e-d94038ded123\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-66mn5"
Apr 17 15:26:05.450545 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:26:05.450415 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/76f395a0-f9ed-43f9-b58e-d94038ded123-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-66mn5\" (UID: \"76f395a0-f9ed-43f9-b58e-d94038ded123\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-66mn5"
Apr 17 15:26:05.551809 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:26:05.551749 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9t4kv\" (UniqueName: \"kubernetes.io/projected/76f395a0-f9ed-43f9-b58e-d94038ded123-kube-api-access-9t4kv\") pod \"kuadrant-operator-controller-manager-55c7f4c975-66mn5\" (UID: \"76f395a0-f9ed-43f9-b58e-d94038ded123\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-66mn5"
Apr 17 15:26:05.551809 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:26:05.551791 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/76f395a0-f9ed-43f9-b58e-d94038ded123-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-66mn5\" (UID: \"76f395a0-f9ed-43f9-b58e-d94038ded123\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-66mn5"
Apr 17 15:26:05.552174 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:26:05.552155 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/76f395a0-f9ed-43f9-b58e-d94038ded123-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-66mn5\" (UID: \"76f395a0-f9ed-43f9-b58e-d94038ded123\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-66mn5"
Apr 17 15:26:05.562172 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:26:05.562151 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t4kv\" (UniqueName: \"kubernetes.io/projected/76f395a0-f9ed-43f9-b58e-d94038ded123-kube-api-access-9t4kv\") pod \"kuadrant-operator-controller-manager-55c7f4c975-66mn5\" (UID: \"76f395a0-f9ed-43f9-b58e-d94038ded123\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-66mn5"
Apr 17 15:26:05.615097 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:26:05.615072 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-66mn5"
Apr 17 15:26:05.744228 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:26:05.744203 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-66mn5"]
Apr 17 15:26:05.746683 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:26:05.746658 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76f395a0_f9ed_43f9_b58e_d94038ded123.slice/crio-7278a093789ce79fca0e54a6ae27f11589f242b9a3ec3ddef72329d6e70cc2b2 WatchSource:0}: Error finding container 7278a093789ce79fca0e54a6ae27f11589f242b9a3ec3ddef72329d6e70cc2b2: Status 404 returned error can't find the container with id 7278a093789ce79fca0e54a6ae27f11589f242b9a3ec3ddef72329d6e70cc2b2
Apr 17 15:26:05.973995 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:26:05.973967 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-66mn5" event={"ID":"76f395a0-f9ed-43f9-b58e-d94038ded123","Type":"ContainerStarted","Data":"7278a093789ce79fca0e54a6ae27f11589f242b9a3ec3ddef72329d6e70cc2b2"}
Apr 17 15:26:11.998095 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:26:11.997990 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-66mn5" event={"ID":"76f395a0-f9ed-43f9-b58e-d94038ded123","Type":"ContainerStarted","Data":"9782976734d6d6b4ea801b1f1f44d6f728bc1d0ae1b94c12a922626f36f13fc2"}
Apr 17 15:26:11.998464 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:26:11.998175 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-66mn5"
Apr 17 15:26:12.016391 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:26:12.016350 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-66mn5" podStartSLOduration=1.039729683 podStartE2EDuration="7.016334357s" podCreationTimestamp="2026-04-17 15:26:05 +0000 UTC" firstStartedPulling="2026-04-17 15:26:05.749099606 +0000 UTC m=+530.290764451" lastFinishedPulling="2026-04-17 15:26:11.725704277 +0000 UTC m=+536.267369125" observedRunningTime="2026-04-17 15:26:12.015058615 +0000 UTC m=+536.556723482" watchObservedRunningTime="2026-04-17 15:26:12.016334357 +0000 UTC m=+536.557999224"
Apr 17 15:26:23.003190 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:26:23.003154 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-66mn5"
Apr 17 15:27:15.963250 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:27:15.963221 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fdxgz_85072088-8af2-4219-80f7-6a18460c13cf/console-operator/2.log"
Apr 17 15:27:15.964454 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:27:15.964434 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fdxgz_85072088-8af2-4219-80f7-6a18460c13cf/console-operator/2.log"
Apr 17 15:27:15.969500 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:27:15.969479 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6m42k_41a2e9d0-bfbe-47d5-9ccd-610cb5204675/ovn-acl-logging/0.log"
Apr 17 15:27:15.970692 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:27:15.970676 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6m42k_41a2e9d0-bfbe-47d5-9ccd-610cb5204675/ovn-acl-logging/0.log"
Apr 17 15:30:00.146214 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:30:00.146179 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29607330-lwzfr"]
Apr 17 15:30:00.149473 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:30:00.149457 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29607330-lwzfr"
Apr 17 15:30:00.151861 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:30:00.151839 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-g4s5v\""
Apr 17 15:30:00.157366 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:30:00.157345 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607330-lwzfr"]
Apr 17 15:30:00.232974 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:30:00.232941 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trp6l\" (UniqueName: \"kubernetes.io/projected/e6de62e6-81ea-4877-91fc-a85249220613-kube-api-access-trp6l\") pod \"maas-api-key-cleanup-29607330-lwzfr\" (UID: \"e6de62e6-81ea-4877-91fc-a85249220613\") " pod="opendatahub/maas-api-key-cleanup-29607330-lwzfr"
Apr 17 15:30:00.333342 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:30:00.333309 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-trp6l\" (UniqueName: \"kubernetes.io/projected/e6de62e6-81ea-4877-91fc-a85249220613-kube-api-access-trp6l\") pod \"maas-api-key-cleanup-29607330-lwzfr\" (UID: \"e6de62e6-81ea-4877-91fc-a85249220613\") " pod="opendatahub/maas-api-key-cleanup-29607330-lwzfr"
Apr 17 15:30:00.340797 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:30:00.340779 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-trp6l\" (UniqueName: \"kubernetes.io/projected/e6de62e6-81ea-4877-91fc-a85249220613-kube-api-access-trp6l\") pod \"maas-api-key-cleanup-29607330-lwzfr\" (UID: \"e6de62e6-81ea-4877-91fc-a85249220613\") " pod="opendatahub/maas-api-key-cleanup-29607330-lwzfr"
Apr 17 15:30:00.460381 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:30:00.460323 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29607330-lwzfr"
Apr 17 15:30:00.785004 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:30:00.784928 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607330-lwzfr"]
Apr 17 15:30:00.788117 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:30:00.788089 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6de62e6_81ea_4877_91fc_a85249220613.slice/crio-7344cb09ef002b867066f86095997b8e8c279be0ce0c396851e139c293a93dd6 WatchSource:0}: Error finding container 7344cb09ef002b867066f86095997b8e8c279be0ce0c396851e139c293a93dd6: Status 404 returned error can't find the container with id 7344cb09ef002b867066f86095997b8e8c279be0ce0c396851e139c293a93dd6
Apr 17 15:30:00.789762 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:30:00.789744 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 15:30:01.773360 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:30:01.773317 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607330-lwzfr" event={"ID":"e6de62e6-81ea-4877-91fc-a85249220613","Type":"ContainerStarted","Data":"7344cb09ef002b867066f86095997b8e8c279be0ce0c396851e139c293a93dd6"}
Apr 17 15:30:03.781158 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:30:03.781127 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607330-lwzfr" event={"ID":"e6de62e6-81ea-4877-91fc-a85249220613","Type":"ContainerStarted","Data":"386c48f59c0cda4565d31039c7c7d9c80fa286e770e3a33f026fa569bc98a42d"}
Apr 17 15:30:03.795831 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:30:03.795792 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29607330-lwzfr" podStartSLOduration=1.542357935 podStartE2EDuration="3.795779248s" podCreationTimestamp="2026-04-17 15:30:00 +0000 UTC" firstStartedPulling="2026-04-17 15:30:00.78993368 +0000 UTC m=+765.331598540" lastFinishedPulling="2026-04-17 15:30:03.043355007 +0000 UTC m=+767.585019853" observedRunningTime="2026-04-17 15:30:03.79426698 +0000 UTC m=+768.335931847" watchObservedRunningTime="2026-04-17 15:30:03.795779248 +0000 UTC m=+768.337444160"
Apr 17 15:30:23.855647 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:30:23.855617 2577 generic.go:358] "Generic (PLEG): container finished" podID="e6de62e6-81ea-4877-91fc-a85249220613" containerID="386c48f59c0cda4565d31039c7c7d9c80fa286e770e3a33f026fa569bc98a42d" exitCode=6
Apr 17 15:30:23.855937 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:30:23.855689 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607330-lwzfr" event={"ID":"e6de62e6-81ea-4877-91fc-a85249220613","Type":"ContainerDied","Data":"386c48f59c0cda4565d31039c7c7d9c80fa286e770e3a33f026fa569bc98a42d"}
Apr 17 15:30:23.855984 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:30:23.855977 2577 scope.go:117] "RemoveContainer" containerID="386c48f59c0cda4565d31039c7c7d9c80fa286e770e3a33f026fa569bc98a42d"
Apr 17 15:30:24.860847 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:30:24.860820 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607330-lwzfr" event={"ID":"e6de62e6-81ea-4877-91fc-a85249220613","Type":"ContainerStarted","Data":"2237fc910257a1e6e7c7fde045b6688c3319404cdc3bae057ed9bb5007d911f2"}
Apr 17 15:30:44.936777 ip-10-0-131-29
kubenswrapper[2577]: I0417 15:30:44.936744 2577 generic.go:358] "Generic (PLEG): container finished" podID="e6de62e6-81ea-4877-91fc-a85249220613" containerID="2237fc910257a1e6e7c7fde045b6688c3319404cdc3bae057ed9bb5007d911f2" exitCode=6 Apr 17 15:30:44.937265 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:30:44.936825 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607330-lwzfr" event={"ID":"e6de62e6-81ea-4877-91fc-a85249220613","Type":"ContainerDied","Data":"2237fc910257a1e6e7c7fde045b6688c3319404cdc3bae057ed9bb5007d911f2"} Apr 17 15:30:44.937265 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:30:44.936881 2577 scope.go:117] "RemoveContainer" containerID="386c48f59c0cda4565d31039c7c7d9c80fa286e770e3a33f026fa569bc98a42d" Apr 17 15:30:44.937265 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:30:44.937259 2577 scope.go:117] "RemoveContainer" containerID="2237fc910257a1e6e7c7fde045b6688c3319404cdc3bae057ed9bb5007d911f2" Apr 17 15:30:44.937458 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:30:44.937437 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29607330-lwzfr_opendatahub(e6de62e6-81ea-4877-91fc-a85249220613)\"" pod="opendatahub/maas-api-key-cleanup-29607330-lwzfr" podUID="e6de62e6-81ea-4877-91fc-a85249220613" Apr 17 15:30:58.057478 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:30:58.057447 2577 scope.go:117] "RemoveContainer" containerID="2237fc910257a1e6e7c7fde045b6688c3319404cdc3bae057ed9bb5007d911f2" Apr 17 15:30:58.986637 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:30:58.986605 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607330-lwzfr" event={"ID":"e6de62e6-81ea-4877-91fc-a85249220613","Type":"ContainerStarted","Data":"bc1942be7bec7f0f1c4d22c6f057ee64b8e55c0f4e3637ab0920ec35b9d1a306"} Apr 17 15:30:59.079686 ip-10-0-131-29 
kubenswrapper[2577]: I0417 15:30:59.079658 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607330-lwzfr"] Apr 17 15:30:59.990294 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:30:59.990256 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29607330-lwzfr" podUID="e6de62e6-81ea-4877-91fc-a85249220613" containerName="cleanup" containerID="cri-o://bc1942be7bec7f0f1c4d22c6f057ee64b8e55c0f4e3637ab0920ec35b9d1a306" gracePeriod=30 Apr 17 15:31:18.934055 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:31:18.934013 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29607330-lwzfr" Apr 17 15:31:19.011721 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:31:19.011654 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trp6l\" (UniqueName: \"kubernetes.io/projected/e6de62e6-81ea-4877-91fc-a85249220613-kube-api-access-trp6l\") pod \"e6de62e6-81ea-4877-91fc-a85249220613\" (UID: \"e6de62e6-81ea-4877-91fc-a85249220613\") " Apr 17 15:31:19.013889 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:31:19.013861 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6de62e6-81ea-4877-91fc-a85249220613-kube-api-access-trp6l" (OuterVolumeSpecName: "kube-api-access-trp6l") pod "e6de62e6-81ea-4877-91fc-a85249220613" (UID: "e6de62e6-81ea-4877-91fc-a85249220613"). InnerVolumeSpecName "kube-api-access-trp6l". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 15:31:19.057741 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:31:19.057703 2577 generic.go:358] "Generic (PLEG): container finished" podID="e6de62e6-81ea-4877-91fc-a85249220613" containerID="bc1942be7bec7f0f1c4d22c6f057ee64b8e55c0f4e3637ab0920ec35b9d1a306" exitCode=6 Apr 17 15:31:19.057840 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:31:19.057789 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29607330-lwzfr" Apr 17 15:31:19.057840 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:31:19.057790 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607330-lwzfr" event={"ID":"e6de62e6-81ea-4877-91fc-a85249220613","Type":"ContainerDied","Data":"bc1942be7bec7f0f1c4d22c6f057ee64b8e55c0f4e3637ab0920ec35b9d1a306"} Apr 17 15:31:19.057912 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:31:19.057830 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607330-lwzfr" event={"ID":"e6de62e6-81ea-4877-91fc-a85249220613","Type":"ContainerDied","Data":"7344cb09ef002b867066f86095997b8e8c279be0ce0c396851e139c293a93dd6"} Apr 17 15:31:19.057912 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:31:19.057861 2577 scope.go:117] "RemoveContainer" containerID="bc1942be7bec7f0f1c4d22c6f057ee64b8e55c0f4e3637ab0920ec35b9d1a306" Apr 17 15:31:19.068842 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:31:19.068823 2577 scope.go:117] "RemoveContainer" containerID="2237fc910257a1e6e7c7fde045b6688c3319404cdc3bae057ed9bb5007d911f2" Apr 17 15:31:19.075992 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:31:19.075976 2577 scope.go:117] "RemoveContainer" containerID="bc1942be7bec7f0f1c4d22c6f057ee64b8e55c0f4e3637ab0920ec35b9d1a306" Apr 17 15:31:19.076307 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:31:19.076285 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"bc1942be7bec7f0f1c4d22c6f057ee64b8e55c0f4e3637ab0920ec35b9d1a306\": container with ID starting with bc1942be7bec7f0f1c4d22c6f057ee64b8e55c0f4e3637ab0920ec35b9d1a306 not found: ID does not exist" containerID="bc1942be7bec7f0f1c4d22c6f057ee64b8e55c0f4e3637ab0920ec35b9d1a306" Apr 17 15:31:19.076374 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:31:19.076315 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc1942be7bec7f0f1c4d22c6f057ee64b8e55c0f4e3637ab0920ec35b9d1a306"} err="failed to get container status \"bc1942be7bec7f0f1c4d22c6f057ee64b8e55c0f4e3637ab0920ec35b9d1a306\": rpc error: code = NotFound desc = could not find container \"bc1942be7bec7f0f1c4d22c6f057ee64b8e55c0f4e3637ab0920ec35b9d1a306\": container with ID starting with bc1942be7bec7f0f1c4d22c6f057ee64b8e55c0f4e3637ab0920ec35b9d1a306 not found: ID does not exist" Apr 17 15:31:19.076374 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:31:19.076331 2577 scope.go:117] "RemoveContainer" containerID="2237fc910257a1e6e7c7fde045b6688c3319404cdc3bae057ed9bb5007d911f2" Apr 17 15:31:19.076576 ip-10-0-131-29 kubenswrapper[2577]: E0417 15:31:19.076558 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2237fc910257a1e6e7c7fde045b6688c3319404cdc3bae057ed9bb5007d911f2\": container with ID starting with 2237fc910257a1e6e7c7fde045b6688c3319404cdc3bae057ed9bb5007d911f2 not found: ID does not exist" containerID="2237fc910257a1e6e7c7fde045b6688c3319404cdc3bae057ed9bb5007d911f2" Apr 17 15:31:19.076621 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:31:19.076582 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2237fc910257a1e6e7c7fde045b6688c3319404cdc3bae057ed9bb5007d911f2"} err="failed to get container status \"2237fc910257a1e6e7c7fde045b6688c3319404cdc3bae057ed9bb5007d911f2\": rpc error: code = NotFound desc = could not find 
container \"2237fc910257a1e6e7c7fde045b6688c3319404cdc3bae057ed9bb5007d911f2\": container with ID starting with 2237fc910257a1e6e7c7fde045b6688c3319404cdc3bae057ed9bb5007d911f2 not found: ID does not exist" Apr 17 15:31:19.081538 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:31:19.081520 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607330-lwzfr"] Apr 17 15:31:19.084972 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:31:19.084948 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607330-lwzfr"] Apr 17 15:31:19.112631 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:31:19.112607 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-trp6l\" (UniqueName: \"kubernetes.io/projected/e6de62e6-81ea-4877-91fc-a85249220613-kube-api-access-trp6l\") on node \"ip-10-0-131-29.ec2.internal\" DevicePath \"\"" Apr 17 15:31:20.063413 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:31:20.063376 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6de62e6-81ea-4877-91fc-a85249220613" path="/var/lib/kubelet/pods/e6de62e6-81ea-4877-91fc-a85249220613/volumes" Apr 17 15:32:15.990427 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:32:15.990402 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fdxgz_85072088-8af2-4219-80f7-6a18460c13cf/console-operator/2.log" Apr 17 15:32:15.990872 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:32:15.990853 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fdxgz_85072088-8af2-4219-80f7-6a18460c13cf/console-operator/2.log" Apr 17 15:32:15.996251 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:32:15.996232 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6m42k_41a2e9d0-bfbe-47d5-9ccd-610cb5204675/ovn-acl-logging/0.log" Apr 17 15:32:15.996385 ip-10-0-131-29 
kubenswrapper[2577]: I0417 15:32:15.996368 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6m42k_41a2e9d0-bfbe-47d5-9ccd-610cb5204675/ovn-acl-logging/0.log" Apr 17 15:32:54.240335 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:32:54.240301 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-9bd7bdf77-c6gpg_79a0d695-e84d-4423-ba3a-7376aa132e46/manager/0.log" Apr 17 15:32:56.138462 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:32:56.138433 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-qznsx_93e62591-44d1-4696-b12c-611c75644a9a/kuadrant-console-plugin/0.log" Apr 17 15:32:56.380925 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:32:56.380891 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-66mn5_76f395a0-f9ed-43f9-b58e-d94038ded123/manager/0.log" Apr 17 15:32:57.184652 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:32:57.184629 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-546dd5d8dc-2vbx8_dfd94ffa-c815-4fae-b34b-0e99fc19f2b8/kube-auth-proxy/0.log" Apr 17 15:33:02.748330 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:02.748296 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-79652/must-gather-9zdm8"] Apr 17 15:33:02.748690 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:02.748654 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e6de62e6-81ea-4877-91fc-a85249220613" containerName="cleanup" Apr 17 15:33:02.748690 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:02.748667 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6de62e6-81ea-4877-91fc-a85249220613" containerName="cleanup" Apr 17 15:33:02.748690 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:02.748688 2577 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="e6de62e6-81ea-4877-91fc-a85249220613" containerName="cleanup" Apr 17 15:33:02.748690 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:02.748693 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6de62e6-81ea-4877-91fc-a85249220613" containerName="cleanup" Apr 17 15:33:02.748815 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:02.748699 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e6de62e6-81ea-4877-91fc-a85249220613" containerName="cleanup" Apr 17 15:33:02.748815 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:02.748705 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6de62e6-81ea-4877-91fc-a85249220613" containerName="cleanup" Apr 17 15:33:02.748815 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:02.748759 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="e6de62e6-81ea-4877-91fc-a85249220613" containerName="cleanup" Apr 17 15:33:02.748815 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:02.748768 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="e6de62e6-81ea-4877-91fc-a85249220613" containerName="cleanup" Apr 17 15:33:02.748815 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:02.748774 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="e6de62e6-81ea-4877-91fc-a85249220613" containerName="cleanup" Apr 17 15:33:02.751578 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:02.751564 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-79652/must-gather-9zdm8" Apr 17 15:33:02.756293 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:02.756270 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-79652\"/\"default-dockercfg-zpffh\"" Apr 17 15:33:02.756422 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:02.756308 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-79652\"/\"openshift-service-ca.crt\"" Apr 17 15:33:02.756422 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:02.756408 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-79652\"/\"kube-root-ca.crt\"" Apr 17 15:33:02.776936 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:02.776914 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-79652/must-gather-9zdm8"] Apr 17 15:33:02.875434 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:02.875411 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87jdv\" (UniqueName: \"kubernetes.io/projected/9ff14821-49b2-484c-bf45-0a4430b20ca4-kube-api-access-87jdv\") pod \"must-gather-9zdm8\" (UID: \"9ff14821-49b2-484c-bf45-0a4430b20ca4\") " pod="openshift-must-gather-79652/must-gather-9zdm8" Apr 17 15:33:02.875434 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:02.875444 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9ff14821-49b2-484c-bf45-0a4430b20ca4-must-gather-output\") pod \"must-gather-9zdm8\" (UID: \"9ff14821-49b2-484c-bf45-0a4430b20ca4\") " pod="openshift-must-gather-79652/must-gather-9zdm8" Apr 17 15:33:02.976458 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:02.976434 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/9ff14821-49b2-484c-bf45-0a4430b20ca4-must-gather-output\") pod \"must-gather-9zdm8\" (UID: \"9ff14821-49b2-484c-bf45-0a4430b20ca4\") " pod="openshift-must-gather-79652/must-gather-9zdm8" Apr 17 15:33:02.976565 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:02.976524 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-87jdv\" (UniqueName: \"kubernetes.io/projected/9ff14821-49b2-484c-bf45-0a4430b20ca4-kube-api-access-87jdv\") pod \"must-gather-9zdm8\" (UID: \"9ff14821-49b2-484c-bf45-0a4430b20ca4\") " pod="openshift-must-gather-79652/must-gather-9zdm8" Apr 17 15:33:02.976816 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:02.976799 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9ff14821-49b2-484c-bf45-0a4430b20ca4-must-gather-output\") pod \"must-gather-9zdm8\" (UID: \"9ff14821-49b2-484c-bf45-0a4430b20ca4\") " pod="openshift-must-gather-79652/must-gather-9zdm8" Apr 17 15:33:02.986604 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:02.986581 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-87jdv\" (UniqueName: \"kubernetes.io/projected/9ff14821-49b2-484c-bf45-0a4430b20ca4-kube-api-access-87jdv\") pod \"must-gather-9zdm8\" (UID: \"9ff14821-49b2-484c-bf45-0a4430b20ca4\") " pod="openshift-must-gather-79652/must-gather-9zdm8" Apr 17 15:33:03.060905 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:03.060856 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-79652/must-gather-9zdm8" Apr 17 15:33:03.187221 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:03.187197 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-79652/must-gather-9zdm8"] Apr 17 15:33:03.189734 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:33:03.189698 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ff14821_49b2_484c_bf45_0a4430b20ca4.slice/crio-56e6e293c239a679f00a2e8fda1c113f5c0ab3421ba5cb285818104a3b353074 WatchSource:0}: Error finding container 56e6e293c239a679f00a2e8fda1c113f5c0ab3421ba5cb285818104a3b353074: Status 404 returned error can't find the container with id 56e6e293c239a679f00a2e8fda1c113f5c0ab3421ba5cb285818104a3b353074 Apr 17 15:33:03.391504 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:03.391475 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-79652/must-gather-9zdm8" event={"ID":"9ff14821-49b2-484c-bf45-0a4430b20ca4","Type":"ContainerStarted","Data":"56e6e293c239a679f00a2e8fda1c113f5c0ab3421ba5cb285818104a3b353074"} Apr 17 15:33:04.400707 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:04.400587 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-79652/must-gather-9zdm8" event={"ID":"9ff14821-49b2-484c-bf45-0a4430b20ca4","Type":"ContainerStarted","Data":"917b9d8a0cfe3b6cdc864bed317307ad6bf9d5637d3f6ebcf23c1ee4922222b7"} Apr 17 15:33:04.400707 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:04.400643 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-79652/must-gather-9zdm8" event={"ID":"9ff14821-49b2-484c-bf45-0a4430b20ca4","Type":"ContainerStarted","Data":"e3b9a50553862ad170c4695c3162b15110054c8a153d53593232cd018452fd21"} Apr 17 15:33:04.417357 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:04.417207 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-must-gather-79652/must-gather-9zdm8" podStartSLOduration=1.527616191 podStartE2EDuration="2.417189127s" podCreationTimestamp="2026-04-17 15:33:02 +0000 UTC" firstStartedPulling="2026-04-17 15:33:03.191384725 +0000 UTC m=+947.733049569" lastFinishedPulling="2026-04-17 15:33:04.080957659 +0000 UTC m=+948.622622505" observedRunningTime="2026-04-17 15:33:04.415813754 +0000 UTC m=+948.957478622" watchObservedRunningTime="2026-04-17 15:33:04.417189127 +0000 UTC m=+948.958853995" Apr 17 15:33:05.705130 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:05.705096 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-blrs8_7db05466-6c79-496c-9e75-143b8a1a69d1/global-pull-secret-syncer/0.log" Apr 17 15:33:05.817071 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:05.817001 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-ljhqf_131fc208-4c4f-4581-b543-a9f317f71657/konnectivity-agent/0.log" Apr 17 15:33:05.891269 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:05.891234 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-131-29.ec2.internal_1e1f1a3ec57b3b66a90c747576fdf8e1/haproxy/0.log" Apr 17 15:33:10.073152 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:10.073123 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-qznsx_93e62591-44d1-4696-b12c-611c75644a9a/kuadrant-console-plugin/0.log" Apr 17 15:33:10.159103 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:10.159068 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-66mn5_76f395a0-f9ed-43f9-b58e-d94038ded123/manager/0.log" Apr 17 15:33:11.851965 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:11.851940 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-mjkbk_c4b76af8-7aae-4de0-be95-221109f82fb9/kube-state-metrics/0.log" Apr 17 15:33:11.871548 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:11.871513 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-mjkbk_c4b76af8-7aae-4de0-be95-221109f82fb9/kube-rbac-proxy-main/0.log" Apr 17 15:33:11.894446 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:11.894424 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-mjkbk_c4b76af8-7aae-4de0-be95-221109f82fb9/kube-rbac-proxy-self/0.log" Apr 17 15:33:11.922488 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:11.922461 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-7f6d6b796-8d99r_eaafa6a6-9e9c-4379-ad58-cfaf54d8ccc8/metrics-server/0.log" Apr 17 15:33:12.142873 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:12.142793 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-vb2bb_0210dbd3-bf15-4240-a299-f959cf307c04/node-exporter/0.log" Apr 17 15:33:12.164245 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:12.164214 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-vb2bb_0210dbd3-bf15-4240-a299-f959cf307c04/kube-rbac-proxy/0.log" Apr 17 15:33:12.185480 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:12.185456 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-vb2bb_0210dbd3-bf15-4240-a299-f959cf307c04/init-textfile/0.log" Apr 17 15:33:12.287963 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:12.287921 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_8258b17e-3459-46de-aaac-5f443f4b05c0/prometheus/0.log" Apr 17 15:33:12.307996 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:12.307966 2577 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_8258b17e-3459-46de-aaac-5f443f4b05c0/config-reloader/0.log" Apr 17 15:33:12.342066 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:12.342021 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_8258b17e-3459-46de-aaac-5f443f4b05c0/thanos-sidecar/0.log" Apr 17 15:33:12.372413 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:12.372381 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_8258b17e-3459-46de-aaac-5f443f4b05c0/kube-rbac-proxy-web/0.log" Apr 17 15:33:12.409030 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:12.408947 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_8258b17e-3459-46de-aaac-5f443f4b05c0/kube-rbac-proxy/0.log" Apr 17 15:33:12.439840 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:12.439805 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_8258b17e-3459-46de-aaac-5f443f4b05c0/kube-rbac-proxy-thanos/0.log" Apr 17 15:33:12.466001 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:12.465965 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_8258b17e-3459-46de-aaac-5f443f4b05c0/init-config-reloader/0.log" Apr 17 15:33:12.534324 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:12.534294 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-t5wmn_93545302-4bda-45c5-9cb7-2e69c294a279/prometheus-operator-admission-webhook/0.log" Apr 17 15:33:12.570349 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:12.570313 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-85f567b6d8-trkdw_f29adb65-f616-46d7-9d7e-90a6185b6533/telemeter-client/0.log" Apr 17 15:33:12.590208 ip-10-0-131-29 kubenswrapper[2577]: 
I0417 15:33:12.590179 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-85f567b6d8-trkdw_f29adb65-f616-46d7-9d7e-90a6185b6533/reload/0.log"
Apr 17 15:33:12.615588 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:12.615532 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-85f567b6d8-trkdw_f29adb65-f616-46d7-9d7e-90a6185b6533/kube-rbac-proxy/0.log"
Apr 17 15:33:12.662295 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:12.662214 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-69676d6c77-k8jwr_528a42f2-ecbc-4039-ab6e-92181c86c155/thanos-query/0.log"
Apr 17 15:33:12.722383 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:12.722357 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-69676d6c77-k8jwr_528a42f2-ecbc-4039-ab6e-92181c86c155/kube-rbac-proxy-web/0.log"
Apr 17 15:33:12.775367 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:12.775337 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-69676d6c77-k8jwr_528a42f2-ecbc-4039-ab6e-92181c86c155/kube-rbac-proxy/0.log"
Apr 17 15:33:12.796357 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:12.796329 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-69676d6c77-k8jwr_528a42f2-ecbc-4039-ab6e-92181c86c155/prom-label-proxy/0.log"
Apr 17 15:33:12.819681 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:12.819652 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-69676d6c77-k8jwr_528a42f2-ecbc-4039-ab6e-92181c86c155/kube-rbac-proxy-rules/0.log"
Apr 17 15:33:12.853004 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:12.852954 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-69676d6c77-k8jwr_528a42f2-ecbc-4039-ab6e-92181c86c155/kube-rbac-proxy-metrics/0.log"
Apr 17 15:33:14.298688 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:14.298653 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-79652/perf-node-gather-daemonset-kdj6t"]
Apr 17 15:33:14.303568 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:14.303545 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-79652/perf-node-gather-daemonset-kdj6t"
Apr 17 15:33:14.308440 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:14.308410 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-79652/perf-node-gather-daemonset-kdj6t"]
Apr 17 15:33:14.402241 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:14.402205 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/48c05cec-724f-4bac-8255-ae31c02136d9-podres\") pod \"perf-node-gather-daemonset-kdj6t\" (UID: \"48c05cec-724f-4bac-8255-ae31c02136d9\") " pod="openshift-must-gather-79652/perf-node-gather-daemonset-kdj6t"
Apr 17 15:33:14.402439 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:14.402253 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/48c05cec-724f-4bac-8255-ae31c02136d9-lib-modules\") pod \"perf-node-gather-daemonset-kdj6t\" (UID: \"48c05cec-724f-4bac-8255-ae31c02136d9\") " pod="openshift-must-gather-79652/perf-node-gather-daemonset-kdj6t"
Apr 17 15:33:14.402439 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:14.402303 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/48c05cec-724f-4bac-8255-ae31c02136d9-proc\") pod \"perf-node-gather-daemonset-kdj6t\" (UID: \"48c05cec-724f-4bac-8255-ae31c02136d9\") " pod="openshift-must-gather-79652/perf-node-gather-daemonset-kdj6t"
Apr 17 15:33:14.402439 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:14.402351 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/48c05cec-724f-4bac-8255-ae31c02136d9-sys\") pod \"perf-node-gather-daemonset-kdj6t\" (UID: \"48c05cec-724f-4bac-8255-ae31c02136d9\") " pod="openshift-must-gather-79652/perf-node-gather-daemonset-kdj6t"
Apr 17 15:33:14.402439 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:14.402376 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7s9p\" (UniqueName: \"kubernetes.io/projected/48c05cec-724f-4bac-8255-ae31c02136d9-kube-api-access-q7s9p\") pod \"perf-node-gather-daemonset-kdj6t\" (UID: \"48c05cec-724f-4bac-8255-ae31c02136d9\") " pod="openshift-must-gather-79652/perf-node-gather-daemonset-kdj6t"
Apr 17 15:33:14.439307 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:14.439279 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fdxgz_85072088-8af2-4219-80f7-6a18460c13cf/console-operator/2.log"
Apr 17 15:33:14.444322 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:14.444298 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fdxgz_85072088-8af2-4219-80f7-6a18460c13cf/console-operator/3.log"
Apr 17 15:33:14.503175 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:14.503141 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/48c05cec-724f-4bac-8255-ae31c02136d9-proc\") pod \"perf-node-gather-daemonset-kdj6t\" (UID: \"48c05cec-724f-4bac-8255-ae31c02136d9\") " pod="openshift-must-gather-79652/perf-node-gather-daemonset-kdj6t"
Apr 17 15:33:14.503360 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:14.503273 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/48c05cec-724f-4bac-8255-ae31c02136d9-proc\") pod \"perf-node-gather-daemonset-kdj6t\" (UID: \"48c05cec-724f-4bac-8255-ae31c02136d9\") " pod="openshift-must-gather-79652/perf-node-gather-daemonset-kdj6t"
Apr 17 15:33:14.503360 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:14.503288 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/48c05cec-724f-4bac-8255-ae31c02136d9-sys\") pod \"perf-node-gather-daemonset-kdj6t\" (UID: \"48c05cec-724f-4bac-8255-ae31c02136d9\") " pod="openshift-must-gather-79652/perf-node-gather-daemonset-kdj6t"
Apr 17 15:33:14.503360 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:14.503336 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/48c05cec-724f-4bac-8255-ae31c02136d9-sys\") pod \"perf-node-gather-daemonset-kdj6t\" (UID: \"48c05cec-724f-4bac-8255-ae31c02136d9\") " pod="openshift-must-gather-79652/perf-node-gather-daemonset-kdj6t"
Apr 17 15:33:14.503360 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:14.503342 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q7s9p\" (UniqueName: \"kubernetes.io/projected/48c05cec-724f-4bac-8255-ae31c02136d9-kube-api-access-q7s9p\") pod \"perf-node-gather-daemonset-kdj6t\" (UID: \"48c05cec-724f-4bac-8255-ae31c02136d9\") " pod="openshift-must-gather-79652/perf-node-gather-daemonset-kdj6t"
Apr 17 15:33:14.503567 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:14.503445 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/48c05cec-724f-4bac-8255-ae31c02136d9-podres\") pod \"perf-node-gather-daemonset-kdj6t\" (UID: \"48c05cec-724f-4bac-8255-ae31c02136d9\") " pod="openshift-must-gather-79652/perf-node-gather-daemonset-kdj6t"
Apr 17 15:33:14.503567 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:14.503494 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/48c05cec-724f-4bac-8255-ae31c02136d9-lib-modules\") pod \"perf-node-gather-daemonset-kdj6t\" (UID: \"48c05cec-724f-4bac-8255-ae31c02136d9\") " pod="openshift-must-gather-79652/perf-node-gather-daemonset-kdj6t"
Apr 17 15:33:14.503651 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:14.503629 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/48c05cec-724f-4bac-8255-ae31c02136d9-podres\") pod \"perf-node-gather-daemonset-kdj6t\" (UID: \"48c05cec-724f-4bac-8255-ae31c02136d9\") " pod="openshift-must-gather-79652/perf-node-gather-daemonset-kdj6t"
Apr 17 15:33:14.503687 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:14.503664 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/48c05cec-724f-4bac-8255-ae31c02136d9-lib-modules\") pod \"perf-node-gather-daemonset-kdj6t\" (UID: \"48c05cec-724f-4bac-8255-ae31c02136d9\") " pod="openshift-must-gather-79652/perf-node-gather-daemonset-kdj6t"
Apr 17 15:33:14.511712 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:14.511682 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7s9p\" (UniqueName: \"kubernetes.io/projected/48c05cec-724f-4bac-8255-ae31c02136d9-kube-api-access-q7s9p\") pod \"perf-node-gather-daemonset-kdj6t\" (UID: \"48c05cec-724f-4bac-8255-ae31c02136d9\") " pod="openshift-must-gather-79652/perf-node-gather-daemonset-kdj6t"
Apr 17 15:33:14.616505 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:14.616468 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-79652/perf-node-gather-daemonset-kdj6t"
Apr 17 15:33:14.769297 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:14.769265 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-79652/perf-node-gather-daemonset-kdj6t"]
Apr 17 15:33:14.773183 ip-10-0-131-29 kubenswrapper[2577]: W0417 15:33:14.773155 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod48c05cec_724f_4bac_8255_ae31c02136d9.slice/crio-2e3fbae8757f44b111aed49392c4d7e0f76fd54914c470c131273d42a0f935e3 WatchSource:0}: Error finding container 2e3fbae8757f44b111aed49392c4d7e0f76fd54914c470c131273d42a0f935e3: Status 404 returned error can't find the container with id 2e3fbae8757f44b111aed49392c4d7e0f76fd54914c470c131273d42a0f935e3
Apr 17 15:33:15.453618 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:15.452918 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-79652/perf-node-gather-daemonset-kdj6t" event={"ID":"48c05cec-724f-4bac-8255-ae31c02136d9","Type":"ContainerStarted","Data":"845892f37c64a9c34db9bd93abbdc1375075f37d304266f8bd7e30e22ef0911f"}
Apr 17 15:33:15.453618 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:15.452966 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-79652/perf-node-gather-daemonset-kdj6t" event={"ID":"48c05cec-724f-4bac-8255-ae31c02136d9","Type":"ContainerStarted","Data":"2e3fbae8757f44b111aed49392c4d7e0f76fd54914c470c131273d42a0f935e3"}
Apr 17 15:33:15.454201 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:15.453644 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-79652/perf-node-gather-daemonset-kdj6t"
Apr 17 15:33:15.472003 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:15.471328 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-79652/perf-node-gather-daemonset-kdj6t" podStartSLOduration=1.471307834 podStartE2EDuration="1.471307834s" podCreationTimestamp="2026-04-17 15:33:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 15:33:15.468929715 +0000 UTC m=+960.010594574" watchObservedRunningTime="2026-04-17 15:33:15.471307834 +0000 UTC m=+960.012972702"
Apr 17 15:33:16.146387 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:16.146359 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-2vpk9_4f1d2ee5-9f9b-4086-afee-0e043df76f02/dns/0.log"
Apr 17 15:33:16.179944 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:16.179916 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-2vpk9_4f1d2ee5-9f9b-4086-afee-0e043df76f02/kube-rbac-proxy/0.log"
Apr 17 15:33:16.303609 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:16.303580 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-8hx9p_14c11c01-24f2-4908-a3cb-5c90f9ec8d35/dns-node-resolver/0.log"
Apr 17 15:33:16.789402 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:16.789373 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-664f745c79-lz8j5_3b884775-56c3-4983-a77d-ac6ef0438ae4/registry/0.log"
Apr 17 15:33:16.831111 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:16.831085 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-cx2v6_879a1087-ad81-4931-a7fc-1d30c4f2539d/node-ca/0.log"
Apr 17 15:33:17.779575 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:17.779547 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-546dd5d8dc-2vbx8_dfd94ffa-c815-4fae-b34b-0e99fc19f2b8/kube-auth-proxy/0.log"
Apr 17 15:33:18.332986 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:18.332955 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-dl2zj_05352b16-4fb2-4f4e-894f-d69b17f92924/serve-healthcheck-canary/0.log"
Apr 17 15:33:18.772779 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:18.772712 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-89cc9_19f0cf6c-7fd6-48bc-b7f5-6fbd50fd92e4/kube-rbac-proxy/0.log"
Apr 17 15:33:18.792087 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:18.792066 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-89cc9_19f0cf6c-7fd6-48bc-b7f5-6fbd50fd92e4/exporter/0.log"
Apr 17 15:33:18.810925 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:18.810901 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-89cc9_19f0cf6c-7fd6-48bc-b7f5-6fbd50fd92e4/extractor/0.log"
Apr 17 15:33:20.839939 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:20.839903 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-9bd7bdf77-c6gpg_79a0d695-e84d-4423-ba3a-7376aa132e46/manager/0.log"
Apr 17 15:33:21.984428 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:21.984398 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-59bc47b496-cdqvz_613a897c-2655-4e38-a210-08757b6ac34a/manager/0.log"
Apr 17 15:33:22.471069 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:22.471026 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-79652/perf-node-gather-daemonset-kdj6t"
Apr 17 15:33:27.547622 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:27.547546 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7jdw8_14d69252-8d9f-46ec-8e05-0b0a8f1b3b07/kube-multus-additional-cni-plugins/0.log"
Apr 17 15:33:27.566820 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:27.566793 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7jdw8_14d69252-8d9f-46ec-8e05-0b0a8f1b3b07/egress-router-binary-copy/0.log"
Apr 17 15:33:27.584906 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:27.584888 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7jdw8_14d69252-8d9f-46ec-8e05-0b0a8f1b3b07/cni-plugins/0.log"
Apr 17 15:33:27.604477 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:27.604461 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7jdw8_14d69252-8d9f-46ec-8e05-0b0a8f1b3b07/bond-cni-plugin/0.log"
Apr 17 15:33:27.622848 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:27.622831 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7jdw8_14d69252-8d9f-46ec-8e05-0b0a8f1b3b07/routeoverride-cni/0.log"
Apr 17 15:33:27.641456 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:27.641425 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7jdw8_14d69252-8d9f-46ec-8e05-0b0a8f1b3b07/whereabouts-cni-bincopy/0.log"
Apr 17 15:33:27.659425 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:27.659408 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7jdw8_14d69252-8d9f-46ec-8e05-0b0a8f1b3b07/whereabouts-cni/0.log"
Apr 17 15:33:27.982765 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:27.982738 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-j8tkm_859c9d28-c95a-461d-841e-f476f3fb6fb7/kube-multus/0.log"
Apr 17 15:33:28.092878 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:28.092850 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-j7zl6_4445020e-d73c-4a2d-9f40-1c3fc286490e/network-metrics-daemon/0.log"
Apr 17 15:33:28.112903 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:28.112878 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-j7zl6_4445020e-d73c-4a2d-9f40-1c3fc286490e/kube-rbac-proxy/0.log"
Apr 17 15:33:28.978696 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:28.978668 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6m42k_41a2e9d0-bfbe-47d5-9ccd-610cb5204675/ovn-controller/0.log"
Apr 17 15:33:28.995480 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:28.995455 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6m42k_41a2e9d0-bfbe-47d5-9ccd-610cb5204675/ovn-acl-logging/0.log"
Apr 17 15:33:29.000160 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:29.000139 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6m42k_41a2e9d0-bfbe-47d5-9ccd-610cb5204675/ovn-acl-logging/1.log"
Apr 17 15:33:29.019479 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:29.019453 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6m42k_41a2e9d0-bfbe-47d5-9ccd-610cb5204675/kube-rbac-proxy-node/0.log"
Apr 17 15:33:29.037357 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:29.037340 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6m42k_41a2e9d0-bfbe-47d5-9ccd-610cb5204675/kube-rbac-proxy-ovn-metrics/0.log"
Apr 17 15:33:29.053945 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:29.053928 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6m42k_41a2e9d0-bfbe-47d5-9ccd-610cb5204675/northd/0.log"
Apr 17 15:33:29.073654 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:29.073635 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6m42k_41a2e9d0-bfbe-47d5-9ccd-610cb5204675/nbdb/0.log"
Apr 17 15:33:29.092259 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:29.092238 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6m42k_41a2e9d0-bfbe-47d5-9ccd-610cb5204675/sbdb/0.log"
Apr 17 15:33:29.205303 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:29.205275 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6m42k_41a2e9d0-bfbe-47d5-9ccd-610cb5204675/ovnkube-controller/0.log"
Apr 17 15:33:30.802206 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:30.802179 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-k9sfm_8f30a8c1-574b-4685-88d7-2b714bdf287f/check-endpoints/0.log"
Apr 17 15:33:30.867283 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:30.867246 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-xz6xx_2f57efa0-9b15-4e70-9d38-74a517201d53/network-check-target-container/0.log"
Apr 17 15:33:31.913651 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:31.913618 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-xc7q5_a55a9866-dc90-424d-aefb-be85c6ce02cb/iptables-alerter/0.log"
Apr 17 15:33:32.623098 ip-10-0-131-29 kubenswrapper[2577]: I0417 15:33:32.623064 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-9v7gz_7aa4aa82-d5f1-423a-b9ac-13669e2b1804/tuned/0.log"